Hi! This is the fifth article on concurrency, and an extension of our previous article on thread pools. As in the previous articles, I’ll start with a quick summary of what we have discussed so far.

Suppose we have a big computational problem to solve…

A computational problem can be decomposed into pieces. Why? Because we can solve those pieces separately and combine them into the final solution.

Most computer processors nowadays have several cores, so we can share the pieces among the cores.

We have programming components called threads. We put a decomposed piece of the problem into a thread and let the computer solve it by passing the thread to the system. So each piece gets connected to a thread like that.

Creating a new thread for each sub-problem is silly. Why? Because thread creation adds extra processing time. So, we create a pool of threads instead.

Then the sub-problems can be directed to the thread pool through a special queue data structure called ConcurrentLinkedQueue.

When one sub-problem is finished, a vacancy for a new sub-problem opens up in the thread pool. So, the next sub-problem in the queue gets its chance.

That’s it. So, now?

The thread pool we created in our previous example gets discarded once the items in the queue run out.

For example, imagine we have image processing software. We create a thread pool to process the image's pixel rows in parts. Once we finish processing the image, the thread pool exits.

If we want to process another image, we have to create a new thread pool again. This creation and destruction of thread pools also seems wasteful. What if we could suspend, but not kill, all the threads in the pool while the program is still running? Then the user could give a new image to process and the same thread pool could be reused.

With the previous ConcurrentLinkedQueue, only the tasks wait until a vacancy in the thread pool occurs. But if we introduce another data structure that also makes threads wait until tasks are available, we can fix the problem.

Blocking queues are the solution!

They are analogous to the producer-consumer pattern 🙂 There should be a supply to fulfill the demand. Otherwise, the consumers have to wait until the producers produce.

Suppose the threads in the thread pool are the consumers. The producers are the threads that create the tasks. There should be tasks in the queue, or otherwise the consumer threads in the pool will have to wait until new tasks are added.

The reverse is also required in most applications. It is useless to produce something for which there is no demand. That is, the producer threads should not produce tasks until the consumer thread pool is ready to accept them. To achieve this, we have to create a blocking queue that can hold only a limited number of tasks. If it is full, the producer threads (NOT the consumers) will be blocked.

Also, note that the queue should be synchronized. (We are dealing with parallel processing, so only one thread should be allowed to access the queue at a time. Otherwise, there is a chance of two threads accessing the queue at the same moment, which is REALLY bad: it could produce a race condition. Synchronization is done to avoid such errors. A synchronized queue allows only one thread to use it at once.)

So now, we have two objectives to be achieved by a special synchronized queue.

1. Block consumer threads when the queue is empty
2. Block producer threads when the queue is full
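Both objectives can be seen in a short sketch of my own. This is a minimal, hypothetical example: the Integer tasks and the class name are placeholders, not part of the image processing example. With a bounded queue of capacity 5, the producer blocks whenever it gets 5 tasks ahead of the consumer, and the consumer blocks whenever the queue is drained.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumerDemo {
    // A bounded queue: producers block when it is full,
    // consumers block when it is empty.
    static final BlockingQueue<Integer> tasks = new ArrayBlockingQueue<>(5);

    // Runs one producer and one consumer to completion and
    // returns the number of tasks the consumer processed.
    static int runDemo() {
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 10; i++) {
                    tasks.put(i);            // blocks while the queue holds 5 tasks
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        final int[] processed = {0};
        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 10; i++) {
                    tasks.take();            // blocks while the queue is empty
                    processed[0]++;
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        try {
            producer.join();
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return processed[0];
    }

    public static void main(String[] args) {
        System.out.println("Tasks processed: " + runDemo());
    }
}
```

Notice that neither thread polls in a loop checking "is there a task yet?" — the blocking calls do all the waiting for us.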

What are the suitable queues then?

We are focusing on two such queue types in the java.util.concurrent package: LinkedBlockingQueue and ArrayBlockingQueue.

The syntax is as follows:

LinkedBlockingQueue<item_type> queue = new LinkedBlockingQueue<item_type>(no_of_items);
ArrayBlockingQueue<item_type> queue2 = new ArrayBlockingQueue<item_type>(no_of_items);

The following queues can hold objects of type PixelRow. The queue object below can hold 21 such objects, while the queue2 object can hold 10 PixelRow objects.

LinkedBlockingQueue<PixelRow> queue = new LinkedBlockingQueue<PixelRow>(21);
ArrayBlockingQueue<PixelRow> queue2 = new ArrayBlockingQueue<PixelRow>(10);

But there’s a difference between them.

If you want to achieve only the first objective mentioned above, you can use a LinkedBlockingQueue.

That is, if you don’t want a finite queue but still want to block the consumers and NOT the producers, you can use a LinkedBlockingQueue object with no constructor arguments.

Eg:

LinkedBlockingQueue<PixelRow> queue = new LinkedBlockingQueue<PixelRow>();

This will produce a practically unlimited queue. (Actually, it is limited: the capacity of the queue is Integer.MAX_VALUE, i.e., up to 2147483647 PixelRow items.)
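We can verify that default capacity with a tiny sketch of my own (the class name and the "row-1"/"row-2" strings are just placeholders). The queue's remainingCapacity() method reports the capacity minus the current size, so after adding two items to a no-argument LinkedBlockingQueue it should report Integer.MAX_VALUE minus 2.

```java
import java.util.concurrent.LinkedBlockingQueue;

public class UnboundedQueueDemo {
    // With no constructor argument, the capacity defaults to Integer.MAX_VALUE.
    static int remainingAfterTwoAdds() {
        LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.offer("row-1");
        queue.offer("row-2");
        // remainingCapacity() = capacity - current size
        return queue.remainingCapacity();
    }

    public static void main(String[] args) {
        System.out.println(remainingAfterTwoAdds()); // Integer.MAX_VALUE - 2 = 2147483645
    }
}
```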

Note the difference between this unlimited-capacity LinkedBlockingQueue and the unlimited ConcurrentLinkedQueue that we discussed in the previous article. Both produce an unlimited queue, but the old ConcurrentLinkedQueue DOESN’T block the consumers when the queue is empty.
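The contrast is easy to see side by side. In this sketch of my own, ConcurrentLinkedQueue's poll() returns null immediately when the queue is empty, so the consumer would have to spin and retry, while a blocking queue makes the caller wait. (take() would wait forever on an empty queue, so the demo uses the timed poll() overload to show the waiting without hanging the program.)

```java
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class EmptyQueueDemo {
    public static void main(String[] args) {
        // Old approach: poll() on an empty ConcurrentLinkedQueue
        // returns null at once -- the consumer must busy-wait and retry.
        ConcurrentLinkedQueue<String> oldQueue = new ConcurrentLinkedQueue<>();
        System.out.println(oldQueue.poll());   // null, returned immediately

        // Blocking approach: the timed poll() waits for an item to appear
        // before giving up, just as take() would wait indefinitely.
        LinkedBlockingQueue<String> newQueue = new LinkedBlockingQueue<>();
        String item = null;
        try {
            item = newQueue.poll(100, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        System.out.println(item);              // null, but only after waiting 100 ms
    }
}
```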

Methods associated with LinkedBlockingQueue and ArrayBlockingQueue

Both classes implement the common interface BlockingQueue, which has the following two basic methods. Suppose queue is an object of LinkedBlockingQueue or ArrayBlockingQueue. I’ll discuss only the most basic methods here to keep things easy to understand.

queue.put(item)

This will insert an item into the queue. If the queue is full, the producer thread that calls this method will be blocked. It will not resume until there’s a vacancy in the queue. Note that on a LinkedBlockingQueue with unlimited capacity, this method will never block a producer thread.

queue.take()

This method will remove and return the item at the head of the queue. If the queue is empty, the consumer thread that calls it will be blocked.
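Here's a small sketch of my own showing put() and take() together (the "row-A"/"row-B" strings stand in for the PixelRow objects, which aren't defined here). Since put() appends at the tail and take() removes from the head, items come out in the order they went in.

```java
import java.util.concurrent.ArrayBlockingQueue;

public class PutTakeDemo {
    // put() appends to the tail, take() removes from the head (FIFO).
    static String firstOut() {
        try {
            ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(2);
            queue.put("row-A");
            queue.put("row-B");   // the queue is now full; a third put() would block
            return queue.take();  // removes the oldest item
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(firstOut()); // prints "row-A"
    }
}
```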

queue.clear()

This will remove all the items from the queue.

Well, this is the end of the first part of the concurrency article series. Some of the things we have discussed so far are actually history 🙂 So why waste a lot of time writing about them? In my opinion, it’s very important for a beginner to know how concurrency techniques evolved gradually. Then things will make more sense when focusing on the future.

A lot of improvements have come with recent updates to Java, especially Java 8, which made parallel programming a lot easier than before. I’ll discuss them in the upcoming articles soon. But for now, let’s take a break. I hope to continue the C++ for Java learners series that we haven’t touched for a long time.

Thanks for viewing my post, guys. If you like it, please share it with your friends 🙂 With your kind support, we all can improve more!