Why Is Everybody So Mad About Concurrency?

At the beginning of April, PieSoft programmers were listening to, learning from, and being inspired by the latest technical developments in the Java ecosystem at the international JPoint-2019 conference. The event took place in Moscow, Russia, and brought together over a thousand participants and 43 speakers whose talks ran on four parallel tracks. Among the topics discussed most often at the event, particular emphasis was placed on successfully developing new products by improving concurrent engineering, parallel execution processes, tools, and best practices. That means it’s time for you to find out what concurrency is and why it is so essential in today’s Java world.

So, what if you look up “concurrency” in the dictionary?

There are several different concepts related to the issue:

  • concurrency
  • parallel computing
  • multithreading
  • asynchrony

Concurrency


Concurrency means working with multi-threaded code. In other words, it is the ability to execute two or more statements at the same time. Imagine that the code is water flowing through a pipe. You need to pump this water from A to B (just as code is executed from start – main – to finish – exit). Given that you can’t change the pipe size, we add one more pipe. Therefore, in theory, we get more speed. The number of “pipes” that can be used to real benefit depends on the number of cores in the processor.
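To make the “two pipes” picture concrete, here is a minimal sketch that starts two Java threads and lets them run side by side (the pipe names and the number of “portions” are made up for illustration):

public class TwoPipes {

    public static void main(String[] args) throws InterruptedException {
        // Each Runnable is one "pipe": an independent stream of work.
        Runnable pumpA = () -> pump("pipe-A");
        Runnable pumpB = () -> pump("pipe-B");

        Thread first = new Thread(pumpA);
        Thread second = new Thread(pumpB);

        first.start();   // both threads now run concurrently
        second.start();

        first.join();    // wait for both "pipes" to finish
        second.join();
    }

    private static void pump(String name) {
        for (int i = 1; i <= 3; i++) {
            System.out.println(name + " pumped portion " + i
                    + " on " + Thread.currentThread().getName());
        }
    }
}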

However, concurrency is the most general term, showing only that more than one task is being performed at the same time. For example, you can watch TV and post photos on Facebook at the same time. Oh, come on! Even Windows 95 could simultaneously play music and show pictures.

So, when we say concurrency, we don’t specify how it is achieved: by pausing some tasks and switching to others, by truly simultaneous execution, by delegating work to other devices, or in some other way. It does not matter.

Concurrency simply means that more than one task will be completed over a certain period of time. That’s it.

Parallel computing

We talk about parallel computing when more than one computing device (for example, a processor) simultaneously performs several tasks.

Parallel execution is a strict subset of concurrent execution. This means that parallel programming is impossible on a computer with only one single-core processor.
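As a rough illustration (a sketch, not a benchmark), you can check how many cores the JVM sees and split a computation across them with a parallel stream:

import java.util.stream.LongStream;

public class ParallelSketch {

    public static void main(String[] args) {
        // How many "pipes" can physically work at the same time.
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("Available cores: " + cores);

        // A parallel stream splits the range across worker threads;
        // on a single-core machine the work is merely interleaved.
        long sum = LongStream.rangeClosed(1, 10_000_000)
                             .parallel()
                             .sum();
        System.out.println("Sum = " + sum);
    }
}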

Multithreading

Multithreading is one of the ways to implement concurrent execution by introducing the abstraction of a “worker thread.”

Threads “abstract away” low-level details from the user and allow you to perform more than one task “in parallel.” The operating system, runtime environment, or library hides whether multi-threaded execution is merely concurrent (when there are more threads than physical processors) or truly parallel (when the number of threads is less than or equal to the number of processors and several tasks are physically executed at the same time).

Asynchrony

Asynchrony implies that part of the work can be carried out elsewhere: by a remote website, a server, or another device outside the current system.

The core attribute of such operations is that starting them takes significantly less time than the main work itself. This allows you to launch many asynchronous operations at once, even on a device with modest computing resources.
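Here is a minimal sketch of that idea with CompletableFuture; callRemoteService() is just a placeholder for some remote work such as an HTTP call. Starting the operation is cheap, and the result arrives later:

import java.util.concurrent.CompletableFuture;

public class AsyncSketch {

    public static void main(String[] args) {
        // Starting the call takes almost no time; the "remote work"
        // happens on another thread while main continues.
        CompletableFuture<String> response =
                CompletableFuture.supplyAsync(AsyncSketch::callRemoteService);

        System.out.println("Request sent, doing something else meanwhile...");

        // Block only when we actually need the result.
        System.out.println("Got: " + response.join());
    }

    // Placeholder for a real HTTP call or database query.
    private static String callRemoteService() {
        try {
            Thread.sleep(500); // simulate network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "answer from the remote side";
    }
}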

Concurrency in practice

At the moment, almost no large enterprise project can do without multithreading, which allows us to use the hardware’s capabilities to the full. That is why a deep understanding of its basic principles is so important. With experience, this understanding should only deepen.

So, here is what you need to understand:

  • Implementing multithreading in your programming language
  • Problems that arise when developing multi-threaded applications
  • Ways to avoid these problems

Java multithreading implementation


Java provides the means to create multi-threaded applications in which different threads run simultaneously. It should be noted, however, that the primary language tools are quite low-level: it is not easy to use the keywords volatile and synchronized and the methods wait(), notify(), and notifyAll() correctly. Developers need higher-level entities (thread pools, monitors, semaphores, etc.).
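For instance, instead of juggling raw threads and wait()/notify(), a thread pool from java.util.concurrent keeps the code at a higher level. A minimal sketch (the task bodies are made up for illustration):

import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PoolSketch {

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Submit a few independent tasks; the pool decides which
        // worker thread runs each one.
        List<Callable<Integer>> tasks = List.of(
                () -> 1 + 1,
                () -> 2 * 2,
                () -> 3 * 3
        );

        for (Future<Integer> result : pool.invokeAll(tasks)) {
            System.out.println("Result: " + result.get());
        }

        pool.shutdown();
    }
}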

Issues that are worth exploring when working with Java

  • Thread management in the Java virtual machine (create / start / stop). The code in each thread can be executed in parallel with the code in other threads, so several tasks can be performed simultaneously. Here we need to deal with such concepts as threads, thread pools, and futures (a small sketch of thread management and synchronization follows this list).
  • Program flow control (thread synchronization). There are situations when the code in one thread has to wait for a task in another thread to complete. This is achieved using a variety of synchronization tools. A major hazard here is deadlock, when several threads are waiting for some action from each other.
  • Control of access to memory (data) in a multi-threaded environment. Here it is important to understand the Java Memory Model, the visibility of variables, the atomicity of operations, how race conditions occur, and thread-safe collections.
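A tiny sketch of the first two points above: main starts a worker thread and uses join() to wait for it to finish, while the volatile flag hints at the memory-visibility part (the flag and the printed messages are, of course, invented for illustration):

public class SyncSketch {

    // volatile guarantees that the worker's update is visible to main.
    private static volatile boolean done = false;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            System.out.println("Worker: computing...");
            done = true;                 // publish the result of the work
        });

        worker.start();                  // thread management: create and start
        worker.join();                   // synchronization: main waits for the worker

        System.out.println("Main sees done = " + done);
    }
}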

Problems that arise when developing multi-threaded applications with Java

There may be a lot of problems, but the most common are:


– Deadlock (mentioned above) can occur when we block access to a resource so that no other thread can touch it until we have finished all the necessary computations; if two threads do this to each other, both end up waiting forever.

The simplest example of this is when two threads are waiting for calculations from each other. Here we can draw an analogy with plumbers and electricians. Imagine that the wiring is “flooded.” Electricians are afraid to start work until plumbers fix the leak, and plumbers are not eager to get an immediate energy boost for the whole day :).
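A classic sketch of such a stalemate in Java: two threads take two locks in opposite order (the WIRING and PIPES lock objects are invented to match the analogy), and each ends up waiting for the other forever.

public class DeadlockSketch {

    private static final Object WIRING = new Object();
    private static final Object PIPES = new Object();

    public static void main(String[] args) {
        Thread electrician = new Thread(() -> {
            synchronized (WIRING) {
                sleep(100);              // give the plumber time to grab PIPES
                synchronized (PIPES) {   // waits forever: plumber holds PIPES
                    System.out.println("Electrician finished");
                }
            }
        });

        Thread plumber = new Thread(() -> {
            synchronized (PIPES) {
                sleep(100);
                synchronized (WIRING) {  // waits forever: electrician holds WIRING
                    System.out.println("Plumber finished");
                }
            }
        });

        electrician.start();
        plumber.start();
    }

    private static void sleep(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}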


– Race condition – a situation when two or more threads try to access shared data and change it at the same time. Here, everything starts to depend on which thread reaches the data first.
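A minimal sketch of a race: two threads increment the same counter with no synchronization, and the final value is usually smaller than expected because the unsynchronized increments overwrite one another.

public class RaceSketch {

    private static int counter = 0;   // shared, unsynchronized data

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;            // read-modify-write, not atomic
            }
        };

        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Expected 200000, but the actual value is typically smaller.
        System.out.println("Counter = " + counter);
    }
}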

The primary ways to avoid these problems are:

Deadlock – continuously monitor the order in which we acquire access to resources. We must clearly understand what is happening in our system.
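Applied to the electrician/plumber sketch above, the fix is to agree on a single order of acquiring the locks; a minimal sketch:

public class OrderedLocks {

    private static final Object WIRING = new Object();
    private static final Object PIPES = new Object();

    // Every thread takes WIRING first and PIPES second, so the
    // circular wait from the deadlock example can never form.
    static void doCombinedWork(String who) {
        synchronized (WIRING) {
            synchronized (PIPES) {
                System.out.println(who + " did the job safely");
            }
        }
    }

    public static void main(String[] args) {
        new Thread(() -> doCombinedWork("Electrician")).start();
        new Thread(() -> doCombinedWork("Plumber")).start();
    }
}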

Race condition – always make sure that access to resources is organized properly (with synchronized and the other concurrency utilities). For example, we can keep a file readable at any time but restrict write access, etc.
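The “readable at any time, but restricted writing” idea maps naturally onto a read-write lock; here is a sketch in which a stored string stands in for the file contents:

import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class ReadWriteSketch {

    private final ReadWriteLock lock = new ReentrantReadWriteLock();
    private String contents = "initial contents";

    // Many threads may read at the same time.
    public String read() {
        lock.readLock().lock();
        try {
            return contents;
        } finally {
            lock.readLock().unlock();
        }
    }

    // Only one thread at a time may write, and no one reads meanwhile.
    public void write(String newContents) {
        lock.writeLock().lock();
        try {
            contents = newContents;
        } finally {
            lock.writeLock().unlock();
        }
    }

    public static void main(String[] args) {
        ReadWriteSketch file = new ReadWriteSketch();
        new Thread(() -> file.write("updated contents")).start();
        new Thread(() -> System.out.println("Read: " + file.read())).start();
    }
}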

Are you looking for experienced Java developers to get your project to the next level or augment your existing dev team with top expertise? Then you’ve come to the right place. Drop us a line, and we’ll see how far we can get together.
