The art of Java concurrent programming (1): The challenges of concurrent programming

The challenges of concurrent programming fall mainly into three areas:

  1. Context switching
  2. Deadlock
  3. Resource limits

Below, each of these three challenges is analyzed in turn, along with how to deal with it.

1) Context Switching

1.1 What is context switching? Is multithreading always faster?

To understand context switching, let's first go over a few concepts. Multiple threads can run on a single CPU: the CPU allocates a time slice to each thread through a time-slicing algorithm. A time slice is the amount of execution time the CPU gives a thread, and because a time slice is only a few tens of milliseconds, it feels as though multiple threads are executing at the same time. In reality, before switching from the current task to the next one, the CPU saves the state of the current task, so that the next time it switches back to this task it can reload that state and continue.

A context switch is exactly this process of saving a task's state and reloading it later. Because the switch requires saving and restoring state, it has a cost and hurts efficiency.

Because of context switching, multi-threaded execution is not necessarily faster than single-threaded execution. The following example compares the two:

package chanllenge;

public class ConcurrencyTest {

	private static final long count = 10000L;

	public static void main(String[] args) {
		concurrency();
		serial();
	}

	// Runs both loops one after the other on the main thread.
	private static void serial() {
		long start = System.currentTimeMillis();

		int a = 0;
		for (long i = 0; i < count; i++) {
			a += 5;
		}

		int b = 0;
		for (long j = 0; j < count; j++) {
			b--;
		}

		long time = System.currentTimeMillis() - start;
		System.out.println("serial :" + time + "ms,b=" + b + ",a=" + a);
	}

	// Runs the first loop on a separate thread while the main thread runs the second.
	private static void concurrency() {
		long start = System.currentTimeMillis();

		Thread t = new Thread(new Runnable() {
			@Override
			public void run() {
				int a = 0;
				for (long i = 0; i < count; i++) {
					a += 5;
				}
			}
		});
		t.start();

		int b = 0;
		for (long j = 0; j < count; j++) {
			b--;
		}

		try {
			t.join(); // wait for the worker thread before measuring elapsed time
		} catch (InterruptedException e) {
			e.printStackTrace();
		}

		long time = System.currentTimeMillis() - start;
		System.out.println("concurrency :" + time + "ms,b=" + b);
	}
}

Running the above code produces the following output:

concurrency :2ms,b=-10000
serial :0ms,b=-10000,a=50000

Clearly, the multi-threaded version is slower here: with such a small loop count, the cost of creating the thread and of context switching outweighs the benefit of running in parallel. Only when the loop count grows much larger does the concurrent version start to win, because that fixed overhead gets amortized.

1.2 How to deal with the challenge of context switching?

Once we understand what context switching is, the principle itself tells us how to deal with the challenge: reduce the number of threads as much as possible, or give each thread a clear division of work, so as to reduce switching. Here are some ways to reduce context switching.

1.2.1 Lock-free concurrent programming

When multiple threads compete for a lock, context switches occur. We can instead process the data in segments, for example, so that each thread handles only its own segment and no lock is needed, as sketched below.
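A rough sketch of this idea (the data layout, segment count, and class name are my own illustrative assumptions): each thread sums only the indices that belong to its segment and writes into its own slot, so the threads never share mutable state and never need a lock.

// Illustrative sketch: lock-free processing by segmenting the data.
public class SegmentedSum {
	public static void main(String[] args) throws InterruptedException {
		final int[] data = new int[1_000_000];
		for (int i = 0; i < data.length; i++) {
			data[i] = 1;
		}

		final int segments = 4;
		final long[] partialSums = new long[segments]; // one slot per thread, nothing shared
		Thread[] workers = new Thread[segments];

		for (int s = 0; s < segments; s++) {
			final int segment = s;
			workers[s] = new Thread(new Runnable() {
				@Override
				public void run() {
					long sum = 0;
					// Only touch the indices belonging to this segment.
					for (int i = segment; i < data.length; i += segments) {
						sum += data[i];
					}
					partialSums[segment] = sum; // each thread writes only its own slot
				}
			});
			workers[s].start();
		}

		long total = 0;
		for (int s = 0; s < segments; s++) {
			workers[s].join();
			total += partialSums[s];
		}
		System.out.println("total = " + total);
	}
}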

1.2.2 CAS algorithm

Java's Atomic package (java.util.concurrent.atomic) uses the CAS algorithm to update data, so no lock is needed.
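A minimal sketch of using the Atomic classes (the class and method names below are illustrative): incrementAndGet performs the CAS retry internally, and the second method writes the same retry loop out explicitly to show what CAS does.

import java.util.concurrent.atomic.AtomicInteger;

// Illustrative sketch: lock-free updates backed by compare-and-swap.
public class CasCounter {
	private final AtomicInteger value = new AtomicInteger(0);

	// Atomic increment; no synchronized block needed.
	public int increment() {
		return value.incrementAndGet();
	}

	// The same idea written out as an explicit CAS retry loop.
	public int incrementWithCas() {
		for (;;) {
			int current = value.get();
			int next = current + 1;
			if (value.compareAndSet(current, next)) {
				return next;
			}
			// CAS failed: another thread updated the value first, so retry.
		}
	}
}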

1.2.3 Use the minimum number of threads

Avoid creating unnecessary threads: if there are only a few tasks, there is no need to create a large number of threads to handle them.
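For instance, a fixed-size thread pool reuses a small number of threads for many tasks instead of creating a thread per task. The pool size below (one thread per available processor) and the class name are only illustrative assumptions, not a rule from the text.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Illustrative sketch: keep the thread count small by reusing a fixed pool.
public class SmallPoolExample {
	public static void main(String[] args) {
		int poolSize = Runtime.getRuntime().availableProcessors();
		ExecutorService pool = Executors.newFixedThreadPool(poolSize);

		for (int i = 0; i < 100; i++) {
			final int taskId = i;
			pool.submit(new Runnable() {
				@Override
				public void run() {
					System.out.println("task " + taskId + " ran on " + Thread.currentThread().getName());
				}
			});
		}
		pool.shutdown();
	}
}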

1.2.4 Coroutines

A coroutine implements multi-task scheduling within a single thread, and maintains the switching between multiple tasks inside that one thread.
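Java itself does not ship a coroutine API, but the idea can be sketched on a single thread: tasks are broken into small steps, and a simple scheduler loop interleaves them without any OS-level context switch. Everything below (the class name, the re-enqueue-to-yield trick) is an illustrative sketch of the concept, not a real coroutine library.

import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch: cooperative multitasking on one thread via a task queue.
public class SingleThreadScheduler {
	private final Deque<Runnable> tasks = new ArrayDeque<>();

	public void submit(Runnable task) {
		tasks.addLast(task);
	}

	// Runs all queued steps on the current (single) thread.
	public void run() {
		while (!tasks.isEmpty()) {
			tasks.pollFirst().run();
		}
	}

	public static void main(String[] args) {
		final SingleThreadScheduler scheduler = new SingleThreadScheduler();
		// Each "task" re-enqueues its next step, so the scheduler interleaves
		// task A and task B on one thread without any OS context switch.
		scheduler.submit(new Runnable() {
			int step = 0;
			@Override
			public void run() {
				System.out.println("task A, step " + step);
				if (++step < 3) {
					scheduler.submit(this); // "yield": give the other task a turn
				}
			}
		});
		scheduler.submit(new Runnable() {
			int step = 0;
			@Override
			public void run() {
				System.out.println("task B, step " + step);
				if (++step < 3) {
					scheduler.submit(this);
				}
			}
		});
		scheduler.run();
	}
}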

2) Deadlock

Deadlock is a basic multi-threading concept that everyone knows, so it is not repeated here; instead, here are some methods to avoid deadlock.

2.1 Avoid a thread acquiring two locks at the same time

2.2 Avoid a thread occupying multiple resources inside a lock

2.3 Try to use timed locks, such as tryLock with a timeout, instead of blocking indefinitely (see the sketch below)

2.4 For database locks, locking and unlocking must be done within the same database connection
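As a sketch of suggestion 2.3: tryLock with a timeout gives up instead of blocking forever, so two threads that happen to grab locks in opposite order back off rather than deadlocking. The lock names, method name, and the one-second timeout below are illustrative assumptions.

import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Illustrative sketch: a timed lock that backs off instead of deadlocking.
public class TimedLockExample {
	private final ReentrantLock lockA = new ReentrantLock();
	private final ReentrantLock lockB = new ReentrantLock();

	public boolean doWorkWithBothLocks() throws InterruptedException {
		if (lockA.tryLock(1, TimeUnit.SECONDS)) {
			try {
				if (lockB.tryLock(1, TimeUnit.SECONDS)) {
					try {
						// ... work that needs both locks ...
						return true;
					} finally {
						lockB.unlock();
					}
				}
			} finally {
				lockA.unlock();
			}
		}
		return false; // could not get both locks in time; the caller may retry
	}
}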

3) Resource Limits

Resource limits are divided into hardware and software limits. Hardware resources are limited by things such as server bandwidth, hard disk read/write speed, and CPU processing speed. Software resources are limited by things such as the number of database connections, the number of socket connections, and so on.

In principle, we speed a program up by turning serial execution into parallel execution, but if the program is limited by a resource, the serial version can actually be faster than the parallel one.

As for how to deal with resource limits: honestly, for hardware it mostly comes down to spending money.