May 31, 2021 · Blog article
In this article, let's discuss exactly what synchronous and asynchronous mean, and why these two concepts matter so much in programming.
I believe many students, on encountering the words "synchronous" and "asynchronous", find their brains instantly freezing up, like an intersection whose traffic lights have failed.
Yes, these two words, which look so much alike, have in fact confused a lot of people.
Let's start with the work scene.
Suppose your boss assigns you an urgent and important task that you must complete before you leave work (evil capitalism). To keep track of progress, the boss pulls up a chair and sits there watching you write code.
You must be cursing inwardly: "WTF, are you that idle? Can't you go do something else instead of staring at me?"
The boss seems to receive your brainwaves: "I'll just wait here. I'm not going anywhere until you're done, not even to the toilet."
In this example, the boss waits until you finish writing the task, which is called synchronization.
The next day, the boss gave you another assignment.
But this time it's not so urgent. The boss says casually, "Not bad, young man. Keep it up, work hard for another year, and next year I'll reach financial freedom. Today's task isn't urgent; just let me know when you've finished."
This time the boss doesn't sit there watching you write code, but goes off to watch videos instead. When you finish, you simply report to the boss: "I'm done."
In this example, the boss no longer waits idly after assigning the task, but goes off to do other things; when you complete the task, you simply tell the boss it's done. This is called asynchronous.
It's worth noting that in the asynchronous scenario, the key point is that the boss watches videos while you write code: the two things happen at the same time, rather than one side waiting for the other. This is why, in general, asynchronous is more efficient than synchronous, whatever the scenario.
We can see that synchronous is often associated with keywords like "dependency", "association", and "waiting", while asynchronous is associated with "no dependency", "no association", "no waiting", and "happening simultaneously".
By the way, if you ever meet a boss who stands behind you staring while you write code, then of the thirty-six stratagems, running away is the best plan.
As a hard-pressed programmer you can't focus only on moving bricks; day-to-day communication at work is unavoidable, and one efficient way to communicate is quarreling... ah no, the phone.
Usually on a phone call, one person talks while the other listens; while one is talking the other waits, and only continues after the first has finished. In this scenario you can see the keywords "dependency", "association", and "waiting" appear, so communicating by phone is synchronous.
Another common form of communication among programmers is email.
Email is essential precisely because nobody sits waiting stupidly while you compose a message. You can write it slowly; while you write, the recipient can do something meaningful, like slacking off, going to the toilet, or complaining about why the National Day holiday isn't two weeks long.
Likewise, after you send the email, you don't need to wait idly for a reply; you too can go off and do meaningful things like slacking off.
Here, you writing the email and others slacking off happen at the same time. Recipient and sender don't need to wait for each other: the sender simply clicks send when done, and the recipient reads the message whenever it arrives. Neither depends on the other.
You see, in this scenario the keywords "no dependency", "no association", and "no waiting" appear, so email communication is asynchronous.
Now it's finally back to programming.
Now that we understand what synchronous and asynchronous mean in everyday scenarios (I hope), how should programmers understand them?
Let's start with synchronous calls, which is the scenario that programmers are most familiar with.
General function calls are synchronous, like this:
funcA() {
// wait for funcB to finish
funcB();
// then continue with the rest of the flow
}
Here funcA calls funcB, so no code after the call in funcA will be executed until funcB finishes; in other words, funcA must wait for funcB to complete, like this:
As we can see from the image above, funcA can't do anything while funcB is running, which is typical synchronization.
Note that, in general, with a synchronous call like this, funcA and funcB run on the same thread, which is the most common case.
It is worth noting, however, that even functions running on two different threads can be involved in a synchronous call. When we perform an I/O operation, a system call is actually issued under the hood to make the request; take disk file reading as an example:
read(file, buf);
This is blocking I/O: the program cannot move forward until the read function returns.
read(file, buf);
// the program pauses here,
// waiting for the file read to complete before continuing
As shown in the figure:
The program can only continue executing once the read function returns.
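The blocking behavior can be simulated with a short Python sketch; slow_read below is a made-up stand-in for a slow disk read, not a real API:

```python
import time

def slow_read(path):
    """Stand-in for a blocking read(): pretend the disk takes 0.2 s."""
    time.sleep(0.2)              # simulate disk latency
    return b"file contents"

start = time.monotonic()
data = slow_read("some_file")    # the caller is blocked here...
elapsed = time.monotonic() - start
# ...and only reaches this line after the "read" has completed
```

The caller makes no progress for the entire 0.2 seconds, which is exactly the "pause" in the figure.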
Note that unlike the synchronous call above, here the caller and the callee run on different threads.
Therefore, we can conclude that whether a call is synchronous has nothing to do with whether the caller and the callee run on the same thread.
Here we would also stress once again: in a synchronous call, the caller and the callee cannot make progress at the same time.
Synchronous programming is the most natural and easiest for programmers to understand.
But the price of that ease is that, in some scenarios, synchronization is inefficient, for the simple reason that tasks cannot be performed at the same time.
Let's look at asynchronous calls.
Where there are synchronous calls, there are asynchronous calls.
If you've really understood this section so far, asynchronous calls won't be a problem for you.
In general, asynchronous calls always appear alongside I/O operations and other time-consuming tasks, such as disk file reads and writes, sending and receiving network data, database operations, and so on.
Let's also take disk file reading as an example.
When the read function is called synchronously, the caller cannot move forward until the file has been read, but things are different if read can be called asynchronously.
If read can be called asynchronously, then even if the file has not yet been read, read can still return immediately.
read(file, buff);
// read returns immediately,
// without blocking the current program
It's like this:
As you can see, with an asynchronous call the caller is not blocked and can continue executing the next part of the program immediately after issuing the call.
The point of asynchrony here is that the caller's subsequent execution and the file read can happen at the same time, as we can see from the figure above; this is where the efficiency of asynchrony lies.
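One common way to get this behavior is to hand the read off to another thread. Here is a minimal Python sketch; async_read is an invented name for illustration, not a real API:

```python
import threading, time

def async_read(path, buf):
    """Return immediately; fill `buf` on a background thread."""
    def worker():
        time.sleep(0.1)            # simulate disk latency
        buf.append(b"file contents")
    t = threading.Thread(target=worker)
    t.start()
    return t                       # caller can wait on this later if needed

buf = []
start = time.monotonic()
t = async_read("some_file", buf)   # returns at once, file not read yet
returned_after = time.monotonic() - start

# The caller is free to do other work here while the read is in flight.
t.join()                           # for the demo only, wait for completion
```

The call returns in microseconds even though the "read" takes 0.1 s; caller and read overlap in time.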
Note, however, that asynchronous calls are a burden on the programmer's understanding and on code writing. In general, when God opens a door for you, He duly closes a window.
Some students may ask: in a synchronous call, the caller stops and waits instead of continuing, and once the callee finishes, the caller naturally resumes execution; but with an asynchronous call, how does the caller know whether the callee has finished?
This falls into two situations: either the caller does not care about the result of the call, or it does.
The first situation is simple and needs no discussion.
The second case is more interesting, and there are usually two ways to implement it:
One is the notification mechanism: when the task finishes, a signal is sent to notify the caller that the task is complete. Note that "signal" can be implemented in many ways: signal in Linux, semaphores, and other such mechanisms.
The other is the callback, what we usually call a callback function, which we'll cover in depth in the next article; here we'll discuss it briefly.
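Both mechanisms can be sketched in Python; threading.Event stands in for the notification signal, and a plain function plays the callback (the task names below are invented for illustration):

```python
import threading

results = []

# Mechanism 1: notification. The task signals an Event when it is done.
done = threading.Event()

def task_with_notification():
    results.append("task finished")
    done.set()                       # notify the caller

threading.Thread(target=task_with_notification).start()
done.wait()                          # caller learns of completion via the signal

# Mechanism 2: callback. The caller hands the task a function to run on completion.
def task_with_callback(callback):
    callback("task finished")        # invoke the caller-supplied function

task_with_callback(lambda msg: results.append("callback: " + msg))
```

With notification, the caller must check or wait for the signal; with a callback, the completion logic travels with the request.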
Next, let's use a specific example to explain synchronous and asynchronous calls.
Let's illustrate this problem with common Web services.
In general, after a Web Server receives a user request there is some typical processing logic, the most common being a database query (you can also substitute another I/O operation here, such as a disk read or network communication). Let's assume that processing a user request requires steps A, B, C, then a database read, and then steps D, E, F after the database is read, like this:
# Steps to process one user request:
A;
B;
C;
database read;
D;
E;
F;
Steps A, B, C and D, E, F require no I/O; that is, these six steps involve no file reading, network communication, and so on. Only the database query is an I/O operation.
In general, such a Web Server has two typical threads: the main thread and a database-processing thread. Note that this is just one typical scenario; real business setups can differ, but that doesn't stop us from using two threads to illustrate the problem.
First, let's look at the simplest way to implement it, which is synchronization.
This approach is the most natural and easy to understand:
// main thread
main_thread() {
A;
B;
C;
send database query request;
D;
E;
F;
}
// database thread
DataBase_thread() {
while(1) {
handle database read request;
return result;
}
}
This is the most typical synchronous method: after issuing the database query, the main thread is blocked and suspended, and only after the query completes can it continue with D, E, and F, like this:
As we can see from the figure, there is a "gap" in the main thread, its "leisure time", during which it must wait for the database query to complete before proceeding with the subsequent processing.
Here the main thread is like the supervising boss and the database thread is like a programmer forced to move bricks: while you move the bricks, the boss does nothing but stare at you, and only after you finish does he go off to busy himself with other things.
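This synchronous design can be sketched in Python using two queues; the main thread blocks on reply_q.get() exactly where the "gap" appears in the figure (all names are illustrative):

```python
import queue, threading

request_q = queue.Queue()   # main thread -> database thread
reply_q = queue.Queue()     # database thread -> main thread
log = []

def database_thread():
    while True:
        req = request_q.get()            # handle database read requests
        if req is None:                  # shutdown sentinel
            break
        reply_q.put(f"result of {req}")  # return the result

t = threading.Thread(target=database_thread)
t.start()

def handle_one_request(req):
    log.extend(["A", "B", "C"])   # first half of the request
    request_q.put(req)            # send the database query...
    result = reply_q.get()        # ...and block until it completes (the "gap")
    log.extend(["D", "E", "F"])   # second half of the request
    return result

result = handle_one_request("query1")
request_q.put(None)               # stop the database thread
t.join()
```

While the main thread sits in reply_q.get(), it can do nothing else; that blocking call is the "leisure time".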
Obviously, efficient programmers can't tolerate lazy main threads.
It's time to bring out the big gun: asynchronous.
In this asynchronous implementation scenario, the main thread does not wait for the database to complete the query at all, but instead processes the next request directly after sending the database read and write request.
Some students may have a question: a request needs to go through seven steps, A, B, C, database query, D, E, F. If the main thread moves straight on to the next request after completing A, B, C and issuing the database query, what happens to the remaining D, E, F of the previous request?
If you haven't forgotten the previous section, you'll know there are two situations; let's discuss them separately.
1. The main thread does not care about the result of the database operation
In this case, the main thread simply doesn't care whether the database query has finished; after the query completes, the remaining three steps D, E, and F are handled by the database thread on its own, like this:
See, here's the point.
We said that a request goes through seven steps, the first three done in the main thread and the remaining four in the database thread. So how does the database thread know to process D, E, and F after querying the database?
At this point, our other protagonist, the callback function, begins to appear.
Yes, the callback function is used to solve this problem.
We can encapsulate the steps D, E, and F into a function; suppose we name it handle_DEF_after_DB_query:
void handle_DEF_after_DB_query () {
D;
E;
F;
}
This way, when sending the database query request, the main thread passes this function along as an argument:
DB_query(request, handle_DEF_after_DB_query);
When the database thread finishes its query, it simply calls handle_DEF_after_DB_query directly; this is all a callback function is.
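Here is a minimal runnable Python sketch of this callback design; DB_query becomes a request queue, the database work is faked, and all names are illustrative:

```python
import queue, threading

db_request_q = queue.Queue()
steps_done = []

def database_thread():
    while True:
        item = db_request_q.get()
        if item is None:                 # shutdown sentinel
            break
        request, callback = item
        result = f"rows for {request}"   # pretend to query the database
        callback(result)                 # invoke the caller-supplied callback

def handle_DEF_after_DB_query(result):
    steps_done.extend(["D", "E", "F"])   # second half of the request
    steps_done.append(result)

t = threading.Thread(target=database_thread)
t.start()

# Main thread: do A, B, C, then hand off the query together with the callback.
steps_done.extend(["A", "B", "C"])
db_request_q.put(("query1", handle_DEF_after_DB_query))

db_request_q.put(None)   # shut down after the pending request
t.join()
```

Note that D, E, F run on the database thread here; the main thread is free to pick up the next request immediately after the put().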
Some students may wonder: why pass this function to the database thread rather than have the database thread define and call it itself?
Because, in terms of software organization, this is not the database thread's job.
All the database thread needs to do is query the database and then invoke a handler function; it doesn't need to know, or care, what that handler does.
You can pass in all kinds of callback functions. That is, the database code is programmed against an abstract "function variable", the callback, which makes it better able to cope with change: changes inside the callback do not affect the database thread's logic. If the database thread defined the handler itself, the design would have no flexibility at all.
From a software-engineering perspective, suppose the database thread logic is packaged as a library and delivered to other teams. How could the database team possibly know, at development time, what should happen after a database query?
Obviously, only the user knows what to do after the query, so the user simply passes in the callback function when using the library.
In this way, the team maintaining the complex database code and the teams using it achieve what is called decoupling.
Now you should understand what callback functions do.
Let's move on.
Now look closely at the two figures above: can you see why asynchronous is more efficient than synchronous?
The reason is simple, and it's what we've said throughout this article: asynchrony naturally involves no waiting and no dependency.
From the latter figure we can see that the main thread's "leisure time" is gone, replaced by constant work, work, work, like a hard-pressed 996 programmer; and the database thread no longer sits largely idle either, but also works, works, works.
The main thread's request processing and the database thread's query processing can happen simultaneously. From a system-performance standpoint, this design makes full use of system resources and processes requests faster; from the user's standpoint, the system responds more quickly.
That's where the asynchronous efficiency comes in.
But we should also recognize that asynchronous programming is not as easy to understand as synchronous programming, and system maintainability suffers as well.
So is there a way to combine the ease of understanding of synchronization mode with the efficiency of asynchronous mode? The answer is yes, and we'll cover this technique in more detail in a later section.
Let's look at the second scenario, which is that the main thread needs to care about the results of the database query.
2. The main thread cares about the results of the database operation
In this case, the database thread needs to send the query result to the main thread via the notification mechanism, and on receiving the message the main thread continues with the second half of the previous request, like this:
From this we can see that all of the steps A through F are processed in the main thread, and the main thread again has no "leisure time"; but in this case the database thread is relatively idle. This approach is less efficient than the previous one, but still more efficient than the synchronous mode.
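This second case can also be sketched in Python: the database thread pushes the result onto a completion queue, a simple form of the notification mechanism, and the main thread runs D, E, F itself when the notification arrives (all names are illustrative):

```python
import queue, threading

db_request_q = queue.Queue()   # main -> database
completion_q = queue.Queue()   # database -> main (the notification channel)
log = []

def database_thread():
    while True:
        req = db_request_q.get()
        if req is None:                             # shutdown sentinel
            break
        completion_q.put((req, f"rows for {req}"))  # notify main with the result

t = threading.Thread(target=database_thread)
t.start()

# Main thread: first half of the request, then hand off the query.
log.extend(["A", "B", "C"])
db_request_q.put("query1")

# Main thread event loop: when the notification arrives,
# run the second half (D, E, F) here on the main thread.
req, result = completion_q.get()
log.extend(["D", "E", "F"])
log.append(result)

db_request_q.put(None)
t.join()
```

In a real server the main thread would interleave other requests instead of blocking on completion_q.get(); the sketch blocks only to keep the demo deterministic.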
Finally, it's important to note that asynchronous is not more efficient than synchronous in every situation; you need to analyze each case in context, considering the complexity of your specific business logic and I/O.
In this article we analyzed the concepts of synchronous and asynchronous across various scenarios. In any scenario, synchronous usually means the two sides must wait on and depend on each other, while asynchronous means the two sides are independent and each does its own thing. I hope this article helps you understand these two important concepts.
Source: WeChat official account
Author: Survival on a Coder's Desert Island