Event Loop
Async is about the gap between now and later.
Definition: the mechanism that handles executing multiple chunks of your program over time, invoking the JS engine at each moment.
So, for example, when your JS program makes an Ajax request to fetch some data from a server, you set up the “response” code in a function (commonly called a “callback”), and the JS engine tells the hosting environment, “Hey, I’m going to suspend execution for now, but whenever you finish with that network request, and you have some data, please call this function back.”
The browser is then set up to listen for the response from the network, and when it has something to give you, it schedules the callback function to be executed by inserting it into the event loop.
```js
// `eventLoop` is an array that acts as a queue (first-in, first-out)
```
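That queue comment hints at how the loop itself works. A minimal conceptual sketch follows; a real event loop runs "forever" and is provided by the hosting environment, so the bounded loop, the `log` array, and the scheduled callbacks here are illustrative assumptions, not an actual engine API:

```javascript
// A conceptual sketch of the event loop: an array used as a FIFO queue,
// drained one callback (one "tick") at a time.
var eventLoop = [];
var log = [];

// schedule two hypothetical events
eventLoop.push(function () { log.push("tick 1"); });
eventLoop.push(function () { log.push("tick 2"); });

// a real event loop runs "forever"; here we stop when the queue is empty
while (eventLoop.length > 0) {
    // take the oldest event off the front of the queue
    var event = eventLoop.shift();

    // run it to completion; an error must not kill the loop
    try {
        event();
    } catch (err) {
        log.push("error: " + err.message);
    }
}

console.log(log); // → [ "tick 1", "tick 2" ]
```

The key property this models is that events are processed one at a time, in arrival order: a callback scheduled later can never interrupt one already running.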
Parallel Threading
Parallel is about things being able to occur simultaneously.
The most common tools for parallel computing are processes and threads. Processes and threads execute independently and may run simultaneously, on separate processors or even separate computers; multiple threads, however, can share the memory of a single process.
An event loop, by contrast, breaks its work into tasks and executes them in serial, disallowing parallel access and changes to shared memory. Parallelism and “serialism” can coexist in the form of cooperating event loops in separate threads.
```js
var a = 20;
```
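Filling that snippet out, a minimal sketch (with the two callbacks called directly here, instead of being handed to some real Ajax API) shows why the ordering of whole callbacks matters:

```javascript
// Sketch: two callbacks share `a`; the final value depends on which
// one the event loop runs first.
var a = 20;

function foo() {
    a = a + 1;
}

function bar() {
    a = a * 2;
}

// If foo() completes first: a = (20 + 1) * 2 = 42
// If bar() completes first: a = (20 * 2) + 1 = 41
foo();
bar();
console.log(a); // → 42 (this ordering); the reverse would give 41
```

On the event loop, each function runs as a whole before the other starts; in a threaded model, even the individual statements of `foo()` and `bar()` could interleave, producing many more possible outcomes.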
JavaScript never shares data across threads, which means that level of nondeterminism isn’t a concern. But that doesn’t mean JS is always deterministic.
Run-to-Completion
Because of JavaScript’s single-threading, the code inside of foo() (and bar()) is atomic, which means that once foo() starts running, the entirety of its code will finish before any of the code in bar() can run, or vice versa. This is called “run-to-completion” behavior.
Applied to JavaScript's behavior, this function-ordering nondeterminism is what the common term "race condition" describes: foo() and bar() are racing against each other to see which runs first. Specifically, it's a race condition because you cannot reliably predict how a and b will turn out.
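A sketch of the kind of shared-state code this describes (the particular statements are assumed for illustration): because each function runs to completion, only two outcomes are possible, rather than the many statement-level interleavings threads would allow:

```javascript
// Sketch: run-to-completion means only two outcomes exist, depending
// on which callback the event loop dequeues first.
var a = 1;
var b = 2;

function foo() {
    a++;
    b = b * a;
    a = b + 3;
}

function bar() {
    b--;
    a = 8 + b;
    b = a * 2;
}

// Outcome 1: foo() runs first -> a = 11, b = 22
// Outcome 2: bar() runs first -> a = 183, b = 180
foo();
bar();
console.log(a, b); // → 11 22 (the foo()-first ordering)
```

The race is between whole functions, not individual statements, which is why the nondeterminism stays at the level of "which outcome," not "which garbled mixture."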
Concurrency
Let’s imagine a site that displays a list of status updates (like a social network news feed) that progressively loads as the user scrolls down the list. To make such a feature work correctly, (at least) two separate “processes” will need to be executing simultaneously.
- The first “process” will respond to onscroll events (making Ajax requests for new content) as they fire when the user has scrolled the page further down.
- The second “process” will receive Ajax responses back (to render content onto the page).
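The two "processes" can be sketched as event-loop tasks. Everything here is a stand-in: `queue` simulates the event loop, and `onScroll`/`onResponse` are hypothetical handlers, not a real browser API:

```javascript
// Sketch: "Process 1" handles scroll events, "Process 2" handles the
// Ajax responses; both interleave as tasks on a simulated event loop.
var queue = [];
var rendered = [];

function onScroll(page) {
    // "Process 1": a scroll event schedules a request; the response
    // arrives later as another task on the queue
    queue.push(function () {
        onResponse("content for page " + page);
    });
}

function onResponse(content) {
    // "Process 2": render each response as it arrives
    rendered.push(content);
}

// the user scrolls twice; the responses are processed one at a time
onScroll(1);
onScroll(2);
while (queue.length > 0) {
    queue.shift()();
}

console.log(rendered); // two pages rendered, in arrival order
```

Even though the two "processes" are logically simultaneous, their individual events still execute one at a time on the single queue.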
Concurrency is when two or more “processes” are executing simultaneously over the same period, regardless of whether their individual constituent operations happen in parallel (at the same instant on separate processors or cores) or not. You can think of concurrency then as “process”-level (or task-level) parallelism, as opposed to operation-level parallelism (separate-processor threads).
Event Loop Queue:

```
onscroll, request 1   <--- Process 1 starts
onscroll, request 2
response 1            <--- Process 2 starts
onscroll, request 3
response 2
response 3
onscroll, request 4
onscroll, request 5
onscroll, request 6
response 4
onscroll, request 7   <--- Process 1 finishes
response 6
response 5
response 7            <--- Process 2 finishes
```
“Process 1” and “Process 2” run concurrently (task-level parallel), but their individual events run sequentially on the event loop queue.