Concurrency versus parallelism.

These two concepts are closely related: both describe the ability to make progress on multiple things at the same time. Where they differ is in what “at the same time” actually means.

In parallelism, things happen at literally the same time: as many units of work can run simultaneously as there are available executors, such as CPU cores.
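A minimal sketch of this idea in Python, assuming a CPU-bound task handed to a process pool sized to the number of available cores (the `square` function is a hypothetical stand-in for real work):

```python
import os
from concurrent.futures import ProcessPoolExecutor

def square(n):
    # CPU-bound work; each call can run on its own core.
    return n * n

if __name__ == "__main__":
    workers = os.cpu_count()  # the number of available executors
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(square, range(8)))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

With at most `os.cpu_count()` worker processes, each unit of work can genuinely execute at the same instant as the others, up to the number of cores.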

In concurrency, tasks and subtasks are queued, and the executors schedule their execution according to their scheduling rules. Multiple activities can therefore advance at roughly the same time, though each executor can make progress on only one unit of work at any specific moment.

These definitions suggest two things. First, parallel executors must each implement concurrency whenever it is possible for there to be more active jobs than executors. Second, concurrency carries overhead that pure parallelism does not: each time an executor jumps between its sundry responsibilities, the context of one job must be saved and the context of the next loaded.
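The “more jobs than executors” situation can be sketched with a thread pool, assuming tasks that mostly wait (as in I/O) so that two workers interleave six queued jobs; `task`, `active`, and `peak` are illustrative names, not from the original text:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

lock = threading.Lock()
active = 0  # jobs currently in flight
peak = 0    # the most jobs ever in flight at once

def task(i):
    global active, peak
    with lock:
        active += 1
        peak = max(peak, active)
    time.sleep(0.05)  # stand-in for waiting on I/O
    with lock:
        active -= 1
    return i

# Two executors, six queued jobs: the pool schedules them in turn.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(task, range(6)))

print(results)  # results arrive in submission order: [0, 1, 2, 3, 4, 5]
print(peak)     # never exceeds the 2 available workers
```

All six jobs advance at roughly the same time, but at any given moment no more than two, one per worker, are actually progressing; the pool’s bookkeeping as it switches between queued jobs is the context-management overhead described above.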