Technology Fundamentals

Concurrency

Definition

Concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order without affecting the final outcome. Informally, it is about dealing with lots of things at once.
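
The definition can be made concrete with a short sketch. The Go snippet below is purely illustrative (this entry does not prescribe a language): three units of work run as goroutines and may be scheduled in any order, yet the final result is always the same.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var (
            mu    sync.Mutex
            total int
            wg    sync.WaitGroup
        )

        // Each unit of work adds its value to the shared total. The order in
        // which the goroutines run is not fixed, but the outcome (1+2+3 = 6)
        // does not depend on that order.
        for _, v := range []int{1, 2, 3} {
            wg.Add(1)
            go func(n int) {
                defer wg.Done()
                mu.Lock()
                total += n
                mu.Unlock()
            }(v)
        }

        wg.Wait()
        fmt.Println("total:", total) // always prints 6
    }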

Why It Matters

Concurrency is essential for modern applications, especially on servers that handle requests from many users simultaneously. It allows a system to make progress on multiple tasks, improving overall throughput and responsiveness.

Contextual Example

A web server is concurrent. It can handle a request to load a homepage, start processing a file download for another user, and accept a login attempt from a third user, all seemingly at the same time.
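
A minimal sketch of such a server is shown below, again in Go (an illustrative choice; the routes, port, and sleep duration are invented for the example). Go's net/http package handles each incoming request in its own goroutine, so a slow download does not block the homepage or a login attempt.

    package main

    import (
        "fmt"
        "log"
        "net/http"
        "time"
    )

    func main() {
        // Fast request: loading the homepage.
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "homepage")
        })

        // Slow request: simulate a large file download.
        http.HandleFunc("/download", func(w http.ResponseWriter, r *http.Request) {
            time.Sleep(5 * time.Second)
            fmt.Fprintln(w, "file contents")
        })

        // A login attempt is accepted even while a download is in flight,
        // because each request is served in its own goroutine.
        http.HandleFunc("/login", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "login accepted")
        })

        log.Fatal(http.ListenAndServe(":8080", nil))
    }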

Common Misunderstandings

  • Concurrency is not parallelism. Concurrency is the composition of independently executing computations, while parallelism is the simultaneous execution of those computations. You can have concurrency on a single-core processor, but you need a multi-core processor for true parallelism.
  • Managing shared resources in concurrent programs is a major challenge. Unsynchronized access to shared data leads to race conditions, and careless locking can lead to deadlocks; see the sketch after this list.
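
The race-condition problem can be shown in a few lines of Go (again an illustrative choice; the function names unsafeCount and safeCount and the count of 1,000 goroutines are invented for the sketch): unsynchronized increments of a shared counter lose updates, while a mutex makes the result deterministic.

    package main

    import (
        "fmt"
        "sync"
    )

    // unsafeCount increments a shared counter from 1,000 goroutines with no
    // synchronization. The read-modify-write in counter++ can interleave, so
    // updates are lost and the result is usually less than 1000.
    func unsafeCount() int {
        var (
            counter int
            wg      sync.WaitGroup
        )
        for i := 0; i < 1000; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                counter++ // data race
            }()
        }
        wg.Wait()
        return counter
    }

    // safeCount does the same work but guards the counter with a mutex, so
    // only one goroutine updates it at a time and the result is always 1000.
    func safeCount() int {
        var (
            counter int
            mu      sync.Mutex
            wg      sync.WaitGroup
        )
        for i := 0; i < 1000; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                mu.Lock()
                counter++
                mu.Unlock()
            }()
        }
        wg.Wait()
        return counter
    }

    func main() {
        fmt.Println("without mutex:", unsafeCount()) // often less than 1000
        fmt.Println("with mutex:   ", safeCount())   // always 1000
    }

Running the sketch with Go's race detector (go run -race) reports the data race in unsafeCount. A deadlock, by contrast, would arise if two goroutines each held one lock while waiting forever for the other's.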

Related Terms

Parallelism, Race Condition, Deadlock

Last Updated: December 17, 2025