Technology Fundamentals
Cache
Definition
A cache is a hardware or software component that stores data so that future requests for that data can be served faster. The data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
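The definition above can be sketched in a few lines of Python: a dictionary acts as the fast-access store, checked before falling back to the slower original source. The `slow_fetch` function is a hypothetical stand-in for an earlier computation or a lookup against data stored elsewhere.

```python
cache = {}

def slow_fetch(key):
    # Hypothetical stand-in for an expensive operation
    # (database query, network call, heavy computation).
    return key.upper()

def get(key):
    if key in cache:            # cache hit: serve the stored copy
        return cache[key]
    value = slow_fetch(key)     # cache miss: do the expensive work once
    cache[key] = value          # keep the result for future requests
    return value
```

After the first `get("photo")`, every later request for the same key is served from the dictionary without touching `slow_fetch` again.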
Why It Matters
Caching is a critical technique for improving performance. By storing frequently accessed data in a fast-access layer, it reduces latency and decreases the load on slower, backend systems like databases or external APIs.
Contextual Example
Your web browser caches images and other files from websites you visit. When you revisit a site, the browser loads these files from its local cache instead of re-downloading them, making the page load much faster.
Common Misunderstandings
- Data in a cache can become "stale" if the original data source changes. Cache invalidation (deciding when to remove or refresh cached entries) is famously one of the hardest problems in computer science.
- Caching can happen at many levels: in the CPU, in the operating system, in a web browser, and in dedicated caching servers like Redis.
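Staleness is often handled by giving each entry a time-to-live (TTL), after which it is invalidated automatically. Below is a minimal sketch of that idea; the class name and the use of `time.monotonic` timestamps are illustrative choices, not a reference to any particular library.

```python
import time

class TTLCache:
    """Tiny time-to-live cache: entries expire ttl seconds after
    being stored, a simple automatic invalidation policy."""

    def __init__(self, ttl):
        self.ttl = ttl
        self.store = {}  # key -> (value, timestamp)

    def put(self, key, value):
        self.store[key] = (value, time.monotonic())

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, stamp = entry
        if time.monotonic() - stamp > self.ttl:
            del self.store[key]  # entry went stale: invalidate it
            return None
        return value
```

A TTL trades freshness for simplicity: data may be stale for up to `ttl` seconds, but no coordination with the original source is needed.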