SYSTEMS DESIGN — CACHING

Caching is a computing technique that improves a system’s performance by keeping frequently accessed data in a fast, temporary store. It reduces latency and minimizes the workload on the underlying storage system by ensuring that this data is readily accessible.

Ben Turner
2 min read · Jun 16, 2023

WHAT IS CACHING

Caching is a technique used in computer systems to speed up data storage and retrieval.

This is achieved by storing frequently accessed data in a temporary storage location - a cache - so that it can be accessed quickly.

A cache is a high-speed, relatively small memory that sits between the main storage, such as a disk or database, and the requesting component, such as a processor or application.
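
To make the idea concrete, here is a minimal cache-aside sketch in Python; the slow_storage dictionary and fetch_user function are made-up stand-ins for a real database and data-access layer:

# Cache-aside: check the cache first, fall back to the slower storage
# on a miss, then populate the cache so the next request is fast.
# `slow_storage` is a hypothetical stand-in for a disk or database.
slow_storage = {"user:1": {"name": "Ada"}, "user:2": {"name": "Grace"}}
cache = {}

def fetch_user(key):
    if key in cache:               # cache hit: fast path
        return cache[key]
    value = slow_storage[key]      # cache miss: slow path
    cache[key] = value             # keep a copy for future requests
    return value

print(fetch_user("user:1"))        # miss: loaded from slow storage
print(fetch_user("user:1"))        # hit: served straight from the cache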

Web browsers store HTML files, images, and JavaScript in their cache in order to load websites faster.

CDN servers store content in their cache to reduce latency.

DNS servers store DNS records in their cache for faster lookups.

TYPES OF CACHES

Caches can be implemented at various levels of a distributed system: within a single server, across multiple servers, or on the client side.

Distributed systems commonly use various types of caches.

📟 MEMORY CACHES

These caches store frequently accessed data in a high-speed memory location, such as RAM, which is closer to the processing units.

This arrangement decreases the time needed to retrieve data from slower storage devices, such as hard disks or databases.
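
A quick way to see this in code is Python's built-in functools.lru_cache, which keeps recent results of a function call in RAM; the expensive_lookup function below is a hypothetical slow operation used only for illustration:

import time
from functools import lru_cache

@lru_cache(maxsize=1024)           # keep up to 1024 recent results in memory
def expensive_lookup(key):
    time.sleep(0.1)                # simulate a slow read from disk or a database
    return f"value-for-{key}"

expensive_lookup("a")              # slow: computed and stored in the cache
expensive_lookup("a")              # fast: returned straight from RAM
print(expensive_lookup.cache_info())   # hits, misses, and current cache size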

💿 DISK CACHES

Disk caches, sometimes referred to as buffer caches, store recently accessed data from disks in memory.

This enhances disk I/O performance by minimizing the need for frequent physical reads or writes.
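
Real buffer caches live inside the operating system kernel, but the idea can be sketched in a few lines; the block size, file name, and read_block helper below are illustrative assumptions:

import os

BLOCK_SIZE = 4096
block_cache = {}                   # (path, block number) -> bytes

def read_block(path, block_number):
    key = (path, block_number)
    if key in block_cache:         # recently read block: no physical I/O
        return block_cache[key]
    with open(path, "rb") as f:    # physical disk read on a miss
        f.seek(block_number * BLOCK_SIZE)
        data = f.read(BLOCK_SIZE)
    block_cache[key] = data
    return data

with open("demo.bin", "wb") as f:  # create a small file for the example
    f.write(os.urandom(2 * BLOCK_SIZE))

read_block("demo.bin", 0)          # miss: read from disk
read_block("demo.bin", 0)          # hit: served from memory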

🌐 WEB CACHES

Web caches, commonly referred to as proxy caches, store copies of web content that clients have requested.

When subsequent requests for the same content are made, the cache can directly serve the data, reducing the load on backend servers and enhancing response times.
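
As a rough sketch, a caching proxy boils down to "look up the URL, forward on a miss, remember the response"; the get helper and example URL below are assumptions, and a real proxy would also honor Cache-Control headers and expiry times:

from urllib.request import urlopen

response_cache = {}                # URL -> response body

def get(url):
    if url in response_cache:      # repeat request: serve from the cache
        return response_cache[url]
    body = urlopen(url).read()     # forward the request to the origin server
    response_cache[url] = body     # keep a copy for later requests
    return body

get("https://example.com/")        # first request hits the origin server
get("https://example.com/")        # second request is served by the proxy cache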

🌎 CONTENT DELIVERY NETWORKS (CDNs)

CDNs (Content Delivery Networks) are a form of distributed caching in which copies of content, such as images, videos, or web pages, are stored on servers spread across different geographic locations.

When a user requests content, the CDN reduces network latency by serving the content from the server that is closest to the user.
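
The routing idea can be illustrated with a toy example; the edge locations, distances, and cached file below are entirely made up, and a real CDN would resolve the nearest edge through DNS or anycast routing:

edge_servers = {
    "us-east": {"logo.png": b"cached bytes"},
    "eu-west": {"logo.png": b"cached bytes"},
}
distance_from_user = {"us-east": 70, "eu-west": 10}   # e.g. a user in Europe

def serve(path):
    nearest = min(edge_servers, key=distance_from_user.get)
    return edge_servers[nearest].get(path)            # served from the nearest edge

print(serve("logo.png"))           # comes from the eu-west edge cache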

⌗ DATABASE CACHES

Database caches store frequently accessed data or query results in memory to speed up database operations.

By keeping this data readily available, the cache significantly improves response times and reduces the load on the database server.
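
Here is a minimal sketch of the pattern using an in-memory SQLite database and a made-up products table; real systems often keep the query cache in a separate store such as Redis or Memcached:

import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT)")
conn.execute("INSERT INTO products VALUES (1, 'widget')")

query_cache = {}                   # SQL text -> (result, time it was cached)
TTL = 30                           # seconds before a cached result expires

def run_query(sql):
    entry = query_cache.get(sql)
    if entry and time.time() - entry[1] < TTL:
        return entry[0]            # fresh cached result: skip the database
    result = conn.execute(sql).fetchall()
    query_cache[sql] = (result, time.time())
    return result

print(run_query("SELECT name FROM products"))   # hits the database
print(run_query("SELECT name FROM products"))   # served from the cache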

