Caching System: Part 1

Hey Everyone! How is it going? In this article, we'll learn about caching – what it is, why we need it, and its common use cases. I hope this helps you understand the concept well.

Introduction to Caching

If you notice that your application is starting to slow down, the reason is probably a bottleneck somewhere in the execution chain. Maybe it's a bug, maybe somebody didn't set up the optimal configuration, or maybe it's simply the process of fetching the data.

Deciding to use caching in your application is just the first step of a very long journey. Almost everything relies on caching nowadays: from CPUs to browsers to web applications, all software depends on it to provide fast responses. A latency of a few milliseconds can cause a billion-dollar loss for a huge company, so sub-millisecond responses are an everyday need.

There is a vast number of caching solutions available on the market, but that doesn't mean any of these technologies will solve your problem. So, I will try to explain the factors that help you decide which caching strategy suits your problem, and we will also discuss the features and real-world use cases of solutions already on the market, so that you know their common use cases before adopting them.

In computing, a cache is a high-speed data storage layer that stores data, so that future requests for that data are served up faster than is possible by accessing the data’s primary storage location.

Caching allows you to efficiently reuse previously retrieved or computed data.

The data in a cache is generally stored in fast-access hardware such as RAM (random-access memory) and may also be used in conjunction with a software component. A cache's primary purpose is to increase data retrieval performance by reducing the need to access the underlying, slower storage layer.
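
To make the idea concrete, here is a minimal sketch in Python of how an in-process cache avoids repeatedly hitting a slow storage layer. The `slow_lookup` function and its one-second delay are hypothetical stand-ins for a database or network call, not something from any particular caching product.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=1024)            # keep up to 1024 results in memory
def slow_lookup(key: str) -> str:
    time.sleep(1)                   # stand-in for a slow database or network call
    return f"value-for-{key}"

start = time.time()
slow_lookup("user:42")              # first call: ~1 second, hits the "slow" layer
print(f"cold read took {time.time() - start:.2f}s")

start = time.time()
slow_lookup("user:42")              # repeat call: answered from the in-memory cache
print(f"warm read took {time.time() - start:.4f}s")
```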

Caching Benefits

1. Improve Application Performance

When content is cached closer to the consumer, requests do not cause much additional network activity beyond the cache. Caching enables content to be retrieved faster because an entire network round trip is not necessary.

2. Reduce Database Cost

A single cache instance can absorb hundreds of thousands of IOPS (input/output operations per second) that would otherwise hit the database, thus driving the total cost down. This is especially significant if the primary database charges per throughput; in those cases, the savings could be dozens of percentage points.

3. Reduce the Load on the Backend

By redirecting significant parts of the read load from the backend database to the in-memory layer, caching can reduce the load on your database and protect it from slower performance under load, or even from crashing during traffic spikes.

4. Availability of Content During Network Interruptions

Caching can be used to serve content to end users even when it is unavailable from the origin servers for short periods of time.

Cache Use Cases

1. Database Caching

The performance your database provides can be the most impactful factor in your application's overall performance. And despite the fact that many databases today offer relatively good performance, for a lot of use cases your applications may require more. Database caching allows you to increase throughput and lower the data retrieval latency associated with backend databases, which improves the overall performance of your applications. A database cache layer can be applied in front of any type of database, including SQL and NoSQL databases.
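
As a sketch of how such a cache layer is commonly used (the cache-aside pattern), here is a hedged Python example with the redis-py client. The connection details, key format, TTL, and the `fetch_user_from_db` function are illustrative assumptions, not a prescribed setup.

```python
import json
import redis  # pip install redis

cache = redis.Redis(host="localhost", port=6379, db=0)  # assumed local Redis instance
CACHE_TTL_SECONDS = 300                                  # keep entries for 5 minutes

def fetch_user_from_db(user_id: int) -> dict:
    # Hypothetical placeholder for a real SQL/NoSQL query.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)                  # 1. try the cache first
    if cached is not None:
        return json.loads(cached)            # cache hit: no database round trip
    user = fetch_user_from_db(user_id)       # 2. cache miss: read from the database
    cache.setex(key, CACHE_TTL_SECONDS, json.dumps(user))  # 3. populate the cache
    return user
```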

2. Domain Name System (DNS) Caching

Every request for a domain queries DNS cache servers in order to resolve the IP address associated with the domain name. DNS caching can occur at many levels, including in the OS, via ISPs, and on DNS servers.
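
Here is a small sketch of the same idea at the application level: remember a resolved address for a short time instead of querying DNS on every call. The fixed 60-second TTL is an assumption for illustration; real resolvers honor the TTL carried in the DNS record itself.

```python
import socket
import time

_dns_cache: dict[str, tuple[str, float]] = {}   # hostname -> (ip, expiry timestamp)
DNS_TTL_SECONDS = 60                             # assumed TTL for this sketch

def resolve(hostname: str) -> str:
    entry = _dns_cache.get(hostname)
    if entry and entry[1] > time.time():
        return entry[0]                          # cached answer, no DNS query needed
    ip = socket.gethostbyname(hostname)          # actual DNS lookup
    _dns_cache[hostname] = (ip, time.time() + DNS_TTL_SECONDS)
    return ip

print(resolve("example.com"))   # first call performs a DNS lookup
print(resolve("example.com"))   # second call is served from the cache
```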

3. Application Programming Interface (API) Caching

An API is generally a RESTful web service that can be accessed over HTTP and exposes resources that allow the user to interact with the application. When designing an API, it's important to consider the expected load on it. It's not always the case that an API needs to instantiate business logic and/or make backend requests to a database on every request. Sometimes serving a cached result of the API will deliver the most performant and cost-effective response. This is especially true when you are able to cache the API response to match the rate of change of the underlying data. By caching your API responses, you relieve pressure on your infrastructure, including your application servers and databases. You also gain faster response times and deliver a more performant API.
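
A minimal sketch of that approach: cache an endpoint's result for a window that matches how quickly the underlying data changes. The decorator, the 30-second TTL, and the `get_top_products` endpoint are assumptions made up for this example.

```python
import time
from functools import wraps

def cache_response(ttl_seconds: int):
    """Cache a function's return value per argument tuple for ttl_seconds."""
    store: dict[tuple, tuple[object, float]] = {}

    def decorator(func):
        @wraps(func)
        def wrapper(*args):
            now = time.time()
            hit = store.get(args)
            if hit and hit[1] > now:
                return hit[0]                    # serve the cached response
            result = func(*args)                 # run business logic / query the database
            store[args] = (result, now + ttl_seconds)
            return result
        return wrapper
    return decorator

@cache_response(ttl_seconds=30)                  # the data changes slowly, so 30s is fine here
def get_top_products(category: str) -> list[str]:
    # Hypothetical expensive backend query behind the API endpoint.
    return [f"{category}-product-{i}" for i in range(3)]

print(get_top_products("books"))                 # computed once...
print(get_top_products("books"))                 # ...then served from the cache for 30 seconds
```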

4. Token Caching

API Tokens can be cached to deliver high-performance user authentication and validation.
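
As a brief sketch (the `validate_with_auth_service` call and the 60-second cache window are hypothetical), a service can cache the result of validating a token so it does not have to contact the authentication service on every request:

```python
import time

_validated: dict[str, tuple[dict, float]] = {}      # token -> (claims, expiry timestamp)

def validate_with_auth_service(token: str) -> dict:
    # Hypothetical remote call that verifies the token and returns its claims.
    return {"sub": "user-123", "scope": "read"}

def authenticate(token: str) -> dict:
    hit = _validated.get(token)
    if hit and hit[1] > time.time():
        return hit[0]                               # token validated recently: fast path
    claims = validate_with_auth_service(token)      # slow path: ask the auth service
    _validated[token] = (claims, time.time() + 60)  # cache the result for 60 seconds
    return claims
```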

5. Web Page Caching

To make a web app lightweight and flexible, you can create dynamic web pages on the server and serve them through an API along with the appropriate data. So if you have millions of users, you can serve such on-the-fly-generated full web pages from the cache for a certain period of time.
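
A sketch of that idea, assuming a hypothetical `render_page` function that builds the full HTML on the fly; the 5-minute TTL is an arbitrary choice for illustration:

```python
import time

_page_cache: dict[str, tuple[str, float]] = {}   # path -> (html, expiry timestamp)
PAGE_TTL_SECONDS = 300                           # reuse the same rendered page for 5 minutes

def render_page(path: str) -> str:
    # Hypothetical server-side rendering: templates plus data from the database.
    return f"<html><body>Rendered content for {path}</body></html>"

def serve_page(path: str) -> str:
    hit = _page_cache.get(path)
    if hit and hit[1] > time.time():
        return hit[0]                            # millions of users get the cached page
    html = render_page(path)                     # render at most once per TTL window
    _page_cache[path] = (html, time.time() + PAGE_TTL_SECONDS)
    return html
```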

Conclusion

In this article, we saw that caching is really awesome, and we went through its benefits and some of its real-world use cases.

Thank you for reading. Do drop your feedback in the comments below, and if you liked it, share it with your friends who might find it useful too. If you want to have any discussion around this topic, feel free to reach out to me on Facebook.