Document Details

Uploaded by DecisiveGreatWallOfChina1467

Tags

caching, computer science, system design, database optimization

Summary

This document discusses caching strategies, including client caching, CDN caching, web server caching, database caching, and application caching. It explains how caching can improve page load times and reduce server load by storing frequently accessed data for later retrieval. The document also covers cache invalidation and other related concepts.

Full Transcript


13 Cache

*Source: Scalable system design patterns*

Caching improves page load times and can reduce the load on your servers and databases. In this model, the dispatcher first looks up whether the request has been made before and tries to find the previous result to return, in order to save the actual execution.

Databases often benefit from a uniform distribution of reads and writes across their partitions. Popular items can skew the distribution, causing bottlenecks. Putting a cache in front of a database can help absorb uneven loads and spikes in traffic.

Client caching

Caches can be located on the client side (OS or browser), on the server side, or in a distinct cache layer.

CDN caching

CDNs are considered a type of cache.

Web server caching

Reverse proxies and caches such as Varnish can serve static and dynamic content directly. Web servers can also cache requests, returning responses without having to contact application servers.

Database caching

Your database usually includes some level of caching in a default configuration, optimized for a generic use case. Tweaking these settings for specific usage patterns can further boost performance.

Application caching

In-memory caches such as Memcached and Redis are key-value stores between your application and your data storage. Since the data is held in RAM, it is much faster than typical databases, where data is stored on disk. RAM is more limited than disk, so cache invalidation algorithms such as least recently used (LRU) can help invalidate 'cold' entries and keep 'hot' data in RAM (see the LRU sketch below).

Redis has the following additional features:

- Persistence option
- Built-in data structures such as sorted sets and lists

There are multiple levels you can cache at, falling into two general categories: database queries and objects:

- Row level
- Query level
- Fully-formed serializable objects
- Fully-rendered HTML

Generally, you should try to avoid file-based caching, as it makes cloning and auto-scaling more difficult.

Caching at the database query level

Whenever you query the database, hash the query as a key and store the result in the cache (sketched below). This approach suffers from expiration issues:

- It is hard to delete a cached result for complex queries.
- If one piece of data changes, such as a table cell, you need to delete all cached queries that might include the changed cell.

Caching at the object level

See your data as an object, similar to what you do in your application code. Have your application assemble the dataset from the database into a class instance or a data structure (sketched below):

- Remove the object from the cache if its underlying data has changed.
- Allows for asynchronous processing: workers assemble objects by consuming the latest cached object.

Suggestions of what to cache:

- User sessions
- Fully rendered web pages
- Activity streams
- User graph data

Disadvantage(s): cache

- Need to maintain consistency between caches and the source of truth, such as the database, through cache invalidation.
- Cache invalidation is a difficult problem; there is additional complexity associated with when to update the cache.
- Need to make application changes, such as adding Redis or Memcached.
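
The LRU eviction mentioned under Application caching can be illustrated with a small in-process cache. This is a minimal sketch built on Python's `collections.OrderedDict`, not how Memcached or Redis manage memory internally; the tiny capacity and the `user:<id>` keys are only for demonstration.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: evicts the 'coldest' entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        # Mark the entry as recently used by moving it to the end.
        self._entries.move_to_end(key)
        return self._entries[key]

    def put(self, key, value):
        if key in self._entries:
            self._entries.move_to_end(key)
        self._entries[key] = value
        if len(self._entries) > self.capacity:
            # Evict the least recently used ('cold') entry.
            self._entries.popitem(last=False)

cache = LRUCache(capacity=2)
cache.put("user:1", {"name": "Alice"})
cache.put("user:2", {"name": "Bob"})
cache.get("user:1")                     # touch user:1 so user:2 becomes coldest
cache.put("user:3", {"name": "Carol"})  # evicts user:2
print(cache.get("user:2"))              # None
```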
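
Caching at the database query level can be sketched as follows: hash the SQL text plus its parameters to form the cache key, and serve repeat queries from the cache. The plain `dict` stands in for Memcached or Redis, and the in-memory `sqlite3` database with its `users` table exists only to keep the example self-contained; both are assumptions, not part of the original text.

```python
import hashlib
import json
import sqlite3

# A plain dict stands in for Memcached/Redis in this sketch.
query_cache = {}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO users VALUES (?, ?)", [(1, "Alice"), (2, "Bob")])

def cached_query(sql, params=()):
    # Hash the query text and parameters to form the cache key.
    key = hashlib.sha256(json.dumps([sql, list(params)]).encode()).hexdigest()
    if key in query_cache:
        return query_cache[key]          # cache hit: skip the database entirely
    rows = db.execute(sql, params).fetchall()
    query_cache[key] = rows              # cache miss: store the result
    return rows

print(cached_query("SELECT name FROM users WHERE id = ?", (1,)))  # hits the database
print(cached_query("SELECT name FROM users WHERE id = ?", (1,)))  # served from cache

# The expiration problem: after this update, every cached query that might
# include user 1 is stale, and there is no easy way to find those keys.
db.execute("UPDATE users SET name = 'Alicia' WHERE id = 1")
```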
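
Caching at the object level can be sketched by storing a fully assembled object under a stable key and deleting that key whenever the underlying data changes. The `User` dataclass, the `user:<id>` key scheme, the `load_user_from_db` helper, and the dict standing in for Redis/Memcached are all illustrative assumptions.

```python
import json
from dataclasses import dataclass, asdict

# A plain dict stands in for Redis/Memcached in this sketch.
object_cache = {}

@dataclass
class User:
    id: int
    name: str
    follower_count: int

def load_user_from_db(user_id):
    # Hypothetical database read; a real application would query its data store.
    return User(id=user_id, name="Alice", follower_count=42)

def get_user(user_id):
    key = f"user:{user_id}"
    cached = object_cache.get(key)
    if cached is not None:
        return User(**json.loads(cached))      # cache hit: rebuild the object
    user = load_user_from_db(user_id)          # cache miss: assemble from the database
    object_cache[key] = json.dumps(asdict(user))
    return user

def update_follower_count(user_id, new_count):
    # (database write omitted in this sketch)
    # Invalidate: the cached object no longer matches the source of truth.
    object_cache.pop(f"user:{user_id}", None)

print(get_user(1))            # assembled from the "database" and cached
print(get_user(1))            # served from cache
update_follower_count(1, 43)  # deletes the cached object
```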
