Module 12: Caching Content
Static and frequently accessed data
Results of computationally intensive calculations
Results of time-consuming, frequently used, or complex database queries
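As a small in-process illustration of the last two items, the sketch below (hypothetical function and names) uses Python's functools.lru_cache to memoize the result of an expensive calculation so that repeated requests are served from memory:

```python
from functools import lru_cache

# Hypothetical example: cache the result of a computationally intensive
# calculation (or an expensive query's aggregate) keyed by its inputs.
@lru_cache(maxsize=1024)
def monthly_report_total(year: int, month: int) -> int:
    # Stand-in for an expensive aggregation or complex database query.
    return sum(i * i for i in range(1_000_000))

monthly_report_total(2024, 1)  # computed once
monthly_report_total(2024, 1)  # later calls hit the in-process cache
```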
A content delivery network (CDN) is a globally distributed system of caching servers
Has intermediary servers between the client and the application
Caches copies of commonly requested files (static content)
Delivers a local copy of the requested content from a nearby cache edge or point of presence
Amazon CloudFront is a CDN service that delivers content across the globe securely with low latency and high transfer speeds
Provides high-speed content distribution by delivering through edge locations
Improves application resiliency against distributed denial of service (DDoS) attacks by leveraging services such as AWS Shield and AWS WAF
Edge locations: a lot of them, and closer to users; smaller caches; help serve popular content; used when you want CloudFront to distribute your content
Regional edge caches: fewer of them, and farther away from users; bigger caches; help with less popular content
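To make the CloudFront pieces above concrete, here is a minimal sketch using boto3, the AWS SDK for Python. The origin domain and IDs are placeholders, and only the minimum required fields of the distribution config are shown:

```python
import time
import boto3  # AWS SDK for Python; requires AWS credentials to run

cloudfront = boto3.client("cloudfront")

# Hypothetical S3 origin; replace with your own bucket or custom origin.
origin_domain = "my-bucket.s3.amazonaws.com"
origin_id = "my-s3-origin"

response = cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),  # must be unique per request
        "Comment": "Demo distribution",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": origin_id,
                    "DomainName": origin_domain,
                    "S3OriginConfig": {"OriginAccessIdentity": ""},
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": origin_id,
            "ViewerProtocolPolicy": "redirect-to-https",
            # Legacy cache settings kept minimal for this sketch.
            "ForwardedValues": {
                "QueryString": False,
                "Cookies": {"Forward": "none"},
            },
            "MinTTL": 0,
            "TrustedSigners": {"Enabled": False, "Quantity": 0},
        },
    }
)
print(response["Distribution"]["DomainName"])  # e.g. dxxxxxxxx.cloudfront.net
```

Once the distribution is deployed, clients request content from its *.cloudfront.net domain name and are served from the nearest edge location.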
Amazon ElastiCache is a fully managed, key-value, in-memory data store with sub-millisecond latency
Sits between an application and an origin data store
Decreases access latency and eases the load on databases and applications
Provides a high performance and cost-effective in-memory cache
Is fully compatible with open source Redis and Memcached engines
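Because ElastiCache is compatible with the open source Redis engine, a standard Redis client can talk to it. The sketch below assumes a hypothetical cluster endpoint; the same code works against any Redis-compatible endpoint:

```python
import redis  # open source Redis client (redis-py)

# Hypothetical ElastiCache for Redis endpoint; replace with your cluster's
# primary endpoint.
cache = redis.Redis(
    host="my-cache.abc123.use1.cache.amazonaws.com",
    port=6379,
    decode_responses=True,
)

cache.set("session:42", "alice", ex=300)  # store with a 5-minute TTL
print(cache.get("session:42"))            # fast in-memory read
```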
A time to live (TTL) must be set for the cached data
Lazy loading updates the cache only after the data is requested. Its advantage is that the cache contains only data that the application actually requests; its drawback is that the cache is not updated until it gets a READ request, so keeping the cached data up to date requires a programmatic strategy
Write-through updates the cache immediately after updating the primary database. Its advantage is that the cache stays up to date with the primary database and is updated in real time, so the data will most likely be found in the cache; its drawback is the increased cost of storing data in memory that you might never use (a minimal code sketch of both strategies follows below)
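The sketch below illustrates both strategies against a Redis-compatible cache using the redis-py client; the database helpers and endpoint are placeholders:

```python
import json
import redis

# Placeholder endpoint; an ElastiCache for Redis endpoint works the same way.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
TTL_SECONDS = 300  # time to live for every cached entry


def query_database(key):
    """Stand-in for a slow or expensive database read."""
    return {"id": key, "value": f"row-{key}"}


def update_database(key, item):
    """Stand-in for a database write."""
    pass


# Lazy loading: the cache is populated only when a read misses.
def get_item(key):
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                     # cache hit
    item = query_database(key)                        # cache miss: read the database
    cache.set(key, json.dumps(item), ex=TTL_SECONDS)  # cache it for next time
    return item


# Write-through: the cache is updated immediately after the database write.
def put_item(key, item):
    update_database(key, item)
    cache.set(key, json.dumps(item), ex=TTL_SECONDS)
```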