Integrating Amazon ElastiCache into Your Database Stack - AWS May 2016 Webinar Series
TRANSCRIPT
© 2016, Amazon Web Services, Inc. or its Affiliates. All rights reserved.
Dan Zamansky, Senior Product Manager
May 26, 2016
Learning Objectives
• Understand key Amazon ElastiCache features
• Learn how to set up Amazon ElastiCache and how to integrate it into your database stack
• Explore sample use cases, best practices, and tips on using Amazon ElastiCache to speed up your application by reducing database latencies and increasing throughput
Modern Applications Require Real-Time Performance
We live in a connected world (IoT)
• Phones, tablets, cars, air conditioners, toasters
• Experiences are increasingly contextual
• Apps need to provide real-time responses
• Database performance is frequently the bottleneck
• Load is spiky and unpredictable
Businesses Need To Be Ready For Success!
What is ElastiCache?
[Diagram: the AWS platform stack - Compute, Storage, Database, Application Services, Deployment & Administration, and Networking atop the AWS Global Infrastructure; Amazon ElastiCache sits in the Database layer alongside Amazon RDS, Amazon DynamoDB, and Amazon Redshift]
Performance-optimized in-memory key-value store in the cloud
Fully managed service = automated operations, versus an in-memory datastore self-hosted on Amazon EC2
Amazon ElastiCache
• In-memory key-value store
• High performance
• Redis and Memcached
• Fully managed; zero admin
• Highly available and reliable
• Hardened by Amazon
[Chart: AWS storage and database services (Amazon ElastiCache, Amazon RDS, Amazon DynamoDB, Amazon CloudSearch, HDFS, Amazon S3, Amazon Glacier) positioned along axes of request rate (high → low), latency (low → high), data volume (low → high), and structure (low → high); ElastiCache sits at the high-request-rate, low-latency end]
ElastiCache - Customer Value
• Extreme Performance - Sub-millisecond access latencies; engineered for cloud scale
• Open Source Compatible - Compatible with Redis and Memcached; existing code will work when you update node endpoints
• Fully Managed - Automates tasks such as failed node replacement, software patching, upgrades, and backups; CloudWatch enables you to monitor cache performance metrics
• Secure and Hardened - Supports Amazon VPC and IAM for secure, fine-grained access; monitors your nodes and applies security patches when necessary
• Highly Available and Scalable - Multi-AZ with automatic failover to a read replica, no human intervention required; easily scale your Redis (vertically) and Memcached (horizontally) environments
• Cost Effective - Pay as little as 17 cents per hour; get started with 750 free hours per month of a micro node for a year; no cross-availability-zone data transfer costs
Memcached
• In-memory key-value datastore, very popular as a caching solution
• Supports strings, objects
• Slab allocator
• Multi-threaded
• Insanely fast!
• Very established
• No persistence
• Patterns for sharding
Redis – the in-memory king
• In-memory data structure server
• Powerful: ~200 commands
• Utility data structures: strings, lists, hashes, sets, sorted sets, bitmaps & HyperLogLogs
• Simple
• Atomic operations: supports transactions, has ACID properties
• Ridiculously fast! <1 ms latency for most commands
• Highly available
• Persistent: snapshots or append-only log
• Open source
Why Memcached and Redis?
Source: http://db-engines.com/en/ranking/key-value+store
Deploying ElastiCache for Redis
ElastiCache with Redis - Development
[Diagram: a Region containing Availability Zones A and B, an Auto Scaling group of app instances, and an ElastiCache cluster; in the Multi-AZ layout the primary node sits in one Availability Zone with read replicas across both]
ElastiCache for Redis Multi-AZ
• Writes: use the primary endpoint
• Reads: use read replicas
• Auto-failover chooses the replica with the lowest replication lag; the DNS endpoint stays the same
• Automatic failover to a read replica in case of primary node failure
• ElastiCache automates snapshots for persistence
ElastiCache with Redis Multi-AZ
[Diagram: a Region with Availability Zones A and B, an Auto Scaling group of app instances, and an ElastiCache cluster failing over across zones]
Redis – Read/Write Connections
Create Snapshot
Restore Snapshot
Key Use Patterns
Use Case #1 - Caching
[Diagram: clients → Elastic Load Balancing → EC2 app instances; the app reads from ElastiCache and from the RDS MySQL DB instance, database writes go to RDS, and cache updates keep ElastiCache current]
Caching Will Make Your Database More Efficient
• Eliminate database hotspots
• Predictable performance
• Cut load on the backend
• Increase read throughput
• Reduce app latency
• Reduce database cost
Common Caching Technique #1 – Lazy Caching:
Adding Caching to Your Application is Easy
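The lazy-caching (cache-aside) pattern can be sketched in Python. This is a hypothetical illustration: query_database and the FakeCache stub stand in for your RDS query and a real redis-py client (redis.Redis pointed at your cluster endpoint), which exposes the same get/setex calls.

```python
import json

class FakeCache:
    """In-memory stand-in for a Redis client; a real app would use
    redis.Redis(host="<your-elasticache-endpoint>")."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def setex(self, key, ttl, value):
        self.store[key] = value  # TTL bookkeeping omitted in this stub

cache = FakeCache()
db_queries = []  # track database hits, to show the cache absorbing reads

def query_database(user_id):
    # stand-in for a SELECT against the RDS MySQL instance
    db_queries.append(user_id)
    return {"id": user_id, "name": "user-%d" % user_id}

def get_user(user_id, ttl=300):
    """Lazy caching: check the cache first; on a miss, read the
    database and populate the cache so later reads are served in-memory."""
    key = "user:%d" % user_id
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)              # cache hit
    record = query_database(user_id)           # cache miss
    cache.setex(key, ttl, json.dumps(record))  # populate for next time
    return record
```

Only cold reads reach the database; repeated get_user calls for the same id are served from the cache.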
Common Caching Technique #2 – Write-back caching:
Adding Caching to Your Application is Easy
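A minimal sketch of the write-back idea in Python, under the same caveats: the dicts stand in for the RDS table and the Redis client, and the user:<id> key scheme is illustrative.

```python
import json

cache = {}    # stand-in for Redis (key -> serialized value)
fake_db = {}  # stand-in for the RDS MySQL table

def save_user(user_id, record):
    """Write-back caching: every database write immediately refreshes
    the corresponding cache entry, so reads never see stale data."""
    fake_db[user_id] = record                        # 1. write the database
    cache["user:%d" % user_id] = json.dumps(record)  # 2. update the cache

def get_user(user_id):
    """Reads can trust the cache for any key written this way."""
    cached = cache.get("user:%d" % user_id)
    return json.loads(cached) if cached is not None else fake_db.get(user_id)
```

The trade-off versus lazy caching: writes do extra work up front, but read-after-write is always consistent and warm.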
Use Case 2 - Real-Time Leaderboard
• Very popular for gaming apps
• Need uniqueness + ordering
• Easy with Redis Sorted Sets

ZADD "leaderboard" 1201 "Gollum"
ZADD "leaderboard" 963 "Sauron"
ZADD "leaderboard" 1092 "Bilbo"
ZADD "leaderboard" 1383 "Frodo"

ZREVRANGE "leaderboard" 0 -1
1) "Frodo"
2) "Gollum"
3) "Bilbo"
4) "Sauron"

ZREVRANK "leaderboard" "Sauron"
(integer) 3
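The ZADD/ZREVRANGE/ZREVRANK calls above can be replayed through a tiny pure-Python sorted-set stand-in; with redis-py you would call the client's zadd, zrevrange, and zrevrank methods against the cluster endpoint instead.

```python
scores = {}  # member -> score; stand-in for one Redis sorted set

def zadd(board, score, member):
    scores[member] = score  # ZADD: insert or update the member's score

def zrevrange(board, start, stop):
    # ZREVRANGE: members ordered by score, highest first; -1 means "to the end"
    ordered = sorted(scores, key=scores.get, reverse=True)
    end = len(ordered) if stop == -1 else stop + 1
    return ordered[start:end]

def zrevrank(board, member):
    # ZREVRANK: 0-based rank in descending score order
    return zrevrange(board, 0, -1).index(member)

zadd("leaderboard", 1201, "Gollum")
zadd("leaderboard", 963, "Sauron")
zadd("leaderboard", 1092, "Bilbo")
zadd("leaderboard", 1383, "Frodo")
```

Real Redis keeps the set ordered on insert (a skiplist), so rank queries stay fast even with millions of players.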
Use Case 3 - Chat and Messaging
• PUBLISH and SUBSCRIBE Redis commands
• Game or mobile chat, real-time comment streams
• Server intercommunication

SUBSCRIBE chat_channel:114
PUBLISH chat_channel:114 "Hello all"
>> ["message", "chat_channel:114", "Hello all"]
UNSUBSCRIBE chat_channel:114
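A sketch of the same publish/subscribe flow in Python, with an in-process dispatcher standing in for Redis Pub/Sub (redis-py exposes this over the wire via a pubsub() object plus the client's publish method).

```python
from collections import defaultdict

channels = defaultdict(list)  # channel -> subscriber callbacks

def subscribe(channel, callback):
    # SUBSCRIBE: register a listener on the channel
    channels[channel].append(callback)

def publish(channel, message):
    # PUBLISH: fan the message out; Redis returns the receiver count
    for cb in channels[channel]:
        cb(["message", channel, message])
    return len(channels[channel])

inbox = []
subscribe("chat_channel:114", inbox.append)
```

Note that Redis Pub/Sub is fire-and-forget: a subscriber that is offline when a message is published never sees it.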
Use Case 4 - Ratings
• Popular for recommendation engines and message board ranking
• Redis counters - increment likes/dislikes
• Redis hashes - list of everyone's ratings
• Process with an algorithm like Slope One or Jaccard similarity
• Ruby example - https://github.com/davidcelis/recommendable

INCR item:38927:likes
HSET item:38927:ratings "Susan" 1
INCR item:38927:dislikes
HSET item:38927:ratings "Tommy" -1
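The INCR/HSET calls above, replayed through dict-based stand-ins for Redis counters and hashes (a real client would issue the same commands against the cluster).

```python
counters = {}  # stand-in for Redis string counters (INCR)
hashes = {}    # stand-in for Redis hashes (HSET)

def incr(key):
    # INCR: atomic increment in Redis; plain addition in this stub
    counters[key] = counters.get(key, 0) + 1
    return counters[key]

def hset(key, field, value):
    # HSET: set one field in a hash
    hashes.setdefault(key, {})[field] = value

incr("item:38927:likes")
hset("item:38927:ratings", "Susan", 1)
incr("item:38927:dislikes")
hset("item:38927:ratings", "Tommy", -1)
```

The counters give cheap aggregate scores, while the per-user hash feeds a similarity algorithm offline.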
Use Case 5 - Rate Limiting
• Example: throttling requests to an externally facing API (behind an ELB)
• Leverages Redis counters
Reference: http://redis.io/commands/INCR

FUNCTION LIMIT_API_CALL(APIaccesskey)
    limit = HGET(APIaccesskey, "limit")
    time = CURRENT_UNIX_TIME()
    keyname = APIaccesskey + ":" + time
    count = GET(keyname)
    IF count != NULL && count > limit THEN
        ERROR "API request limit exceeded"
    ELSE
        MULTI
            INCR(keyname)
            EXPIRE(keyname, 10)
        EXEC
        PERFORM_API_CALL()
    END
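A runnable Python sketch of the same fixed-window pattern: the dict-based store and the per-window bucket key are stand-ins for Redis GET/INCR/EXPIRE, which make these steps atomic via MULTI/EXEC on a real cluster.

```python
import time

store = {}  # key -> (count, expires_at); stand-in for Redis with TTLs

def _get(key):
    entry = store.get(key)
    if entry is None or entry[1] < time.time():
        return None  # missing or expired, as in Redis after EXPIRE fires
    return entry[0]

def limit_api_call(api_key, limit=10, window=10):
    """Allow at most `limit` calls per `window` seconds per key:
    one counter per (key, window) bucket, expiring with the window."""
    bucket = "%s:%d" % (api_key, int(time.time() // window))
    count = _get(bucket)
    if count is not None and count >= limit:
        return False                                  # over the limit
    store[bucket] = ((count or 0) + 1, time.time() + window)
    return True                                       # perform the API call
```

First `limit` calls in a window succeed; the next call in the same window is rejected until the bucket expires.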
Tune It Up!
Monitoring with CloudWatch
• Amazon CloudWatch metrics and alarms
• Easily accessible via the ElastiCache console
• Track the health of your nodes
• Set alarms so you can act before running into limits
Key ElastiCache CloudWatch Metrics
• CPUUtilization
  - Memcached: up to 90% is OK
  - Redis: divide by the number of cores (ex: 90% / 4 = 22.5%)
• SwapUsage: low
• Evictions: low
• CacheMisses / CacheHits ratio: low
• CurrConnections: stable
Common Issues
Thundering Herd
Large number of cache misses → spike in database load
Causes:
• Cold cache - app startup
• Adding/removing nodes
• Cache key expiration (TTL)
• Out of cache memory
Mitigations:
• Script to populate the cache
• Gradually scale nodes
• Randomize TTL values
• Monitor cache evictions
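Randomizing TTL values is simple to apply at write time; a sketch, where the ±20% spread is an arbitrary illustrative choice:

```python
import random

def jittered_ttl(base=300, spread=0.2):
    """Spread expirations +/-20% around the base TTL so keys written
    together do not all expire together and stampede the database."""
    return int(base * random.uniform(1 - spread, 1 + spread))

# e.g. cache.setex(key, jittered_ttl(), value) instead of a fixed TTL
```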
Swapping During Redis Backup (BGSAVE)
1. Forks the main Redis process
2. Writes the backup to disk from the child process
3. Continues to accept traffic on the main process
4. Any key update causes a copy-on-write
5. Potentially DOUBLES the memory used by Redis
Swapping During Redis Backup - Mitigations
• Reduce memory allocated to Redis
  - Set the reserved-memory field in parameter groups
  - Decreases node memory available to the app
• Use a larger cache node type
  - More expensive
  - But no data eviction
• Write-heavy apps need extra memory: http://bit.ly/elasticache-bgsave

Redis reserved-memory Parameter
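One way to set the parameter is through the AWS CLI; the parameter group name and the 256 MB value here are illustrative (reserved-memory is specified in bytes):

```shell
# Reserve headroom for BGSAVE copy-on-write in a custom parameter group
aws elasticache modify-cache-parameter-group \
    --cache-parameter-group-name my-redis-params \
    --parameter-name-values \
        "ParameterName=reserved-memory,ParameterValue=268435456"
```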
Recap
• An in-memory data store can help address the most demanding application requirements, and protect against the impact of spikes in app access.
• ElastiCache is a reliable, secure, fully managed, performance-optimized in-memory data structure service.
• Use Multi-AZ with automatic failover for production workloads.
• Redis provides powerful capabilities for a variety of in-memory use cases.
• For much more detail on setting up and using ElastiCache, see our whitepaper (search for "ElastiCache whitepaper").
You can get started for free
• Free tier: 750 hours of a micro cache node per month
• Pay as little as $0.017 per hour thereafter for a t2.micro node
Thank you!