ElastiCache vs DynamoDB DAX

I have a use case where I write data in DynamoDB to two tables, say t1 and t2, in a transaction. My app needs to read data from these tables a lot (1 write, at least 4 reads). I am considering DAX vs ElastiCache. Does anyone have any suggestions? Thanks in advance, K



Solution 1:[1]

ElastiCache is not intended for use with DynamoDB.

DAX is good for read-heavy apps, like yours. But be aware that DAX only serves eventually consistent reads, so don't use it for banking apps, etc., where the information always needs to be perfectly up to date. Without further info it's hard to say more; these are just two general points to consider.

Amazon DynamoDB Accelerator (DAX) is a fully managed, highly available, in-memory cache that can reduce Amazon DynamoDB response times from milliseconds to microseconds, even at millions of requests per second. While DynamoDB offers consistent single-digit millisecond latency, DynamoDB with DAX takes performance to the next level with response times in microseconds for millions of requests per second for read-heavy workloads. With DAX, your applications remain fast and responsive, even when a popular event or news story drives unprecedented request volumes your way. No tuning required. https://aws.amazon.com/dynamodb/dax/

Solution 2:[2]

AWS recommends DAX as the solution for this requirement. ElastiCache is an older, general-purpose option, typically used to store session state in addition to cache data.

DAX is widely used for read-intensive, eventually consistent workloads and for latency-sensitive applications. DAX maintains two caches:

  1. Item cache - populated with items based on GetItem results.
  2. Query cache - keyed on the parameters used in Query or Scan requests.
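The two caches can be sketched as independent dictionaries. This is a toy model, not the real DAX implementation, and all names in it are made up purely to illustrate how each cache is keyed:

```python
# Toy model of DAX's two independent caches (illustrative only, not
# the real implementation; ToyDaxCache and its methods are hypothetical).

class ToyDaxCache:
    def __init__(self):
        # Item cache: keyed by (table, primary key) -> item
        self.item_cache = {}
        # Query cache: keyed by the full query parameters -> result list
        self.query_cache = {}

    def get_item(self, table, key, backend):
        ck = (table, key)
        if ck not in self.item_cache:          # cache miss: read through
            self.item_cache[ck] = backend[table][key]
        return self.item_cache[ck]

    def query(self, table, status, backend):
        ck = (table, status)                   # keyed on the query params
        if ck not in self.query_cache:
            self.query_cache[ck] = [v for v in backend[table].values()
                                    if v.get("status") == status]
        return self.query_cache[ck]
```

Note that `get_item` and `query` never touch each other's dictionaries, which is the property the next answers build on.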

Cheers!

Solution 3:[3]

I'd recommend using DAX with DynamoDB provided most of your reads go through item-level APIs (such as GetItem), and NOT query-level APIs.

Why? DAX has one odd behavior. From AWS:

"Every write to DAX alters the state of the item cache. However, writes to the item cache don't affect the query cache. (The DAX item cache and query cache serve different purposes, and operate independently from one another.)"

To elaborate: if a query result is cached, and you then perform a write that affects that previously cached result, then until the cache entry expires, the cached query result will be outdated.

This out-of-sync issue is also discussed here.
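That staleness can be shown with a minimal sketch. Again this is a toy cache with made-up names, not real DAX, but the write path mirrors DAX's documented behavior: a write refreshes the item cache and leaves the query cache alone:

```python
# Toy demo: a write updates the item cache but not the query cache,
# so a previously cached query result goes stale. Illustrative only.

item_cache = {}    # key -> item
query_cache = {}   # query parameter -> cached result list
table = {"1": {"id": "1", "price": 10}}   # the "real" backing table

def get_item(key):
    if key not in item_cache:
        item_cache[key] = dict(table[key])
    return item_cache[key]

def query_cheap(max_price):
    if max_price not in query_cache:
        query_cache[max_price] = [dict(v) for v in table.values()
                                  if v["price"] <= max_price]
    return query_cache[max_price]

def put_item(item):
    table[item["id"]] = dict(item)
    item_cache[item["id"]] = dict(item)   # write-through: item cache is fresh
    # query_cache is deliberately left untouched, as in DAX

query_cheap(20)                    # caches a result containing price 10
put_item({"id": "1", "price": 50})
print(get_item("1")["price"])      # 50 -> item cache reflects the write
print(query_cheap(20))             # still shows price 10 -> stale query cache
```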

Solution 4:[4]

I find DAX useful only for cached queries, PutItem, and GetItem. In general, it is very difficult to find a use case for it.

DAX keeps query/scan results separate from CRUD on individual items. That means if you update an item and then run a query/scan, the results will not reflect the change.

You can't invalidate the cache explicitly; entries are only evicted when the TTL is reached, or when a node's memory is full and old items are dropped.
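That TTL-only eviction can be sketched like this. It's toy code, not the real DAX engine (the `TtlCache` class here is made up; in DAX the TTL is set on the cluster's parameter group, with a 5-minute default):

```python
import time

# Toy TTL cache: entries cannot be invalidated explicitly; they are
# only dropped once the TTL elapses. Illustrative only.

class TtlCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}                  # key -> (value, stored_at)

    def put(self, key, value):
        self.store[key] = (value, time.monotonic())

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            del self.store[key]          # expired: drop and report a miss
            return None
        return value

cache = TtlCache(ttl_seconds=0.05)
cache.put("k", "v")
assert cache.get("k") == "v"             # fresh entry is served
time.sleep(0.06)
assert cache.get("k") is None            # entry expired on its own
```

There is intentionally no `delete` or `invalidate` method: the only way an entry leaves the cache is by expiring, which mirrors the complaint above.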

Takeaways:

  1. Doing puts/updates and then queries - two separate caches, so they go out of sync.
  2. Looking up a single item - you are limited to the primary key, the default index, and a GetItem request (no Query with Limit 1). You can't use any secondary indexes for gets/updates/deletes.
  3. Using the ConsistentRead option on a Query to get the latest data works, but only for the primary index.
  4. Writing through DAX is slower than writing directly to DynamoDB, since there is an extra hop in the middle.
  5. X-Ray does not work with DAX.

Use Cases

  1. You have queries where you don't really care that the results may be out of date
  2. You do few PutItem/UpdateItem calls and a lot of GetItem calls

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1
Solution 2 Phani Teja
Solution 3 Tejaskumar
Solution 4 Townsheriff