ElastiCache now supports Valkey, so I tried verifying its benefits.
This is Onkai Yuta (@fat47) from the Service Reliability Group (SRG) of the Media Headquarters.
SRG (Service Reliability Group) is a group that mainly provides cross-cutting support for the infrastructure of our media services, improves existing services, launches new ones, and contributes to OSS.
This article describes the benefits of Valkey, which is now supported by Amazon ElastiCache, in terms of performance and cost.
I hope this helps in some way.
Contents:
- Valkey is now supported by ElastiCache!
- What is Valkey?
- Performance comparison between Valkey and Redis (and r6g vs. r7g)
- Load-generating environment
- Load-target environment
- Comparison results
- Cost comparison between Valkey and Redis
- Summary
Valkey is now supported by ElastiCache!
On October 9, 2024 (JST), Amazon announced on their blog that Valkey is now supported in ElastiCache and MemoryDB.
What is Valkey?
Valkey is an OSS product that is a fork of Redis.
Due to changes to the Redis license, from Redis 7.4 onwards, a contract with Redis, Inc. is required to provide Redis as part of a hosting service.
Following the revision, public cloud providers such as AWS and Google Cloud have announced that they will offer Valkey.
Performance comparison between Valkey and Redis (and r6g vs. r7g)
Simply put, we compared the performance of ElastiCache Redis and Valkey using memtier_benchmark.
As a prerequisite, each ElastiCache cluster was created in a single AZ with cluster mode disabled.
Load-generating environment
AMI: amazon/al2023-ami-2023.6.20241010.0-kernel-6.1-x86_64
Instance size: m7a.xlarge
memtier_benchmark installed
memtier_benchmark execution command and options:
Number of clients: 50, threads: 4, number of requests: 100,000
The ratio is 50:50 for reads and writes.
The key-pattern is random, and the maximum key is limited to 5000 to reduce GET cache misses.
The total number of requests is 50 clients × 4 threads × 100,000 requests = 20 million requests.
This load command was run three times in each environment and the average values were recorded.
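As a sketch, the settings above translate into a memtier_benchmark invocation like the following. The endpoint is a placeholder, and the command is echoed rather than executed so it can be reviewed before pointing it at a real cluster:

```shell
# Placeholder endpoint: substitute your cluster's primary endpoint.
ENDPOINT="my-cache.example.apne1.cache.amazonaws.com"

# 50 clients x 4 threads x 100,000 requests = 20,000,000 total requests
TOTAL=$((50 * 4 * 100000))
echo "total requests: ${TOTAL}"

# Print the benchmark command (remove 'echo' to actually run it):
# 1:1 SET:GET ratio, random key pattern, key space capped at 5000.
echo memtier_benchmark \
  --server="${ENDPOINT}" --port=6379 \
  --clients=50 --threads=4 --requests=100000 \
  --ratio=1:1 --key-pattern=R:R --key-maximum=5000
```

The `--key-maximum=5000` flag is what keeps GET cache misses low: every random key falls inside a small, quickly warmed key space.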
Load-target environment
Environment A: redis-cache.r6g.large (engine version 7.1)
Environment B: valkey-cache.r6g.large (engine version 7.2)
Environment C: redis-cache.r7g.large (engine version 7.1)
Environment D: valkey-cache.r7g.large (engine version 7.2)
Comparison results
| Environment | Avg. execution time | Avg. throughput (ops/sec) | Avg. latency | Improvement vs. environment A |
| --- | --- | --- | --- | --- |
| Environment A: redis-cache.r6g.large | 49 s | 402,034 | 49.73 ms | 0% |
| Environment B: valkey-cache.r6g.large | 43 s | 456,866 | 43.78 ms | 12.00% |
| Environment C: redis-cache.r7g.large | 34 s | 583,910 | 34.35 ms | 31.15% |
| Environment D: valkey-cache.r7g.large | 31 s | 631,888 | 31.65 ms | 36.68% |
Here is a graph of the average throughput.

Comparing Redis and Valkey on r6g, the result was a 12% performance improvement.
Changing the instance type from r6g Redis to r7g Redis yielded a 31.1% performance improvement.
Furthermore, changing r6g Redis to r7g and then switching to Valkey brought a 36.68% performance improvement.
Average latency improved as well.

I'm also looking forward to the performance improvements of Valkey 8.0, which is scheduled to be released.
Cost comparison between Valkey and Redis
Valkey also appears to be cheaper than Redis: about 33% cheaper for serverless and 20% cheaper for node-based deployments.
Here is a quote from the official Amazon blog mentioned earlier:

> ElastiCache Serverless for Valkey is priced 33% lower than ElastiCache Serverless for Redis OSS, and node-based ElastiCache for Valkey is priced 20% lower than other node-based ElastiCache engines.
Valkey's fees can also be found on the price list.
The following is an excerpt from the price list (Tokyo region).
*RI rates are shown as effective hourly rates for a 1-year term paid entirely upfront.
| Instance type | On-demand hourly rate | Redis RI hourly rate* | Valkey RI hourly rate* |
| --- | --- | --- | --- |
cache.r6g.large | USD 0.2470 | USD 0.157 | USD 0.125 |
cache.r6g.xlarge | USD 0.4930 | USD 0.313 | USD 0.251 |
cache.r6g.2xlarge | USD 0.9850 | USD 0.627 | USD 0.502 |
cache.r7g.large | USD 0.2630 | USD 0.168 | USD 0.135 |
cache.r7g.xlarge | USD 0.5240 | USD 0.335 | USD 0.268 |
cache.r7g.2xlarge | USD 1.0470 | USD 0.670 | USD 0.536 |
cache.m6g.2xlarge | USD 0.7640 | USD 0.487 | USD 0.389 |
cache.m7g.2xlarge | USD 0.8100 | USD 0.518 | USD 0.415 |
First, compare the full-upfront Redis RI price for cache.r6g.large (USD 0.157) with the Valkey equivalent (USD 0.125): as the official blog states, Valkey is about 20% cheaper.
Next, let's compare cache.r6g.large and cache.r7g.large.
With all-upfront RIs, cache.r6g.large is USD 0.157 and cache.r7g.large is USD 0.168, so r7g is approximately 7% more expensive.
However, what if we use Valkey on cache.r7g.large?
The Valkey full-upfront RI hourly rate for cache.r7g.large is USD 0.135, so compared to Redis on r6g.large it is about 14% cheaper.
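As a quick sanity check, the percentages above can be recomputed from the full-upfront RI hourly rates in the table. The `pct_cheaper` helper is hypothetical (not from the original post); awk is used only for the floating-point math:

```shell
# Full-upfront 1-year RI hourly rates (Tokyo region) from the table above.
redis_r6g=0.157
valkey_r6g=0.125
redis_r7g=0.168
valkey_r7g=0.135

# Hypothetical helper: percentage saved moving from rate $1 to rate $2
# (negative output means the second rate is more expensive).
pct_cheaper() {
  awk -v a="$1" -v b="$2" 'BEGIN { printf "%.1f", (1 - b / a) * 100 }'
}

echo "Valkey r6g vs Redis r6g: $(pct_cheaper "$redis_r6g" "$valkey_r6g")%"   # 20.4
echo "Redis r7g vs Redis r6g:  $(pct_cheaper "$redis_r6g" "$redis_r7g")%"    # -7.0
echo "Valkey r7g vs Redis r6g: $(pct_cheaper "$redis_r6g" "$valkey_r7g")%"   # 14.0
```

The numbers line up with the text: roughly 20% cheaper on the same instance family, about 7% more expensive moving to r7g on Redis, and about 14% cheaper overall with Valkey on r7g.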
Additionally, size flexibility will be applied to ElastiCache Reserved Instances starting October 1, 2024.
As with RDS and EC2, even if you change instance sizes after purchasing an RI, the reservation will still be applied as long as you stay within the same instance family, which makes RIs easier to commit to at purchase time.
Summary
Load testing showed that switching to Valkey can yield a performance improvement.
On top of that, Valkey is also cheaper.
If you migrate ElastiCache Redis running on r6g.large to ElastiCache Valkey on r7g.large, you get approximately a 36% performance improvement along with approximately 14% lower cost.
Please note that Valkey is available from engine version 7.2 onwards, so you will need to upgrade your existing ElastiCache Redis engine to version 7 and confirm that it works properly first.
This time we compared Redis and Valkey assuming node-based RI purchases, but serverless is also 33% cheaper, so we would like to consider which is more optimal: RIs or serverless.
SRG is looking for people to work with us.
If you're interested, please contact us here.