How to investigate memory usage on ElastiCache for Redis
Hideaki Ishii
Posted on June 15, 2019
Recently, I faced a problem where memory usage on our ElastiCache for Redis cluster became huge.
This post describes how to investigate such a problem, i.e. which keys are the bottlenecks on ElastiCache.
Export and download a Redis backup
First, to analyze the data, we have to download the .rdb file from our ElastiCache cluster by following the official guide.
When exporting an ElastiCache backup, be careful that the S3 bucket must be in the same region as the backup.
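For reference, here is a minimal sketch of the export and download done from the command line with the AWS CLI. The snapshot name, target name, and bucket are placeholders, and the exact name of the exported object may differ (for example, a per-node suffix such as -0001):
> aws elasticache copy-snapshot --source-snapshot-name my-backup --target-snapshot-name my-backup-export --target-bucket my-redis-exports
> aws s3 cp s3://my-redis-exports/my-backup-export-0001.rdb ./dump.rdb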
Analyze .rdb with redis-rdb-tools
After downloading the file, we can analyze it with redis-rdb-tools.
For example, to generate a memory report, we can run:
> rdb -c memory /var/redis/6379/dump.rdb --bytes 128 -f memory.csv
> cat memory.csv
database,type,key,size_in_bytes,encoding,num_elements,len_largest_element
0,list,lizards,241,quicklist,5,19
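Once we have the CSV, a quick way to spot the biggest keys is to sort it by the size_in_bytes column (the fourth field). This is just a plain shell sketch on top of the report above:
> tail -n +2 memory.csv | sort -t, -k4 -nr | head -n 20
Here tail -n +2 skips the header row, and head limits the output to the 20 largest keys.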
If your .rdb is too large (the resulting CSV may be difficult to open), you could also generate a sampled report like:
> rdb -c memory /var/redis/6379/dump.rdb | ruby -ne 'print $_ if rand < 0.1' > memory.csv
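If ruby is not available on the machine, the same random sampling can be done with awk (an equivalent sketch that keeps roughly 10% of the lines):
> rdb -c memory /var/redis/6379/dump.rdb | awk 'BEGIN{srand()} rand() < 0.1' > memory.csv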
Of course, redis-rdb-tools supports features other than generating a memory report.
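For instance, the same -c/--command option can emit the whole dump as JSON, which is handy for ad-hoc inspection (a sketch assuming the same dump path as above):
> rdb -c json /var/redis/6379/dump.rdb > dump.json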
For more information, please see the README.
Summary
- We can export and download a backup file easily with ElastiCache and S3.
- We can analyze a .rdb file easily with a tool like redis-rdb-tools.