Redis cache
I enjoy listening to podcasts, as they sometimes give me inspiration to create something. In one of the episodes of the Python Bytes podcast, the hosts mentioned the https://github.com/bwasti/cache.py tool. cache.py
allows you to cache function calls across runs by using a cache file. Simple and really useful.
This inspired me to write a similar thing, but for distributed apps, using Redis as the cache storage. I called it rcache
and you can find it on PyPI.
In order to use it, simply decorate a function like this:

import rcache

@rcache.rcache()
def expensive_func(arg, kwarg=None):
    # Expensive stuff here
    return arg
The default Redis address is http://localhost:6379, but you can change it by passing a URL into the decorator: @rcache.rcache(url="http://your_redis:6379").
I hope you find it useful, and if you wish to comment or report something, please go to https://gitlab.com/the_speedball/redis.cache.py/issues.
Have fun!