Is there a Python caching library?

I'm looking for a Python caching library but can't find anything so far. I need a simple dict-like interface where I can set keys and their expiration and get them back cached. Sort of something like:

cache.get(myfunction, duration=300)

which will give me the item from the cache if it exists, or call the function and store the result if it doesn't exist or has expired. Does anyone know of something like this?



Solution 1:[1]

Take a look at Beaker.
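
Roughly, using Beaker's cache decorator looks like the sketch below. This is only illustrative and assumes the in-memory backend; the namespace, function name, and expiry value are made up:

from beaker.cache import CacheManager
from beaker.util import parse_cache_config_options

# Configure an in-memory cache; Beaker also supports file, memcached, etc.
cache_opts = {'cache.type': 'memory'}
cache = CacheManager(**parse_cache_config_options(cache_opts))

@cache.cache('my_namespace', expire=300)  # entries expire after 300 seconds
def get_results(search_param):
    # ... expensive computation ...
    return search_param * 2

print(get_results(21))  # computed and stored
print(get_results(21))  # served from the cache until it expires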

Solution 2:[2]

From Python 3.2 you can use the decorator @lru_cache from the functools library. It's a Least Recently Used cache, so there is no expiration time for the items in it, but as a fast hack it's very useful.

from functools import lru_cache

@lru_cache(maxsize=256)
def f(x):
    return x * x

for x in range(20):
    print(f(x))
for x in range(20):
    print(f(x))

Solution 3:[3]

You might also take a look at the Memoize decorator. You could probably get it to do what you want without too much modification.
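
A bare-bones memoize decorator with expiry might look something like the following. This is just an illustrative sketch of the pattern, not the linked recipe itself; the names and default duration are made up:

import functools
import time

def memoize(duration=300):
    """Cache a function's results, recomputing after `duration` seconds."""
    def decorator(fn):
        cache = {}

        @functools.wraps(fn)
        def wrapper(*args):
            now = time.time()
            if args in cache:
                value, stored = cache[args]
                if now - stored < duration:
                    return value
            value = fn(*args)
            cache[args] = (value, now)
            return value
        return wrapper
    return decorator

@memoize(duration=300)
def expensive(x):
    return x * x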

Solution 4:[4]

Joblib https://joblib.readthedocs.io supports caching functions in the Memoize pattern. Mostly, the idea is to cache computationally expensive functions.

>>> from joblib import Memory
>>> mem = Memory('/tmp/joblib')
>>> import numpy as np
>>> square = mem.cache(np.square)
>>>
>>> a = np.vander(np.arange(3)).astype(float)
>>> b = square(a)
________________________________________________________________________________
[Memory] Calling square...
square(array([[ 0.,  0.,  1.],
       [ 1.,  1.,  1.],
       [ 4.,  2.,  1.]]))
___________________________________________________________square - 0...s, 0.0min

>>> c = square(a)

You can also do fancy things like using the @memory.cache decorator on functions. The documentation is here: https://joblib.readthedocs.io/en/latest/generated/joblib.Memory.html
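
For example, the decorator form looks roughly like this (the cache directory and function are just examples):

from joblib import Memory

memory = Memory('/tmp/joblib', verbose=0)

@memory.cache
def expensive(x):
    return x ** 2

print(expensive(4))  # computed and written to the on-disk cache
print(expensive(4))  # loaded from the cache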

Solution 5:[5]

No one has mentioned shelve yet. https://docs.python.org/2/library/shelve.html

It isn't memcached, but it looks much simpler and might fit your needs.
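
Roughly, since shelve has no built-in expiry, you would store timestamps alongside the values yourself; a small sketch (file path and duration are just examples):

import shelve
import time

db = shelve.open('/tmp/mycache')  # persistent dict-like store on disk
key = 'myfunction'
entry = db.get(key)
if entry is None or time.time() - entry['stored'] > 300:
    value = 42  # call the real expensive function here instead
    db[key] = {'value': value, 'stored': time.time()}
else:
    value = entry['value']
print(value)
db.close()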

Solution 6:[6]

I think the Python memcached API is the prevalent tool, but I haven't used it myself and am not sure whether it supports the features you need.
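
If you go that route, a minimal sketch with the python-memcached client might look like this (assuming a memcached server running on the default port; the helper is made up for illustration):

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

def get_cached(key, fn, duration=300):
    value = mc.get(key)
    if value is None:
        value = fn()
        mc.set(key, value, time=duration)  # expire after `duration` seconds
    return value

print(get_cached('answer', lambda: 42))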

Solution 7:[7]

import time

class CachedItem(object):
    def __init__(self, key, value, duration=60):
        self.key = key
        self.value = value
        self.duration = duration
        self.timeStamp = time.time()

    def __repr__(self):
        return '<CachedItem {%s:%s} expires at: %s>' % (
            self.key, self.value, self.timeStamp + self.duration)

class CachedDict(dict):

    def get(self, key, fn, duration):
        # Recompute the value if the key is missing or its entry has expired.
        if (key not in self
                or self[key].timeStamp + self[key].duration < time.time()):
            print('adding new value')
            o = fn(key)
            self[key] = CachedItem(key, o, duration)
        else:
            print('loading from cache')

        return self[key].value



if __name__ == '__main__':

    fn = lambda key: 'value of %s is None' % key

    ci = CachedItem('a', 12)
    print(ci)
    cd = CachedDict()
    print(cd.get('a', fn, 5))
    time.sleep(2)
    print(cd.get('a', fn, 6))
    print(cd.get('b', fn, 6))
    time.sleep(2)
    print(cd.get('a', fn, 7))
    print(cd.get('b', fn, 7))

Solution 8:[8]

You can use my simple solution to the problem. It is really straightforward, nothing fancy:

class MemCache(dict):
    def __init__(self, fn):
        dict.__init__(self)
        self.__fn = fn

    def __getitem__(self, item):
        # Compute and store the value on first lookup; reuse it afterwards.
        if item not in self:
            dict.__setitem__(self, item, self.__fn(item))
        return dict.__getitem__(self, item)

mc = MemCache(lambda x: x * x)

for x in range(10):
    print(mc[x])

for x in range(10):
    print(mc[x])

It does lack expiration functionality, but you can easily extend it by specifying a particular rule in the MemCache constructor (a sketch of one such extension follows below).

I hope the code is self-explanatory, but just in case: the cache is passed a translation function as one of its constructor parameters, which is used to generate the cached output for a given input.
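
For illustration, an expiring variant along those lines could look like this sketch (my own addition, names are made up):

import time

class ExpiringMemCache(dict):
    def __init__(self, fn, duration=300):
        dict.__init__(self)
        self.__fn = fn
        self.__duration = duration

    def __getitem__(self, item):
        # Reuse the stored value only while it is younger than `duration`.
        if item in self:
            value, stored = dict.__getitem__(self, item)
            if time.time() - stored < self.__duration:
                return value
        value = self.__fn(item)
        dict.__setitem__(self, item, (value, time.time()))
        return value

mc = ExpiringMemCache(lambda x: x * x, duration=5)
print(mc[3])  # computed
print(mc[3])  # cached until 5 seconds have passed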

Hope it helps

Solution 9:[9]

Try Redis; it is one of the cleanest and easiest solutions for applications to share data atomically, or if you have some web server platform. It's very easy to set up, and you will need the Python Redis client: http://pypi.python.org/pypi/redis
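
A minimal sketch with the redis client, assuming a local Redis server on the default port (the helper is just illustrative; values come back as bytes, so serialize as needed):

import redis

r = redis.Redis(host='localhost', port=6379)

def get_cached(key, fn, duration=300):
    value = r.get(key)  # returns None if the key is missing or has expired
    if value is None:
        value = fn()
        r.setex(key, duration, value)  # store with a TTL in seconds
    return value

print(get_cached('answer', lambda: 42))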

Solution 10:[10]

Look at gocept.cache on PyPI; it manages timeouts.

Solution 11:[11]

This project aims to provide "Caching for humans" (it seems to be fairly unknown, though).

Some info from the project page:

Installation

pip install cache

Usage:

from time import sleep

import pylibmc
from cache import Cache

backend = pylibmc.Client(["127.0.0.1"])

cache = Cache(backend)

@cache("mykey")
def some_expensive_method():
    sleep(10)
    return 42

# writes 42 to the cache
some_expensive_method()

# reads 42 from the cache
some_expensive_method()

# re-calculates and writes 42 to the cache
some_expensive_method.refresh()

# get the cached value or throw an error
# (unless default= was passed to @cache(...))
some_expensive_method.cached()

Solution 12:[12]

Look at bda.cache (http://pypi.python.org/pypi/bda.cache); it uses the ZCA and is tested with Zope and BFG.

Solution 13:[13]

ExpiringDict is another option:

https://pypi.org/project/expiringdict/
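
Its interface is basically the dict-with-TTL the question asks for; a small sketch (the helper and values are just illustrative):

from expiringdict import ExpiringDict

# Entries are evicted once the dict is full or after max_age_seconds.
cache = ExpiringDict(max_len=100, max_age_seconds=300)

def get_cached(key, fn):
    value = cache.get(key)
    if value is None:
        value = fn()
        cache[key] = value
    return value

print(get_cached('answer', lambda: 42))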