Test function with lru_cache decorator

I'm attempting to test a method that is memoized with lru_cache (it wraps an expensive database call), using pytest-mock.

A simplified version of the code is:

from functools import lru_cache


class User:

    def __init__(self, file):
        # load a file
        ...

    @lru_cache
    def get(self, user_id):
        # do expensive call
        ...

Then I'm testing:

class TestUser:

    def test_get_is_called(self, mocker):
        data = mocker.ANY
        user = User(data)
        user.get(data)
        open_mock = mocker.patch('builtins.open', mocker.mock_open())
        open_mock.assert_called_with('/foo')

But I'm getting the following error:

TypeError: unhashable type: '_ANY'

This happens because functools.lru_cache uses the call's arguments as cache keys, so they must be hashable, i.e. implement __hash__ (together with a consistent __eq__).
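
For example (as far as I can tell, pytest-mock's mocker.ANY is just unittest.mock.ANY), hashing it fails on its own:

from unittest.mock import ANY  # mocker.ANY appears to be this same sentinel

# _ANY defines __eq__ but no __hash__, so instances are unhashable
hash(ANY)  # TypeError: unhashable type: '_ANY'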

How can I mock this with pytest-mock so that the lru_cache-decorated method still works?

I've tried

user.__hash__.return_value = 'foo'

with no luck.



Solution 1:[1]

Instead of using mocker.ANY (an object intended to be used in assertions as a placeholder that compares equal to any object), I believe you want a sentinel object such as mocker.sentinel.DATA, which is a plain, hashable object.

This appears to work from a quick test:

from functools import lru_cache

@lru_cache(maxsize=None)
def f(x):
    return (x, x)


def test(mocker):
    ret = f(mocker.sentinel.DATA)
    assert ret == (mocker.sentinel.DATA, mocker.sentinel.DATA)
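
Applied to the User example from the question, a sketch might look like this (assuming __init__ opens the file path it is given; the sentinel stands in for the user id and, unlike ANY, is hashable):

class TestUser:

    def test_get_is_called(self, mocker):
        open_mock = mocker.patch('builtins.open', mocker.mock_open())
        user = User('/foo')
        user.get(mocker.sentinel.user_id)  # hashable, so lru_cache can store the key
        open_mock.assert_called_with('/foo')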

Solution 2:[2]

For people arriving here trying to work out how to test functions decorated with lru_cache or alru_cache, the answer is to clear the cache before each test.

This can be done as follows:

def setup_function():
    """
    Avoid the `(a)lru_cache` causing tests with identical parameters to interfere
    with one another.
    """
    my_cached_function.cache_clear()
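
As a fuller sketch of how this fits together (the names expensive_lookup and the two tests are made up for illustration), setup_function keeps tests with identical arguments independent:

from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_lookup(key):
    return object()  # stand-in for an expensive database call

def setup_function():
    # pytest calls this before every test function in the module
    expensive_lookup.cache_clear()

def test_result_is_cached_within_a_test():
    assert expensive_lookup("a") is expensive_lookup("a")

def test_cache_starts_empty_again():
    assert expensive_lookup.cache_info().currsize == 0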

Solution 3:[3]

How to switch off @lru_cache when running pytest

In case you ended up here because you want to test an @lru_cache-decorated function with different mocking per call (but the cache keeps serving the first result and so defeats your mocking), just set the maxsize of the @lru_cache to 0 when running under pytest:

@lru_cache(maxsize=0 if "pytest" in sys.modules else 256)

Minimal working example

Here @lru_cache is active (maxsize=256) in a normal run and deactivated (maxsize=0) when the module runs under pytest:

import sys
from functools import lru_cache

@lru_cache(maxsize=0 if "pytest" in sys.modules else 256)
def fct_parent():
    return fct_child()

def fct_child():
    return "unmocked"


def test_mock_lru_cache_internal(monkeypatch):
    """This test fails if @lru_cache of fct_parent is active and succeeds otherwise"""
    print(f"{fct_parent.cache_info().maxsize=}")
    for ii in range(2):
        ret_val = f"mocked {ii}"
        with monkeypatch.context() as mpc:
            mpc.setattr(f"{__name__}.fct_child", lambda: ret_val)  # mocks fct_child to return ret_val
            assert fct_parent() == ret_val

if __name__ == "__main__":
    """
    This module is designed to fail if called by python
        $ python test_lru_cache_mocking.py
    and to work if executed by pytest
        $ pytest -s test_lru_cache_mocking.py

    The reason is that the size of the lru_cache is 256 / 0 respectively,
    and hence test_mock_lru_cache_internal fails / succeeds.
    """
    from _pytest.monkeypatch import MonkeyPatch
    test_mock_lru_cache_internal(MonkeyPatch())

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Anthony Sottile
Solution 2: LondonRob
Solution 3: Markus Dutschke