In Python, you can use caching to store the results of expensive function calls and reuse them when the function is called again with the same arguments. This makes your code more performant.
Python provides built-in support for caching via the functools module: the @cache and @lru_cache decorators. In this tutorial, we'll learn how to cache function calls using both.
Why is caching useful?
Caching function calls can significantly improve the performance of your code. Here are some reasons why caching function calls can be beneficial:
- Performance improvement: When a function is called with the same arguments multiple times, caching the result can eliminate redundant calculations. Instead of recalculating the result each time, the cached value can be returned, resulting in faster execution.
- Reduced resource usage: Some function calls may be computationally intensive or require significant resources (such as database queries or network requests). Caching the results reduces the need to repeat these operations.
- Improved responsiveness: In applications where responsiveness is crucial, such as web servers or GUI applications, caching can help reduce latency by avoiding repeated calculations or I/O operations.
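Before reaching for the decorators, it helps to see what caching boils down to. Here is a minimal hand-rolled sketch (the make_cached helper and square function are made up for illustration); the functools decorators automate exactly this bookkeeping:

```python
def make_cached(func):
    results = {}  # maps an argument to its previously computed result

    def wrapper(n):
        if n not in results:
            results[n] = func(n)  # compute once, then reuse
        return results[n]

    return wrapper

def square(n):
    return n * n

cached_square = make_cached(square)
print(cached_square(4))  # computes 16 and stores it
print(cached_square(4))  # returns the stored 16 without calling square again
```

The decorators we cover next do the same thing, but with a tested implementation, thread safety, and cache statistics built in.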
Now let's get to the coding.
Caching with the @cache decorator
Let's code a function that calculates the nth Fibonacci number. Here is the recursive implementation of the Fibonacci sequence:
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)
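To see how much redundant work this version does, we can add a call counter (the call_count variable is purely for illustration):

```python
call_count = 0

def fibonacci(n):
    global call_count
    call_count += 1  # count every invocation, including recursive ones
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

print(fibonacci(10))  # 55
print(call_count)     # 177 — the function runs 177 times just for n = 10
```

The call count grows exponentially with n, even though there are only eleven distinct subproblems for n = 10.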
Without caching, the recursive calls result in redundant calculations: the same Fibonacci numbers are computed over and over. If the values are cached, looking them up is much more efficient. For this you can use the @cache decorator.

The @cache decorator from the functools module (available in Python 3.9+) is used to cache the results of a function. It works by storing the results of expensive function calls and reusing them when the function is called with the same arguments. Now let's wrap the function with the @cache decorator:
from functools import cache

@cache
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)
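Because @cache is built on the same machinery as lru_cache, the wrapped function also exposes cache_info() and cache_clear(). We can use them to confirm the cache is doing its job:

```python
from functools import cache

@cache
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

fibonacci(10)
print(fibonacci.cache_info())
# CacheInfo(hits=8, misses=11, maxsize=None, currsize=11)

fibonacci.cache_clear()  # empty the cache and reset the statistics
```

Each of the eleven distinct values (n = 0 through 10) is computed exactly once (a miss); every repeated lookup is a hit.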
We'll get to the performance comparison later. Now let's look at another way to cache function return values: the @lru_cache decorator.
Caching with the @lru_cache decorator
You can also use the built-in functools.lru_cache decorator for caching. It implements a Least Recently Used (LRU) caching mechanism for function calls: when the cache is full and a new item needs to be added, the least recently used item is removed to make room. This keeps the most recently used items in the cache while discarding those that haven't been used in a while.

The @lru_cache decorator is similar to @cache but lets you cap the cache size via the maxsize argument. Once the cache reaches this size, the least recently used items are discarded. This is useful if you want to limit memory usage.
Here, the fibonacci function caches up to the 7 most recently computed values:
from functools import lru_cache

@lru_cache(maxsize=7)  # Cache up to 7 most recent results
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

fibonacci(5)  # Computes Fibonacci(5) and caches intermediate results
fibonacci(3)  # Retrieves Fibonacci(3) from the cache
Here, the fibonacci function is decorated with @lru_cache(maxsize=7), specifying that it should cache up to the 7 most recent results. When fibonacci(5) is called, the results for fibonacci(4), fibonacci(3), and fibonacci(2) are cached. When fibonacci(3) is called afterwards, its result is retrieved from the cache, as it was one of the seven most recently computed values, avoiding a redundant calculation.
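A deliberately tiny cache makes the eviction behavior easy to observe. The double function below is a made-up toy example for this demo:

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # tiny cache so eviction kicks in quickly
def double(n):
    return n * 2

double(1)  # miss: cached
double(2)  # miss: cached
double(2)  # hit: served from the cache
double(3)  # miss: cache is full, evicts the least recently used entry (1)
double(1)  # miss again — 1 was evicted

print(double.cache_info())
# CacheInfo(hits=1, misses=4, maxsize=2, currsize=2)
```

Note that calling double(2) the second time refreshes its position, which is why 1, not 2, is the entry evicted when double(3) arrives.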
Timing the function calls for comparison
Now let's compare the execution times of the functions with and without caching. In this example, we don't set an explicit value for maxsize, so maxsize defaults to 128:
from functools import cache, lru_cache
import timeit

# without caching
def fibonacci_no_cache(n):
    if n <= 1:
        return n
    return fibonacci_no_cache(n-1) + fibonacci_no_cache(n-2)

# with cache
@cache
def fibonacci_cache(n):
    if n <= 1:
        return n
    return fibonacci_cache(n-1) + fibonacci_cache(n-2)

# with LRU cache
@lru_cache
def fibonacci_lru_cache(n):
    if n <= 1:
        return n
    return fibonacci_lru_cache(n-1) + fibonacci_lru_cache(n-2)
To compare the execution times, we will use the timeit function from the timeit module:
# Compute the n-th Fibonacci number
n = 35
no_cache_time = timeit.timeit(lambda: fibonacci_no_cache(n), number=1)
cache_time = timeit.timeit(lambda: fibonacci_cache(n), number=1)
lru_cache_time = timeit.timeit(lambda: fibonacci_lru_cache(n), number=1)
print(f"Time without cache: {no_cache_time:.6f} seconds")
print(f"Time with cache: {cache_time:.6f} seconds")
print(f"Time with LRU cache: {lru_cache_time:.6f} seconds")
Running the code above should give a similar result:
Output >>>
Time without cache: 2.373220 seconds
Time with cache: 0.000029 seconds
Time with LRU cache: 0.000017 seconds
We observe a significant difference in execution times: the call without caching takes much longer, especially for larger values of n, while the cached versions (both @cache and @lru_cache) run much faster and have comparable runtimes.
Wrapping up
Using the @cache and @lru_cache decorators can significantly speed up functions that involve expensive calculations or recursive calls. You can find the full code on GitHub.
If you are looking for a comprehensive guide on best practices for using Python for data science, read 5 Python Best Practices for Data Science.
Bala Priya C. is a developer and technical writer from India. She enjoys working at the intersection of mathematics, programming, data science, and content creation. Her areas of interest and expertise include DevOps, data science, and natural language processing. She likes reading, writing, coding, and coffee! Currently, she is working on learning and sharing her knowledge with the developer community by creating tutorials, how-to guides, opinion pieces, and more. Bala also creates engaging resource overviews and coding tutorials.