tlc.core.lru_cache
Module Contents¶
Classes¶
Class | Description
---|---
LRUCache | LRU cache where you can control how many slots are available, the maximum memory to use for the cache, and a time-out for cached items.
Functions¶
Function | Description
---|---
lru_cache | Decorator to add an LRU cache to a function.
Data¶
Data | Description
---|---
LRUFuncCache |
API¶
- class tlc.core.lru_cache.LRUCache(max_entries: int, max_memory: int = 0, time_out: float = 0.0)¶
LRU cache where you can control how many slots are available, the maximum memory to use for the cache, and a time-out for cached items.
- Parameters:
max_entries – The maximum number of entries the cache can hold.
max_memory – The maximum memory to use for the cache, in bytes. If set to 0, the cache will not use memory limits.
time_out – The time out for the items in the cache, in seconds. If set to 0, the items will never expire.
The stats() method will return a dictionary of important statistics about the cache. The clear() method will clear the cache and reset all statistics.
- LRUEntry = namedtuple(...)¶
- get(key: collections.abc.Hashable) -> Any¶
- set(key: collections.abc.Hashable, value: object) -> None¶
- delete(key: collections.abc.Hashable) -> None¶
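The eviction and expiry semantics described above can be sketched with a minimal, illustrative implementation. This is not tlc's actual code: the class name MiniLRUCache is hypothetical, and memory accounting (max_memory) is omitted for brevity.

```python
import time
from collections import OrderedDict
from collections.abc import Hashable
from typing import Any


class MiniLRUCache:
    """Illustrative LRU cache with a slot limit and per-item time-out.

    Sketch only; memory limits (max_memory) are intentionally omitted.
    """

    def __init__(self, max_entries: int, time_out: float = 0.0):
        self._max_entries = max_entries
        self._time_out = time_out
        self._entries: "OrderedDict[Hashable, tuple[float, Any]]" = OrderedDict()
        self._hits = 0
        self._misses = 0

    def get(self, key: Hashable) -> Any:
        entry = self._entries.get(key)
        if entry is not None:
            inserted_at, value = entry
            # A time_out of 0 means items never expire.
            if self._time_out == 0.0 or time.monotonic() - inserted_at < self._time_out:
                self._entries.move_to_end(key)  # mark as most recently used
                self._hits += 1
                return value
            del self._entries[key]  # expired
        self._misses += 1
        return None

    def set(self, key: Hashable, value: object) -> None:
        self._entries[key] = (time.monotonic(), value)
        self._entries.move_to_end(key)
        if len(self._entries) > self._max_entries:
            self._entries.popitem(last=False)  # evict least recently used

    def delete(self, key: Hashable) -> None:
        self._entries.pop(key, None)

    def stats(self) -> dict:
        return {"hits": self._hits, "misses": self._misses,
                "entries": len(self._entries)}

    def clear(self) -> None:
        self._entries.clear()
        self._hits = self._misses = 0
```

For example, with max_entries=2, inserting a third item evicts whichever of the first two was least recently used.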
- tlc.core.lru_cache.lru_cache(max_entries: int, max_memory_in_bytes: int, time_threshold_in_seconds: float, time_out_in_seconds: float = 0.0) -> Callable¶
Decorator to add an LRU cache to a function.
The decorator controls the number of cache slots (max_entries) and how much memory to use for cached elements (max_memory_in_bytes).
In addition, time_threshold_in_seconds sets how long a function execution must take before the result is cached. This avoids caching results that are fast to compute or retrieve, reserving the cache for slower items.
The time_out_in_seconds parameter sets how long each cached item remains valid. If set to 0, the items will never expire.
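The time-threshold behaviour can be sketched as a decorator that times each call and only caches results that took long enough to be worth keeping. This is an illustrative assumption about the mechanism, not tlc's implementation; the name mini_lru_cache is hypothetical, memory accounting is omitted, and only positional arguments are handled.

```python
import time
from collections import OrderedDict
from functools import wraps
from typing import Any, Callable


def mini_lru_cache(max_entries: int, time_threshold_in_seconds: float = 0.0,
                   time_out_in_seconds: float = 0.0) -> Callable:
    """Illustrative sketch: cache only calls slower than the threshold."""

    def decorator(func: Callable) -> Callable:
        entries: "OrderedDict[Any, tuple[float, Any]]" = OrderedDict()

        @wraps(func)
        def wrapper(*args: Any) -> Any:
            key = args  # positional arguments only, for brevity
            if key in entries:
                inserted_at, value = entries[key]
                # A time_out of 0 means cached items never expire.
                if (time_out_in_seconds == 0.0
                        or time.monotonic() - inserted_at < time_out_in_seconds):
                    entries.move_to_end(key)  # mark as most recently used
                    return value
                del entries[key]  # expired

            start = time.perf_counter()
            result = func(*args)
            elapsed = time.perf_counter() - start

            # Only cache results that were slow enough to compute.
            if elapsed >= time_threshold_in_seconds:
                entries[key] = (time.monotonic(), result)
                if len(entries) > max_entries:
                    entries.popitem(last=False)  # evict least recently used
            return result

        return wrapper

    return decorator
```

With the default threshold of 0.0 every result is cached, so a repeated call with the same arguments does not re-run the function.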
- tlc.core.lru_cache.LRUFuncCache = None¶