tlc.service.tlc_lrucache¶
Module Contents¶
Classes¶
Class | Description |
---|---|
LRUCacheStoreConfig | LRUCache backend configuration. |
LRUCacheStore | In-memory LRU cache backend. |
LRUCache | LRU cache where you can control how many slots are available, maximum memory to use for the cache, and a cache time out for the items. |
Functions¶
Function | Description |
---|---|
LRUFuncCache | Decorator to add an LRU cache to a function. |
API¶
- class tlc.service.tlc_lrucache.LRUCacheStoreConfig(/, **data: typing.Any)¶
Bases: pydantic.BaseModel
LRUCache backend configuration.
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model. self is explicitly positional-only to allow self as a field name.
- class tlc.service.tlc_lrucache.LRUCacheStore(config: tlc.service.tlc_lrucache.LRUCacheStoreConfig)¶
Bases: litestar.stores.base.Store
In-memory LRU cache backend.
Initialize LRUCacheBackend.
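A minimal sketch of constructing and using the store, assuming LRUCacheStore implements the async Store interface it inherits from litestar. The LRUCacheStoreConfig field names used here (max_entries, max_memory_in_bytes) are illustrative placeholders, since the config model's fields are not listed in this reference.

```python
import asyncio

from tlc.service.tlc_lrucache import LRUCacheStore, LRUCacheStoreConfig


async def main() -> None:
    # Hypothetical config fields -- consult LRUCacheStoreConfig for the actual
    # fields the pydantic model validates.
    config = LRUCacheStoreConfig(max_entries=256, max_memory_in_bytes=16 * 1024 * 1024)
    store = LRUCacheStore(config)

    # litestar's Store interface is async and stores byte values.
    await store.set("session:42", b"payload")
    cached = await store.get("session:42")
    print(cached)


asyncio.run(main())
```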
- tlc.service.tlc_lrucache.LRUFuncCache(max_entries: int, max_memory_in_bytes: int, time_threshold_in_seconds: float, time_out_in_seconds: float = 0.0) → Callable¶
Decorator to add an LRU cache to a function.
The decorator can control the number of cache slots (max_entries) and how much memory the cached elements may use (max_memory_in_bytes).
In addition, the decorator can set how long a function execution must take before its result is cached (time_threshold_in_seconds), to avoid caching results that are fast to compute or retrieve, so the cache is only used for slower items.
The time_out_in_seconds parameter sets how long each cached item remains valid. If set to 0, the items will never expire.
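A usage sketch based on the signature above; the decorated function and the particular limit values are illustrative only.

```python
from tlc.service.tlc_lrucache import LRUFuncCache


# Cache up to 128 results, cap cached data at roughly 32 MB, only cache calls
# that take longer than 0.5 s, and let cached entries expire after 10 minutes.
@LRUFuncCache(
    max_entries=128,
    max_memory_in_bytes=32 * 1024 * 1024,
    time_threshold_in_seconds=0.5,
    time_out_in_seconds=600.0,
)
def load_report(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()
```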
- class tlc.service.tlc_lrucache.LRUCache(max_entries: int, max_memory: int = 0, time_out: float = 0.0)¶
LRU cache where you can control how many slots are available, maximum memory to use for the cache, and a cache time out for the items.
- Parameters:
max_entries – The maximum number of entries the cache can hold.
max_memory – The maximum memory to use for the cache, in bytes. If set to 0, the cache will not use memory limits.
time_out – The time out for the items in the cache, in seconds. If set to 0, the items will never expire.
The stats() method returns a dictionary of statistics about the cache, and the clear() method empties the cache and resets all statistics. A usage sketch follows the member listing below.
- LRUEntry = namedtuple(...)¶
- get(key: collections.abc.Hashable) → Any¶
- set(key: collections.abc.Hashable, value: object) → None¶
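A small usage sketch of the class described above; the eviction behaviour noted in the comments follows from the LRU description, while the exact contents of the stats() dictionary are not specified in this reference.

```python
from tlc.service.tlc_lrucache import LRUCache

# Two slots, no memory limit, no expiry (the 0 values described above).
cache = LRUCache(max_entries=2, max_memory=0, time_out=0.0)

cache.set("a", 1)
cache.set("b", 2)
_ = cache.get("a")   # touching "a" marks it as most recently used
cache.set("c", 3)    # with only two slots, the least recently used entry is evicted

print(cache.stats())  # dictionary of cache statistics
cache.clear()         # empty the cache and reset the statistics
```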