tlc.service.tlc_lrucache#

Module Contents#

Classes#

LRUCacheStoreConfig – LRUCache backend configuration.

LRUCacheStore – In-memory LRU cache backend.

LRUCache – LRU cache with control over the number of available slots, the maximum memory used by the cache, and a time-out for cached items.

Functions#

LRUFuncCache – Decorator to add an LRU cache to a function.

API#

class tlc.service.tlc_lrucache.LRUCacheStoreConfig(/, **data: typing.Any)#

Bases: pydantic.BaseModel

LRUCache backend configuration.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

max_entries: int = 50000#

Maximum number of entries (slots) the cache can hold

max_memory_in_bytes: int = 1073741824#

Maximum memory to use for the cache, in bytes

time_out_in_seconds: float = 3600#

Time-out for cached items, in seconds
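
As a minimal sketch (assuming the import path documented above), the configuration can be constructed with its defaults or with explicit field values, which pydantic validates:

```python
from tlc.service.tlc_lrucache import LRUCacheStoreConfig

# Defaults documented above: 50000 entries, 1073741824 bytes (1 GiB), 3600 s time-out.
default_config = LRUCacheStoreConfig()

# Override individual fields as keyword arguments; pydantic validates the input.
small_config = LRUCacheStoreConfig(
    max_entries=1_000,
    max_memory_in_bytes=64 * 1024 * 1024,  # 64 MiB
    time_out_in_seconds=300.0,             # 5 minutes
)
```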

class tlc.service.tlc_lrucache.LRUCacheStore(config: tlc.service.tlc_lrucache.LRUCacheStoreConfig)#

Bases: litestar.stores.base.Store

In-memory LRU cache backend.

Initialize the LRUCacheStore with the given configuration

stats() dict[str, int]#
async get(key: str, renew_for: int | datetime.timedelta | None = None) bytes | None#
async set(key: str, value: str | bytes, expires_in: int | datetime.timedelta | None = None) None#
async delete(key: str) None#
async delete_all() None#
async exists(key: str) bool#
async expires_in(key: str) int | None#

Get the time in seconds until key expires. If no such key exists or no expiry time was set, return None.
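
A usage sketch of the async Store interface listed above; the method names and signatures are taken from this listing, while the keys and values are purely illustrative:

```python
import asyncio

from tlc.service.tlc_lrucache import LRUCacheStore, LRUCacheStoreConfig


async def main() -> None:
    store = LRUCacheStore(LRUCacheStoreConfig(max_entries=100))

    # Values are stored as bytes; set() also accepts str.
    await store.set("greeting", b"hello", expires_in=60)

    value = await store.get("greeting")             # b"hello", or None if missing/expired
    remaining = await store.expires_in("greeting")  # seconds until expiry, or None
    print(value, remaining, store.stats())

    await store.delete("greeting")
    assert not await store.exists("greeting")


asyncio.run(main())
```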

tlc.service.tlc_lrucache.LRUFuncCache(max_entries: int, max_memory_in_bytes: int, time_threshold_in_seconds: float, time_out_in_seconds: float = 0.0) Callable#

Decorator to add an LRU cache to a function.

The decorator controls the number of cache slots (max_entries) and how much memory to use for cached elements (max_memory_in_bytes).

In addition, the decorator can set how long a function call must take before its result is cached (time_threshold_in_seconds). This avoids caching results that are fast to compute or retrieve, so the cache is used only for slower items.

The time_out_in_seconds parameter sets how long each cached item remains valid. If set to 0, items never expire.
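
A sketch of decorating a slow function, using the parameter names from the signature above; load_report is a hypothetical function, and the chosen values are illustrative only:

```python
from tlc.service.tlc_lrucache import LRUFuncCache


@LRUFuncCache(
    max_entries=256,
    max_memory_in_bytes=16 * 1024 * 1024,  # 16 MiB budget for cached results
    time_threshold_in_seconds=0.1,         # only cache calls slower than 100 ms
    time_out_in_seconds=600.0,             # cached results expire after 10 minutes
)
def load_report(path: str) -> bytes:
    # Hypothetical slow operation worth caching.
    with open(path, "rb") as f:
        return f.read()
```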

class tlc.service.tlc_lrucache.LRUCache(max_entries: int, max_memory: int = 0, time_out: float = 0.0)#

LRU cache with control over the number of available slots, the maximum memory used by the cache, and a time-out for cached items.

Parameters:
  • max_entries – The maximum number of entries the cache can hold.

  • max_memory – The maximum memory to use for the cache, in bytes. If set to 0, the cache will not use memory limits.

  • time_out – The time out for the items in the cache, in seconds. If set to 0, the items will never expire.

The stats() method will return a dictionary of important statistics about the cache. The clear() method will clear the cache and reset all statistics.

LRUEntry = namedtuple(...)#
clear() None#
get(key: Hashable) Any#
set(key: Hashable, value: object) None#
delete(key: Any) None#
remove_oldest_item() None#
expires_in(key: str) int | None#
stats() dict[str, int]#
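
A brief sketch of using LRUCache directly, based on the constructor parameters and methods listed above; the keys and values are illustrative:

```python
from tlc.service.tlc_lrucache import LRUCache

cache = LRUCache(max_entries=2, max_memory=0, time_out=30.0)

cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3)             # exceeds max_entries, so the least recently used entry is evicted

print(cache.get("c"))         # 3
print(cache.expires_in("c"))  # seconds until "c" expires, or None
print(cache.stats())          # dictionary of cache statistics

cache.clear()                 # empty the cache and reset all statistics
```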