Quick Answer
Caching in Python: use functools.lru_cache for in-memory function-result caching, a plain dict for simple key-value caching, and the cachetools library for LRU, TTL, and other eviction policies. For distributed caching shared across processes, use Redis (via redis-py) or Memcached; Django and Flask provide caching backends for both. Pair the cache with an invalidation strategy: TTL-based expiry, event-based invalidation, or versioned cache keys.
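As a minimal sketch of the first option above, functools.lru_cache memoizes a function's results keyed by its arguments, so repeated calls with the same inputs return the cached value instead of recomputing. The fib function here is just an illustrative example:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n: int) -> int:
    # Repeated sub-calls hit the cache instead of recursing again.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))           # -> 832040, computed with each fib(k) evaluated once
print(fib.cache_info())  # hit/miss statistics exposed by the decorator
```

maxsize bounds the cache and evicts the least recently used entry when full; fib.cache_clear() resets it, which is the simplest invalidation hook lru_cache offers.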
Answer
Use functools.lru_cache or a dictionary for in-memory caching within a single process, and Redis or Memcached for distributed caching shared across processes. Caching improves performance by reusing the results of repeated expensive calls instead of recomputing them.
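TTL-based expiry, mentioned in the quick answer, can be sketched with only the standard library; the TTLCache class below is a hand-rolled illustration (cachetools.TTLCache provides a production-ready equivalent):

```python
import time

class TTLCache:
    """Minimal TTL cache: an entry expires ttl seconds after insertion."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires = entry
        if time.monotonic() >= expires:
            del self._store[key]  # lazily evict the stale entry on read
            return default
        return value

cache = TTLCache(ttl=0.05)
cache.set("user:1", {"name": "Ada"})
print(cache.get("user:1"))  # fresh entry -> cached value
time.sleep(0.06)
print(cache.get("user:1"))  # past its TTL -> None
```

The same idea underlies Redis's EXPIRE/SETEX commands, where the server rather than the client tracks the expiry time.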
SugharaIQ Editorial Team
Verified Answer
This answer has been peer-reviewed by industry experts holding senior engineering roles to ensure technical accuracy and relevance for modern interview standards.