langsmith.prompt_cache.PromptCache

Class · Since v0.7

PromptCache

PromptCache(
    self,
    *,
    max_size: int = DEFAULT_PROMPT_CACHE_MAX_SIZE,
    ttl_seconds: Optional[float] = DEFAULT_PROMPT_CACHE_TTL_SECONDS,
    refresh_interval_seconds: float = DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS,
)

Bases

_BasePromptCache

Parameters

max_size : int
    Default: DEFAULT_PROMPT_CACHE_MAX_SIZE
    Maximum entries in the cache (LRU eviction when exceeded).

ttl_seconds : Optional[float]
    Default: DEFAULT_PROMPT_CACHE_TTL_SECONDS
    Time before an entry is considered stale. Set to None for an infinite TTL (offline mode; entries never expire). Default: 300 (5 minutes).

refresh_interval_seconds : float
    Default: DEFAULT_PROMPT_CACHE_REFRESH_INTERVAL_SECONDS
    How often to check for stale entries.
constructor

__init__

max_size : int
ttl_seconds : Optional[float]
refresh_interval_seconds : float
method
set

Set a value in the cache.

method
stop

Stop background refresh thread.

Should be called when the client is being cleaned up.

method
shutdown

Stop background refresh thread.

Should be called when the client is being cleaned up.
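The stop/shutdown behavior above can be sketched with a minimal, hypothetical refresh worker (the class name, callback, and interval parameter here are illustrative assumptions, not the library's internals); the key idea is using a `threading.Event` both as an interruptible sleep and as the stop signal:

```python
import threading
import time

class RefreshWorker:
    """Sketch of a stoppable background refresh thread (illustrative only)."""

    def __init__(self, refresh, interval_seconds=60.0):
        self._refresh = refresh
        self._interval = interval_seconds
        self._stop_event = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        # Event.wait doubles as an interruptible sleep: it returns True
        # (ending the loop) as soon as stop() sets the event, and False
        # on timeout, in which case we run one refresh pass.
        while not self._stop_event.wait(self._interval):
            self._refresh()

    def stop(self):
        """Stop the background thread; call during client cleanup."""
        self._stop_event.set()
        self._thread.join()

ticks = []
worker = RefreshWorker(lambda: ticks.append(1), interval_seconds=0.01)
time.sleep(0.1)
worker.stop()
```

After `stop()` returns, the thread has fully exited, which is why it is safe to call during cleanup.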

method
configure

Reconfigure the cache parameters.
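One plausible shape for thread-safe reconfiguration is to swap parameters under a lock and evict immediately if max_size shrank; this is a hypothetical sketch (the class, field names, and sentinel are assumptions, not the library's code):

```python
import threading

class Reconfigurable:
    """Sketch: thread-safe parameter reconfiguration (illustrative only)."""

    def __init__(self, max_size=100, ttl_seconds=300.0):
        self._lock = threading.Lock()
        self._max_size = max_size
        self._ttl = ttl_seconds
        self._entries = []  # stand-in for cached entries, oldest first

    def configure(self, *, max_size=None, ttl_seconds=...):
        # The ... sentinel distinguishes "not passed" from an explicit
        # ttl_seconds=None (which means entries never expire).
        with self._lock:
            if max_size is not None:
                self._max_size = max_size
                # Shrinking max_size evicts the oldest entries immediately.
                del self._entries[: max(0, len(self._entries) - self._max_size)]
            if ttl_seconds is not ...:
                self._ttl = ttl_seconds

c = Reconfigurable(max_size=5)
c._entries = [0, 1, 2, 3, 4]
c.configure(max_size=2, ttl_seconds=None)
```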

Thread-safe LRU cache with background thread refresh.

For use with the synchronous Client.

Features:

  • In-memory LRU cache with configurable max size
  • Background thread for refreshing stale entries
  • Stale-while-revalidate: returns stale data while refresh happens
  • Thread-safe for concurrent access

Example:

>>> def fetch_prompt(key: str) -> PromptCommit:
...     return client._fetch_prompt_from_api(key)
>>> cache = PromptCache(
...     max_size=100,
...     ttl_seconds=3600,
...     fetch_func=fetch_prompt,
... )
>>> cache.set("my-prompt:latest", prompt_commit)
>>> cached = cache.get("my-prompt:latest")
>>> cache.shutdown()