Prompt Caching [Anthropic]

Medium · 150 pts · 0 solves
[Anthropic-specific] Claude's prompt caching can cut costs by up to 90% by not reprocessing the same system prompt on every call. Note: OpenAI also offers prompt caching, but it uses automatic prefix detection rather than explicit breakpoints; always check your provider's docs. What part of the prompt does Anthropic cache? Flag format: CONGRESS{[cached_part]} Example: CONGRESS{full_conversation}
Hint
The part that doesn't change between requests. See docs.anthropic.com/en/docs/build-with-claude/prompt-caching.
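To make the "explicit breakpoints" idea concrete, here is a minimal sketch of how a cache breakpoint is declared in an Anthropic Messages API request. It only builds the request payload as a plain dict (sending it would require the `anthropic` SDK and an API key); the model name and system prompt text are placeholder assumptions.

```python
# Sketch: declaring an explicit cache breakpoint on the static part of
# the prompt. Only the payload is constructed here; no API call is made.

# Placeholder for a long, unchanging system prompt (the cacheable part).
STATIC_SYSTEM_PROMPT = "You are a helpful assistant for the CTF platform."

def build_request(user_message: str) -> dict:
    """Return a Messages API payload with a cache breakpoint on the
    static system prompt, so only the user turn varies per request."""
    return {
        "model": "claude-3-5-sonnet-20241022",  # assumed model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": STATIC_SYSTEM_PROMPT,
                # Everything up to this breakpoint is cached and reused
                # on subsequent calls instead of being reprocessed.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }

req = build_request("What is the flag?")
print(req["system"][0]["cache_control"]["type"])  # → ephemeral
```

The key point for the challenge: the breakpoint sits on the prompt part that does not change between requests, while the `messages` turns remain uncached.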