120K Token-Stop Configuration
- Entity ID: ent-20260419-a020c1d2e3f4
- Type: pattern
- Scope: shared
- Status: active
- Aliases: token stop, pre-compaction cap
Description
The most-upvoted configuration tip on r/ClaudeAI after the leak: cap context at 120K tokens instead of the full 200K, so the compaction cascade, which fires at 83% of the 200K window (166K tokens), never triggers. Compaction issues a hidden full-context API call on top of the user's turn, so staying below the trigger avoids the doubled billing event. Derived directly from the leaked autoCompact.ts thresholds.
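A minimal sketch of what such a client-side guard could look like, assuming the 166K trigger implied by the leaked 83% figure. The `Session` shape, `estimateTokens`, and `tryAppendTurn` are hypothetical illustrations, not part of any real Claude client or API:

```typescript
// Hypothetical client-side token-stop guard; not Claude's actual client code.
const CONTEXT_WINDOW = 200_000;
const COMPACT_TRIGGER = Math.floor(CONTEXT_WINDOW * 0.83); // 166,000 tokens
const TOKEN_STOP = 120_000; // the cap this pattern recommends

// Crude estimate (~4 characters per token); a real client would use a tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

interface Session {
  history: string[];
  tokens: number;
}

/**
 * Returns true if the turn was accepted. A false return means the session
 * hit the 120K stop; the caller should start a fresh session rather than
 * let auto-compaction fire its hidden full-context call at 166K.
 */
function tryAppendTurn(session: Session, turn: string): boolean {
  const cost = estimateTokens(turn);
  if (session.tokens + cost > TOKEN_STOP) {
    return false; // stop well before COMPACT_TRIGGER is reachable
  }
  session.history.push(turn);
  session.tokens += cost;
  return true;
}
```

Stopping well under 166K rather than just beneath it leaves headroom for the model's reply, which also accumulates in the context window, and for error in the crude token estimate.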
Key claims
- The 120K token-stop cap is derived from the compaction trigger at 83% of the 200K window (worked out below)
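For reference, the arithmetic behind this claim, using the 83% figure attributed to the leaked autoCompact.ts:

$$200{,}000 \times 0.83 = 166{,}000 \;\text{tokens (compaction trigger)}$$
$$166{,}000 - 120{,}000 = 46{,}000 \;\text{tokens of headroom below the trigger}$$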
Relations
- 120K Token-Stop Configuration --[derived_from]--> autoCompact.ts