§ 1 Memory Decay Playground
How does information degrade across conversation turns? This module simulates the "forgetting curves" of six different memory architectures. Enter a fact and watch it fade or persist, depending on how each system manages its limited context window. The mathematics draw from Ebbinghaus's forgetting curve, adapted for the discrete, token-based world of LLMs.
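As a rough sketch of the adaptation, the classic Ebbinghaus retention formula R = e^(-t/S) can be reinterpreted with token distance in place of elapsed time. The function names, the stability constant, and the sliding-window variant below are illustrative assumptions, not the module's actual internals:

```python
import math

def retention(token_distance: float, stability: float = 100.0) -> float:
    """Ebbinghaus-style retention R = exp(-d/S), with the time axis
    replaced by token distance d and a per-architecture stability S
    (both values here are illustrative, not the module's constants)."""
    return math.exp(-token_distance / stability)

def sliding_window_retention(token_distance: float, window: int = 512) -> float:
    """A contrasting architecture: a hard sliding window forgets
    abruptly once the fact scrolls out of the context window."""
    return 1.0 if token_distance <= window else 0.0

for d in (0, 50, 200, 600):
    print(f"d={d:4d}  decay={retention(d):.3f}  "
          f"window={sliding_window_retention(d):.0f}")
```

The smooth curve degrades gradually, while the window variant holds at 1.0 and then drops to 0.0, which is why different architectures trace visibly different curves in the playground.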
§ 1.5 Validate Live: Does the LLM Actually Forget?
Test real memory decay by giving the LLM a fact, injecting filler conversation turns, then asking it to recall the fact. See how retention degrades as context distance increases.
The LLM will be asked to recall this fact after 0, 3, 7, 12, and 20 filler conversation turns.
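One way to sketch this protocol is to build the chat transcript for each trial: the fact, n filler exchanges, then the recall probe. The message format and the filler text are assumptions for illustration; an `ask_llm` call (not shown) would send each transcript to the model and score the reply against the fact:

```python
FILLER_COUNTS = [0, 3, 7, 12, 20]  # filler turns per trial, as in the module

def build_trial(fact: str, n_filler: int, recall_question: str) -> list[dict]:
    """Assemble one trial's transcript: state the fact, insert n_filler
    unrelated exchanges, then ask the recall question last."""
    messages = [
        {"role": "user", "content": f"Remember this fact: {fact}"},
        {"role": "assistant", "content": "Noted."},
    ]
    for i in range(n_filler):
        # Hypothetical filler; real trials would use varied, unrelated topics.
        messages.append({"role": "user",
                         "content": f"Unrelated filler question #{i + 1}."})
        messages.append({"role": "assistant",
                         "content": f"Filler answer #{i + 1}."})
    messages.append({"role": "user", "content": recall_question})
    return messages

trials = {n: build_trial("The capital of Kiribati is Tarawa.",
                         n, "What fact did I ask you to remember?")
          for n in FILLER_COUNTS}
```

Each filler turn adds one user/assistant exchange, so the fact sits 2n messages away from the recall probe, giving a controlled sweep of context distance.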