CRAFTFramework posted an update 13 days ago
Quick observation from working with Claude Cowork on multi-session projects.

**The constraint:** Cowork runs in a sandboxed VM that resets between sessions. No state persistence, no conversation carryover, no cross-session memory. Each session starts with zero context about prior work.

https://craftframework.ai/the-hidden-cost-of-session-amnesia-why-context-matters-more-than-you-think/


Within a single session, context compaction occurs as conversations approach the token limit — the system summarizes earlier exchanges, trading detail for capacity. Decisions made early in a session degrade to single-sentence summaries as the conversation grows.

**The cost:** For task-based work (summarize this, debug that), this is irrelevant. For cumulative work — projects that build over days or weeks — it creates significant overhead. Users spend meaningful time each session re-establishing context that the AI previously held.

**Common workarounds observed:**
- Manual context injection (copy-paste at session start)
- Single mega-sessions (avoid the reset, but in-session context still degrades through compaction)
- External state documents (maintained alongside the AI)

Each trades one form of overhead for another.
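The external-state-document workaround can be made mostly mechanical. A minimal sketch, assuming a hypothetical `PROJECT_STATE.md` file: decisions are appended at the end of each session, and a paste-ready preamble is assembled at the start of the next one, truncating the oldest entries when the doc outgrows a character budget.

```python
from pathlib import Path
from datetime import date

STATE_FILE = Path("PROJECT_STATE.md")  # hypothetical external state document

def append_decision(summary: str) -> None:
    """Log a decision to the state doc at the end of a session."""
    entry = f"- {date.today().isoformat()}: {summary}\n"
    with STATE_FILE.open("a", encoding="utf-8") as f:
        f.write(entry)

def session_preamble(max_chars: int = 4000) -> str:
    """Build a paste-ready context block for a new session.
    Keeps the most recent entries when the doc exceeds the budget."""
    if not STATE_FILE.exists():
        return "No prior state recorded."
    text = STATE_FILE.read_text(encoding="utf-8")
    if len(text) > max_chars:
        text = "(older entries truncated)\n" + text[-max_chars:]
    return "Context from prior sessions:\n" + text
```

This only shifts the overhead from recall to curation, which is the trade-off the list above describes: the human still decides what is worth writing down.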

**The architectural question:** Claude's chat interface (claude.ai) now has persistent memory — free for all users since March 2026. But the desktop environment still operates on a session-reset model. The gap between chat-based memory and desktop-based amnesia is growing.

This seems like a general problem for desktop AI tools, not just Cowork. How are others approaching session continuity in local AI environments?

Call me crazy, but the constraint did make me more deliberate about token usage and about how much of my work I was actively retaining between sessions. Was it annoying? Absolutely. Did those benefits outweigh the tokens lost? After a while, maybe.