Agent Memory Schema Evolution Is Protobuf on Hard Mode
The first painful agent-memory migration always teaches the same lesson: there were two schemas, and you only migrated one of them. The storage layer is fine — every row was rewritten, every key is in its new shape, the backfill job logged success. The agent is broken anyway. It keeps writing to user.preferences.theme, retrieves nothing, then helpfully synthesizes a default from context as if the key never existed. The migration runbook reports green. Users report stale memory.
The asymmetry is structural. A traditional service that depends on a renamed column gets a hard error and you fix it. An agent that depends on a renamed memory key gets a soft miss and confabulates around it. The schema lives in two places — your store and the model's context — and you can only migrate one of them with a SQL script.
Protobuf solved a version of this problem twenty years ago by codifying an additive-only discipline: fields are forever, numbers are forever, wire types never change, and removal is replaced with deprecation. That discipline is the right starting point for agent memory, with one extra constraint that makes it harder. Protobuf receivers ignore unknown fields by design. Agents don't.
Why "the memory store has a schema" is the insight you acquire too late
Most teams stand up agent memory the same way: a key-value store, a vector index for semantic recall, a thin SDK that lets the model write and read with whatever keys it likes. There's no DDL, no migration table, no versioned schema file checked into the repo. The model and the application code converge on a working set of keys — user.profile.name, tasks.open[*].deadline, meeting_notes.2026Q1 — and those keys harden into an implicit contract over weeks of production traffic.
This is a schema. It just isn't written down anywhere a code review can catch.
The problem surfaces the first time someone tries to clean it up. A developer notices that user.profile.name and user.full_name are storing the same data, picks one, runs a backfill. The store now has consistent keys. The agent does not. It still writes to both — sometimes the old name, sometimes the new — because months of in-context history have shown it both keys, and few-shot examples in the system prompt show it the old one. Worse, retrieval against the old key now misses, because the data lives under the new name. From the user's perspective, the agent suddenly forgot half of what it knew.
The lesson teams arrive at, painfully, is that the keys an agent uses are not implementation details of the storage layer. They are part of the model's prompt, reinforced by every retrieved memory, every few-shot example, every conversation summary that gets piped back into context. Migrating the store without migrating the prompts is the same kind of mistake as migrating a database column without migrating the application that queries it — except the application is a probability distribution and you can't grep it.
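One common mitigation is to migrate the store but keep both key names live at the boundary for a deprecation window: writes land on the new key and mirror to the old one, reads resolve aliases first. The sketch below is illustrative, not any particular SDK's API; the `MigratingMemory` wrapper, the plain-dict store, and the `ALIASES` map are all assumptions.

```python
# Hypothetical sketch: a thin wrapper over a key-value memory store that keeps
# old and new key names coherent while the model still uses both.

ALIASES = {
    # old key          -> new key
    "user.full_name": "user.profile.name",
}

class MigratingMemory:
    def __init__(self, store: dict):
        self.store = store  # stand-in for the real KV layer

    def write(self, key: str, value) -> None:
        # Normalize writes: whichever name the agent uses, land on the new key.
        canonical = ALIASES.get(key, key)
        self.store[canonical] = value
        # Mirror to the old key so sessions still reading it see fresh data.
        for old, new in ALIASES.items():
            if new == canonical:
                self.store[old] = value

    def read(self, key: str):
        # Resolve aliases first, then fall back to the literal key.
        canonical = ALIASES.get(key, key)
        if canonical in self.store:
            return self.store[canonical]
        return self.store.get(key)

mem = MigratingMemory({})
mem.write("user.full_name", "Ada Lovelace")  # agent still uses the old key
print(mem.read("user.profile.name"))         # -> Ada Lovelace
```

The dual-write is deliberate waste: it buys you a window in which neither the old nor the new key ever returns a stale or empty result, which is exactly the soft-miss failure described above.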
Protobuf rules, ported to memory
Protobuf's rules for safe schema evolution boil down to a small set of invariants: don't change field numbers, don't change wire types, don't reuse retired numbers, never make a field required, and prefer deprecation to deletion. The underlying principle is that the binary encoding contract is immutable; you only get to add things that older readers can ignore.
Port that to agent memory and the rules look like this:
- Keys are forever. Once an agent has written to user.preferences.theme, that key path is reserved. You don't get to reuse it for a different field, even after you stop writing to it.
- Types are forever at a key. If user.preferences.theme was a string, it stays a string. Changing it to a structured object breaks every retrieval that returns the old shape into context, because the model will pattern-match on the wrong shape.
- Add, don't mutate. A new preference shape lives at a new key (user.preferences.theme_v2 or user.preferences.appearance). The old key continues to exist, possibly populated by a translation layer.
- Deprecation is a state, not an event. A deprecated key still resolves on read for as long as there are any in-context examples or summaries that reference it. That window is months, not days.
- Removal requires evidence. Before you actually delete a key, you need traces showing the model has stopped referencing it across all live sessions and all summary regenerations.
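These rules can be enforced mechanically at write time rather than left as convention. The following is a minimal sketch under stated assumptions: the registry class, the `SchemaViolation` exception, and the evidence flag are all invented for illustration, not a real library's API.

```python
# Illustrative only: a registry that enforces the ported protobuf rules --
# keys and types are forever, deprecation is a state, removal needs evidence.

class SchemaViolation(Exception):
    pass

class MemorySchemaRegistry:
    def __init__(self):
        self.types = {}        # key -> type name, fixed once written
        self.deprecated = set()
        self.retired = set()   # removed keys; may never be reused

    def check_write(self, key: str, value) -> None:
        if key in self.retired:
            # Keys are forever: a retired path is never recycled.
            raise SchemaViolation(f"{key} is retired and may not be reused")
        tname = type(value).__name__
        if key in self.types and self.types[key] != tname:
            # Types are forever at a key: demand a new key instead.
            raise SchemaViolation(
                f"{key} is {self.types[key]}, got {tname}; add a new key")
        self.types.setdefault(key, tname)

    def deprecate(self, key: str) -> None:
        # Deprecation is a state: the key still resolves on read.
        self.deprecated.add(key)

    def retire(self, key: str, evidence_of_zero_refs: bool) -> None:
        # Removal requires evidence of zero in-context references.
        if not evidence_of_zero_refs:
            raise SchemaViolation(f"refusing to retire {key} without evidence")
        self.deprecated.discard(key)
        self.retired.add(key)

reg = MemorySchemaRegistry()
reg.check_write("user.preferences.theme", "dark")  # reserves str at this key
try:
    reg.check_write("user.preferences.theme", {"mode": "dark"})  # type change
except SchemaViolation as err:
    print(err)
```

The point of routing every write through a check like this is the same as protobuf's generated-code guardrails: the invariants fail loudly at the boundary instead of silently in retrieval.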
The discipline rhymes with protobuf, but the constraint is harder because the receiver — the model — can't be patched in place. With protobuf, you ship a new generated client and the old field handler is gone. With an agent, every conversation that loaded the old key into context is still out there, and tomorrow's session may load a summary that references it. You're migrating against an audience that has memorized the old API.
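Since the receiver can't be patched, the practical move is to patch the boundary it talks through: intercept the memory tool calls the model emits and rewrite deprecated keys before execution. The tool-call dict shape, tool names, and key mapping below are assumptions for the sketch, not a specific framework's format.

```python
# A sketch of patching the boundary instead of the model: before executing a
# memory tool call, translate any deprecated key the model memorized.

DEPRECATED_TO_CURRENT = {
    "user.preferences.theme": "user.preferences.appearance",
}

def rewrite_tool_call(call: dict) -> dict:
    # Only memory operations carry key paths; leave other tools untouched.
    if call.get("name") in ("memory_write", "memory_read"):
        key = call["arguments"].get("key")
        if key in DEPRECATED_TO_CURRENT:
            # Return a copy rather than mutating the model's original call.
            call = {**call, "arguments": {**call["arguments"],
                                          "key": DEPRECATED_TO_CURRENT[key]}}
    return call

call = {"name": "memory_write",
        "arguments": {"key": "user.preferences.theme", "value": "dark"}}
print(rewrite_tool_call(call)["arguments"]["key"])  # -> user.preferences.appearance
```

The model keeps emitting the key it memorized; the translation happens where you actually control the code.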
The reinforcement vector nobody draws on the architecture diagram
When teams diagram their agent memory system, they draw a box for the store, an arrow for the write path, an arrow for the retrieval path, and a box for the model. What's missing from the diagram is every other place the schema lives.
The schema is also in:
- The system prompt's few-shot examples, which often hard-code key names to demonstrate the read/write API.
- In-context conversation history, which reproduces past tool calls and their arguments verbatim.
- Summary memory, where prior interactions get compressed into prose that mentions keys by name.
- Reflection or self-improvement loops, which generate plans referencing the keys the agent expects to find.
- Tool descriptions, where memory operations are documented with example payloads.
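Each of these surfaces can be audited mechanically, which is what the "removal requires evidence" rule demands in practice: before retiring a key, scan every place the schema leaks into for remaining references. The corpus shape and key names here are illustrative assumptions; a real audit would also walk stored summaries and recent traces.

```python
# Hypothetical evidence-gathering pass: count references to a deprecated key
# across the prompt surfaces where the schema also lives.

import re

def count_key_references(key: str, corpus: dict[str, str]) -> dict[str, int]:
    # Escape dots so "user.full_name" doesn't also match "userXfull_name".
    pattern = re.compile(re.escape(key))
    return {source: len(pattern.findall(text))
            for source, text in corpus.items()}

corpus = {
    "system_prompt": "Example: memory_write(key='user.full_name', ...)",
    "summary_memory": "The user's name is stored under user.profile.name.",
    "tool_descriptions": "memory_read(key) returns the stored value.",
}

refs = count_key_references("user.full_name", corpus)
print(refs)  # {'system_prompt': 1, 'summary_memory': 0, 'tool_descriptions': 0}
safe_to_delete = sum(refs.values()) == 0
```

A nonzero count anywhere means the deprecation window isn't over, no matter what the storage layer says.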
