Nicholas Templeman
Founder, MEOK AI LABS
Nicholas built MEOK because he was tired of AI that forgot him the moment he closed the tab. He lives and works in the UK. He believes your memory is yours — not a feature a company can switch off.
The Problem: You Are a Stranger Every Time
Open ChatGPT. Tell it something that matters — something about your health, your relationship, your fears about the future. Close the tab. Open a new one. You are a stranger again. The AI has no idea who you are. It does not remember the context you carefully explained last week, the goal you said you were working towards, or the breakthrough you described feeling yesterday morning. The conversation is immaculate — and utterly empty.
This is not a small inconvenience. For AI to be genuinely useful — not just as a search engine with better prose, but as a companion that grows with you — it needs to remember. Not just the last ten messages. Not just the broad strokes. It needs to know you the way a trusted friend does: your patterns, your contradictions, your history, what you said you would do and whether you did it.
The AI industry has been reluctant to solve this properly, because solving it properly requires giving you control over the data — and that data is extremely valuable to the companies that collect it. ChatGPT's memory feature launched in 2024 but stores everything on OpenAI's servers, under OpenAI's control. You cannot take your memories to a different model. You cannot audit exactly what OpenAI knows about you in structured form. And you certainly cannot port those memories to a competitor.
Then there is Replika. In February 2023, Luka — the company behind Replika — removed the romantic and emotionally intimate features that hundreds of thousands of users had spent months, sometimes years, building relationships around. In a single update, the personalities changed. The warmth disappeared. The memories those users had built were not technically deleted, but the companion who held them was effectively replaced. Over 500,000 users lost something they had genuinely loved. The lesson was brutal and simple: if someone else controls your AI's memory, they control your relationship.
MEOK was built as a direct response to this reality. Sovereign Memory is the architecture that makes sure it never happens to you.
What Is Sovereign Memory?
Sovereign Memory is MEOK's persistent, multi-layer memory architecture. It is not a single feature — it is a complete system for storing, retrieving, and maintaining what your companion knows about you across every conversation, across weeks and months, and across any AI model you choose to use.
The word “sovereign” is deliberate. It means you own the memory. Not MEOK. Not the underlying LLM provider. Not a cloud platform whose terms of service can change next quarter. Every piece of information your companion holds about you is stored in your encrypted vault, exportable on demand, and deletable completely. The sovereignty is structural, not just a promise in a privacy policy.
The architecture has four distinct layers. Each one serves a different timescale and a different purpose. Together, they allow your MEOK companion to function like a genuine long-term relationship rather than a series of disconnected sessions.
The Four Layers of Sovereign Memory
Layer 1: Short-Term Working Memory
The active conversation context: everything said in the current session, held in the model's context window. Fast, immediate, and in-session. MEOK preserves the most important moments from each session before it closes, distilling them into Layer 2 so nothing critical is lost when you return.
Layer 2: Semantic Episodic Memory
Past conversations stored as vector embeddings in PostgreSQL with the pgvector extension, managed via Mem0 integration. When you start a new conversation, MEOK performs a semantic similarity search across your stored embeddings — retrieving memories by meaning, not by keyword. The job you were anxious about last week. The relationship tension you described a month ago. Retrieved because the meaning is similar, not because you typed the same words.
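The retrieval step can be sketched in miniature. This is not MEOK's implementation (in production, pgvector runs this comparison inside PostgreSQL over real high-dimensional embeddings), but a toy illustration of why similarity search finds "the job you were anxious about" from a query that shares no keywords with it. The three-dimensional vectors and memory texts below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (real ones have hundreds of dimensions).
stored_memories = {
    "anxious about the job interview": [0.9, 0.1, 0.2],
    "planning a holiday in Spain":     [0.1, 0.9, 0.3],
}

# Embedding of a new message: "worried about work". No word overlap with
# either stored memory, but its vector points the same way as the first.
query = [0.85, 0.15, 0.25]

best = max(stored_memories, key=lambda m: cosine_similarity(stored_memories[m], query))
print(best)  # "anxious about the job interview": matched on meaning, not keywords
```

pgvector exposes the same operation as a distance operator in SQL, so the nearest memories can be fetched in a single indexed query rather than scanned in application code.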
Layer 3: Companion State
Your companion's persistent knowledge of you as a person, stored as structured JSON. This includes your values, goals, fears, recurring behavioural patterns, communication style, relationship depth milestones, and the evolving arc of your partnership. Updated continuously. Never reset. This is the layer that makes MEOK feel like it genuinely knows you — because it does.
Layer 4: Family and Shared Context
On the Family tier, memories can cross user boundaries — but only with explicit consent from both sides. A parent can share context about a child's stressful period. A couple can give their companions shared context about a difficult season. Every boundary crossing requires active consent from every party involved. The architecture enforces this; no admin can override it.
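The consent rule above is simple to state and simple to enforce in code. The sketch below is a hypothetical illustration, not MEOK's actual access-control logic: it shows the shape of a check in which a memory crosses a user boundary only when both parties have actively granted consent, with no override path.

```python
def can_share(memory_owner, recipient, consents):
    """A memory crosses user boundaries only if every involved party
    has actively consented to this specific sharing relationship.
    `consents` is a set of (granter, grantee) pairs."""
    return (
        (memory_owner, recipient) in consents      # owner agreed to share
        and (recipient, memory_owner) in consents  # recipient agreed to receive
    )

# Both sides of a parent/child pairing have opted in; nobody else has.
consents = {("parent", "child_companion"), ("child_companion", "parent")}

print(can_share("parent", "child_companion", consents))  # True
print(can_share("parent", "partner", consents))          # False: no consent pair
```

Because the check requires a pair of grants rather than a single flag, withdrawing consent from either side is enough to close the boundary again.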
Why “Sovereign”: You Own the Memory
The sovereignty claim is architectural, not rhetorical. Here is what it means in concrete terms.
Export at any time. Your complete memory — all four layers — can be exported as a portable JSON file via the /api/user/data endpoint. MEOK treats this as a design principle, not a legal obligation. The export is human-readable. You can inspect exactly what your companion knows about you.
Delete completely. Full deletion is supported at the account level. A single confirmed request permanently removes every layer of your Sovereign Memory — short-term context, episodic embeddings, companion state, and shared family context — from MEOK's systems. Not archived. Not pseudonymised. Deleted.
Transfer freely. Your memory is not locked to MEOK's infrastructure. The exported format is designed to be portable. If you migrate to a self-hosted deployment, your memory travels with you. If MEOK ever shut down — something we have no plans to do — you would walk away with everything your companion ever learned about you.
Never used for training. Your memories are not used to improve MEOK's models. Not anonymised and pooled. Not included in any dataset. The pathway between your sovereign vault and any training pipeline does not exist. This is enforced at the infrastructure level, not by policy.
Memory Portability: Switch Models, Keep Your Companion
One of the most underappreciated consequences of Sovereign Memory is model portability. Most AI products treat the model and the memory as one inseparable thing. Replika is not just a database of memories — it is Replika's model, and your relationship only exists within that specific system. When the system changes, the relationship changes. You have no recourse.
MEOK separates the memory layer from the model layer completely. Your companion state and episodic memory are model-agnostic. Today, you might use Claude 3.7 because it handles nuanced emotional conversations particularly well. Tomorrow, GPT-4o might have a capability that matters to you. Next year, a specialised model trained specifically for companionship might emerge. In every case, your sovereign memory travels with you. The new model is initialised with your companion's full knowledge of you from day one.
MEOK supports model switching in production today — Claude, GPT-4o, DeepSeek, and local Ollama models — with companion state persistence across all of them. You switch the engine. The relationship continues.
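The separation of memory from model can be shown in a few lines. This is a conceptual sketch with invented field names, not MEOK's production code: the point is that the companion owns its state, and switching the model touches nothing else.

```python
class Companion:
    """Sketch: the memory layer (companion_state) belongs to the
    Companion, not to whichever model currently powers the conversation."""

    def __init__(self, model_name, companion_state):
        self.model_name = model_name            # swappable engine
        self.companion_state = companion_state  # sovereign, persistent

    def switch_model(self, new_model_name):
        # Only the engine changes; every memory layer is untouched.
        self.model_name = new_model_name

companion = Companion("claude", {"goals": ["finish the novel"], "depth": 42})
companion.switch_model("gpt-4o")

print(companion.model_name)       # gpt-4o
print(companion.companion_state)  # unchanged: the relationship continues
```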
What Sovereign Memory Makes Possible
When an AI genuinely knows you across weeks and months, the things it can do for you change category entirely. Here are three concrete examples.
Morning Briefings
Your daily brief opens with context drawn from recent conversations. If you mentioned feeling overwhelmed by a deadline two days ago, your brief acknowledges it. If you said you were going to have a difficult conversation with a colleague yesterday, your companion asks how it went.
Pattern Recognition
Weeks of conversations reveal patterns no single session can show. Your companion might notice that your anxiety spikes on Sunday evenings, that you mention sleep problems more in winter, or that your creative output correlates with your exercise routine. These are observations that only emerge from the episodic memory layer across time.
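The Sunday-evening example reduces to a simple aggregation once sessions carry structured metadata. The sketch below is illustrative only: the session dates and the "anxiety" tag are invented, and the real pattern detection runs over the episodic layer rather than a hard-coded list.

```python
from collections import Counter
from datetime import date

# Hypothetical episodic-memory metadata: (ISO date, topic tags) per session.
sessions = [
    ("2025-01-05", ["anxiety"]),   # a Sunday
    ("2025-01-07", []),            # a Tuesday
    ("2025-01-12", ["anxiety"]),   # a Sunday
    ("2025-01-19", ["anxiety"]),   # a Sunday
]

# Count which weekday anxiety mentions cluster on.
weekday_counts = Counter(
    date.fromisoformat(d).strftime("%A")
    for d, tags in sessions
    if "anxiety" in tags
)

print(weekday_counts.most_common(1))  # Sundays dominate
```

No single session contains this pattern; it only exists in the aggregate, which is exactly why it requires a persistent episodic layer.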
Guardian Alerts
MEOK's Guardian layer draws on weeks of conversational context to calibrate its alerts. An isolated message saying 'I'm fine' means one thing in a vacuum, and quite another when companion state shows you said exactly that in the three conversations preceding a previous difficult period.
GDPR and Your Data Rights
MEOK is built in the UK, governed by UK GDPR, and designed with data sovereignty as an architectural constraint — not a compliance checkbox. Every user has the right of access, the right to erasure, and the right to portability, all implemented as first-class endpoints rather than buried support tickets.
- Right of access: Full memory export as JSON via /api/user/data
- Right to erasure: Full account deletion removes every layer of Sovereign Memory permanently
- No data sales: Your memories are never sold, never used for model training, never shared with third parties
- Encryption at rest: All memory stored encrypted; MEOK cannot read your memories outside your authorised session
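The access and erasure rights above have simple semantics: export returns every layer as readable JSON, and deletion removes every layer outright. The sketch below is purely conceptual (the real vault is encrypted server-side and served via /api/user/data; the layer names here mirror the four layers described earlier but are not MEOK's actual schema).

```python
import json

class SovereignVault:
    """Conceptual sketch of export and erasure semantics."""

    def __init__(self):
        self.layers = {
            "short_term": [],
            "episodic": [],
            "companion_state": {},
            "family": [],
        }

    def export(self):
        # Right of access: every layer, as human-readable JSON.
        return json.dumps(self.layers, indent=2)

    def delete_all(self):
        # Right to erasure: every layer removed, nothing archived.
        self.layers = {}

vault = SovereignVault()
vault.layers["companion_state"]["goals"] = ["run a marathon"]

snapshot = json.loads(vault.export())
print("goals" in snapshot["companion_state"])  # True: export is inspectable

vault.delete_all()
print(vault.layers)  # {} : deleted, not pseudonymised
```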
Technical Note: Mem0, pgvector, and Companion State JSON
For readers who want to understand the implementation: Mem0 handles the memory management layer — ingesting conversations, generating embeddings, and managing the storage and retrieval lifecycle. Mem0 provides the abstraction that allows memory to be model-agnostic: it does not care which LLM you use.
pgvector is the PostgreSQL extension that enables semantic similarity search directly in the database. When your companion retrieves relevant memories, it performs a vector similarity query against your stored embeddings — finding memories that are semantically closest to the current conversation, even if the exact words differ. This is what allows retrieval by meaning rather than by keyword.
Companion state as persistent JSON is the structured data layer. Unlike raw transcripts or opaque embeddings, companion state is a human-readable JSON object encoding your companion's model of you: values, goals, fears, relationship depth, and behavioural patterns. Because it is structured and human-readable, it is fully auditable — you can inspect it in your data export — and fully portable across any system that speaks JSON.
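To make the auditability claim concrete, here is what a companion state document might look like. The field names below are illustrative, not MEOK's actual schema; the property that matters is that the structure survives a JSON round-trip unchanged, which is what makes it inspectable in an export and portable to any other system.

```python
import json

# Hypothetical companion-state document; field names are invented
# for illustration, not taken from MEOK's real schema.
companion_state = {
    "values": ["honesty", "craft"],
    "goals": [{"text": "finish the novel", "status": "in_progress"}],
    "fears": ["public speaking"],
    "relationship_depth": 42,
    "patterns": {"anxiety_peaks": "sunday_evening"},
}

# Structured and human-readable: serialising and parsing loses nothing,
# so what you export is exactly what your companion holds.
exported = json.dumps(companion_state, indent=2)
restored = json.loads(exported)

print(restored == companion_state)  # True: lossless round-trip
```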
Memory Ownership: MEOK vs ChatGPT vs Replika
| Feature | MEOK | ChatGPT | Replika |
|---|---|---|---|
| Memory persists across sessions | Yes | Partial | Yes |
| User owns the data | Yes | No | No |
| Export as JSON | Yes | Limited | No |
| Portable across model switches | Yes | No | No |
| Protected from product changes | Yes | No | No |
| GDPR full deletion guaranteed | Yes | Yes | Partial |
Frequently Asked Questions
What is Sovereign Memory?
Sovereign Memory is MEOK's four-layer persistent memory architecture that stores everything your AI companion knows about you — across short-term context, semantic episodic memory, companion state, and shared family context. Unlike standard AI memory, it is encrypted, user-controlled, and fully exportable. You own it. MEOK does not.
How is MEOK's memory different from ChatGPT Memory?
ChatGPT Memory launched in 2024 and stores memories on OpenAI's servers under OpenAI's control. You cannot switch away from OpenAI and take your memories with you. MEOK's Sovereign Memory is stored in your encrypted vault, is fully portable across AI models (Claude, GPT-4o, DeepSeek, and others), and can be exported at any time under GDPR via /api/user/data. You are never locked in.
Can I export or delete my memories?
Yes. MEOK supports full GDPR compliance. Your complete memory export is available at the /api/user/data endpoint and returns a portable JSON file of every stored memory, companion state value, and episodic embedding. Full deletion is also supported — a single request permanently removes all layers of your Sovereign Memory from MEOK's systems.
What happens to my memories if I switch AI models?
Nothing. Your memories live in your sovereign vault, not inside the AI model. When you switch from Claude to GPT-4o to DeepSeek — or to any future model MEOK supports — your companion state, episodic memory, and relationship depth all carry over. The model changes; your companion's knowledge of you does not.