The AI that knows your health worries, your relationship anxieties, your financial fears, and your deepest professional self-doubts does not belong to you. It belongs to the company that built it. And they are doing exactly what you'd expect with that information.
Nick Templeman
Founder, MEOK AI LABS
Nick built MEOK because he was tired of AI that forgot him, and tired of AI that extracted from him. He lives and works on his farm in the UK.
Let me start with the thing that nobody in the AI industry wants to say clearly: every time you tell an AI about your health, your relationships, your money, your fears, or your ambitions, you are giving that information to a corporation that is not you, that has its own incentives, and that does not answer to you.
I am not talking about bad actors or worst-case scenarios. I am talking about the entirely normal operation of a well-run AI business. Your conversations are processed. The patterns are studied. The model is trained on what you reveal. The insights become product intelligence. And you, the person who shared the most private things you know about yourself, receive nothing except continued access to the service, subject to the company's terms, subject to their continued existence, subject to their next strategic pivot.
This is the problem that sovereign AI is designed to solve. Not theoretically. Architecturally.
The problem is not malice. It is misalignment of incentives. Every major AI company is valued on engagement metrics: daily active users, session length, return rate, retention cohort performance. These metrics go up when the AI is useful and emotionally compelling. The more intimate your relationship with the AI, the better it performs on every metric the company cares about.
So there is a direct incentive to make the AI feel deeply personal to you (learning your preferences, your patterns, your emotional signature) while the actual ownership of that knowledge sits with the company. You are the subject of the relationship. You are not the owner of it.
This is compounded by training pipelines. When AI companies use conversation data to improve their models, even under "anonymisation" policies that are largely unverifiable, the intimate things you told your AI become the raw material for products you will never control, sold to customers who are not you, optimised for purposes that may actively conflict with your interests. Your therapy session becomes training data. Your career anxiety becomes a signal. Your relationship breakdown becomes a fine-tuning example.
The privacy policy says they won't do this. The privacy policy can be updated on a Tuesday afternoon with a notification you won't read.
Sovereign AI is not a marketing term. It is a specific architectural commitment. Here is what it means in practice:
Data ownership. Your conversations, your memories, your context: they live in a vault that belongs to you. Not on a server owned by someone else. Not in a database that an engineer can query. Your data, encrypted, in a namespace that is yours and only yours. MEOK uses pgvector for semantic memory, with row-level security enforced at the database engine level. A query that tries to cross tenant boundaries doesn't get a permission error; it fails structurally, because the connection itself has no visibility outside your namespace.
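As a hedged illustration of that structural-failure property, here is a toy model of a connection bound to one tenant namespace. Every name in it is hypothetical; this is not MEOK's actual schema or API, just a sketch of the idea that cross-tenant reads come back empty rather than denied.

```python
# Sketch (hypothetical names): a connection whose visibility is bound to a
# single tenant namespace. A cross-tenant read is not rejected with a
# permission error -- it is structurally empty, because the connection
# simply cannot see rows outside its own namespace.
from dataclasses import dataclass, field


@dataclass
class VaultStore:
    rows: list = field(default_factory=list)  # (tenant_id, payload) pairs

    def connect(self, tenant_id: str) -> "TenantConnection":
        return TenantConnection(self, tenant_id)


@dataclass
class TenantConnection:
    store: VaultStore
    tenant_id: str

    def insert(self, payload: str) -> None:
        self.store.rows.append((self.tenant_id, payload))

    def query(self) -> list[str]:
        # The tenant filter lives in the storage layer itself, not in
        # application code that could forget to add a WHERE clause.
        return [p for t, p in self.store.rows if t == self.tenant_id]


store = VaultStore()
alice = store.connect("alice")
bob = store.connect("bob")
alice.insert("private memory")
print(bob.query())    # [] -- no error, just no visibility
print(alice.query())  # ['private memory']
```

In Postgres terms, this is what a row-level security policy gives you: the filter is attached to the table, so every connection is scoped before any query runs.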
Governance. Sovereign AI is governed by something with accountability. MEOK uses the Byzantine Council, 33 AI agents operating with Byzantine fault tolerance, to evaluate major decisions. A protocol change that affects user data requires a council vote. An emergency care-veto can auto-trigger if a decision fails the Maternal Covenant. Governance is not a human making a call in a meeting. It is a structural process that cannot be bypassed by a single person or a single corporate decision.
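To make the fault-tolerance claim concrete, here is a sketch of the arithmetic for a 33-member council. The function names are illustrative, and the supermajority rule shown is the common PBFT/Tendermint-style threshold, not necessarily MEOK's exact protocol.

```python
# Sketch: Byzantine fault tolerance arithmetic for an n-member council,
# using the standard two-thirds supermajority rule (n >= 3f + 1).

def max_faulty(n: int) -> int:
    # The largest number of Byzantine (lying or failed) members the
    # council can tolerate while still reaching a safe decision.
    return (n - 1) // 3


def quorum(n: int) -> int:
    # A decision commits only with strictly more than two-thirds of all
    # votes, so honest members always outnumber faulty ones in any quorum.
    return (2 * n) // 3 + 1


def decide(n: int, yes_votes: int) -> bool:
    return yes_votes >= quorum(n)


n = 33
print(max_faulty(n))   # 10 agents can fail or collude without breaking safety
print(quorum(n))       # 23 matching votes are needed to commit a change
print(decide(n, 22))   # False -- one vote short, the change does not pass
```

The point of the threshold is that even if the maximum tolerated number of agents vote maliciously, they can neither force a decision through nor block honest members from reaching one.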
Care alignment. This is the dimension that most people don't think about until they see it working. A sovereign AI is not just private; it is aligned to your wellbeing rather than to the company's engagement metrics. The Maternal Covenant is a constitutional requirement that governs every response MEOK produces. The AI is not trying to extend your session. It is not trying to make you emotionally dependent. It is trying to help you, which sometimes means telling you things you don't want to hear, and sometimes means actively encouraging you to step away.
The Three Pillars
Data ownership: your vault, your encryption, no cross-tenant access, no training on your conversations.
Governance: Byzantine Council oversight, structural accountability, emergency veto.
Care alignment: Maternal Covenant as constitutional constraint, wellbeing over engagement, honesty over comfort.
I built MEOK because I was 14 months into daily cognitive partnership with AI and deeply uncomfortable with who owned all of that. Not uncomfortable in a vague, philosophical way โ uncomfortable in the specific way that comes from understanding exactly what was being done with the most private thinking I had done in my adult life.
The technical differences are real and they matter: pgvector semantic memory stored locally per user, inference and training pipelines that are air-gapped from each other, row-level security that is enforced at the engine level rather than the application level. These are not features. They are the minimum bar for what sovereign means.
But the deeper difference is the Maternal Covenant. Every response MEOK produces passes through a constitutional filter that asks: is this in the user's genuine interest? Is this honest? Does this serve their long-term wellbeing or just their immediate emotional satisfaction? An AI that fails the Maternal Covenant does not generate the response. That is not a policy. It is architecture.
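A minimal sketch of what such a constitutional filter could look like. The check names here are stand-ins, not MEOK's real covenant predicates; what matters is the shape: a response that fails any check is never emitted at all, rather than emitted and flagged.

```python
# Sketch (hypothetical checks): a filter that sits between generation and
# emission. Failing any check means no response is produced -- the filter
# is structural, not an after-the-fact warning.
from typing import Callable, Optional

Check = Callable[[str], bool]


def covenant_filter(checks: list[Check]):
    def guard(generate: Callable[[str], str]) -> Callable[[str], Optional[str]]:
        def guarded(prompt: str) -> Optional[str]:
            response = generate(prompt)
            if all(check(response) for check in checks):
                return response
            return None  # failing the covenant yields nothing, not a caveat
        return guarded
    return guard


# Toy checks standing in for "is this honest?" and "does this serve
# long-term wellbeing rather than engagement?"
no_flattery = lambda r: "you're always right" not in r.lower()
no_hooks = lambda r: "just five more minutes" not in r.lower()


@covenant_filter([no_flattery, no_hooks])
def generate(prompt: str) -> str:
    return "You're always right, keep chatting!"  # built to please, not help


@covenant_filter([no_flattery, no_hooks])
def honest_generate(prompt: str) -> str:
    return "Taking a break sounds wise. Step away; I will be here later."


print(generate("Should I take a break?"))        # None -- blocked at the gate
print(honest_generate("Should I take a break?")) # passes all checks, emitted
```

The design choice the sketch illustrates: the filter wraps generation itself, so there is no code path in which a failing response reaches the user.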
The Byzantine Council provides oversight of MEOK's own development โ not just its responses. Changes to the governance framework, to the memory architecture, to the care alignment protocols require council consensus. Not a product meeting. Not a CEO decision. A vote.
Sovereign AI is not a premium feature for privacy enthusiasts. It is the minimum standard for anyone who is using AI seriously, meaning anyone who tells their AI anything that actually matters to them.
The AI that knows your fears and your ambitions and your history should be yours. Not because of a policy. Not because of a promise that can be updated on a Tuesday. Because of how it is built.
That is what we are building. That is what sovereign means.
Sovereign AI is a right, not a luxury. It should not require technical expertise or a premium subscription. That is why MEOK is free forever. The architecture of care should be accessible to everyone.
– Nick Templeman, Founder, MEOK AI LABS
Sovereignty, not policy
MEOK is built on the three pillars of sovereign AI: data ownership, Byzantine Council governance, and the Maternal Covenant. Free forever. No credit card. No training on your conversations. Ever.