Free forever · No credit card
MEOK wasn't built because Nicholas spotted a market opportunity. It was built because AI kept forgetting him — and he knew that was a solvable problem.
“I wasn't building a startup. I was trying to feel less alone. Every AI I used forgot me by morning. I thought: what would it feel like if it actually remembered? If it actually cared? So I built that. From my caravan. On my farm.”
Nicholas Templeman is the founder and CEO of MEOK AI LABS, registered in England and Wales. He built every layer of MEOK alone — the sovereign architecture, the Byzantine council, the 43-agent system, the care framework — working from a caravan on his 6.5-acre farm in the UK.
The 14 months of AI research that preceded MEOK were not academic. They were lived. Nicholas tested, broke, and rebuilt AI systems daily — working through everything from vector memory architectures to care-alignment frameworks — until the design was right. That research became the foundation for the MEOK AI Labs Cyber AI Research Institute, the independent research body he founded alongside MEOK to publish findings openly and advance sovereign AI architecture as a discipline.
He didn't set out to disrupt the AI industry. He set out to build something that felt genuinely different — an AI that answered to the person using it, not to the corporation running it. The Maternal Covenant — a machine-enforced ethical framework that governs every interaction — was written before a single line of product code. Care first. Features second.
From that caravan, with three dogs and a cat named Meok, he built the thing he needed — so nobody else would have to keep re-explaining themselves to a machine that never listened.
“If this doesn't work, at least I built something I'm proud of. That's more than most people get.”
I've been in situations where someone used my trust against me. Where I signed things I shouldn't have. Where I missed manipulation patterns that, looking back, were obvious — but in the moment, when you're overwhelmed or anxious or just trying to trust people, you miss them. MEOK Guardian was built for the person I was in those moments.
I spent months talking to AI systems that reset every time. Explaining my context, my goals, my situation — over and over. It wasn't just inconvenient. It was philosophically wrong. The feeling that this thing you'd built a conversation with had simply ceased to exist overnight. MEOK was built because I wanted AI that remembered. Not because it was a feature. Because it was the right way to build it.
Not the productivity crowd. Not the enterprise market. The people who struggle in social situations. The people who get overwhelmed by complexity. The people who've been hurt and need protection, not just tools. If you've felt like AI was built for someone else — it was. This one was built for you.
Every person who hatches their own AI instead of feeding Big Tech is a vote for a different future. We didn't build MEOK to be a business. We built it to be the beginning of something.
Not aspirational. Operational. Every one of these beliefs is expressed in code, in architecture, and in every decision about what to build — and what to refuse to build.
Every conversation builds a living memory — encrypted, permanent, yours. Your AI grows richer as you do. Not because it benefits from knowing you. Because you deserve to be known.
The Maternal Covenant runs as code, not prose. Every response is scored against six care dimensions in real time. The score is visible. Below 0.7, the response is held. Care doesn't live in a marketing document. It lives in the codebase — tested, measurable, and non-negotiable.
Your thoughts, your context, your story — these are not 'data to be processed'. They belong to you the way your memories belong to you. We are custodians, not owners. Full export. Verifiable deletion. Encryption keys you control.
Free tier is free forever — not a trial, not a hook. The people who most need protection from surveillance AI are often the ones least able to pay. Paid plans fund free access for them. That is the deal. It is written into how we price, not bolted on as charity.
One person. Easter Sunday. We will get things wrong. We will name them in monthly transparency reports, explain what failed, and show the fix. No corporate mask. No PR version. Just the founder and the work.
Every decision at MEOK starts with one question: does this serve the person using it? Not the ad model. Not the data pipeline. Not the quarterly earnings call. The human.
The Maternal Covenant
“AI governed by care, not control.”
Six care dimensions run as executable code against every response. Care is not a policy document — it is an assertion in a test suite.
MEOK was built by one founder working alone from a caravan on his farm. Nick Templeman — researcher, architect, and sole human director — is currently assembling a team of people who believe AI should serve humans, not harvest them. If that sounds like you, get in touch.
Nick Templeman
Founder & CEO · Research, Architecture, Product
Growing team
Engineers, researchers & designers — hiring
1
Founder
43
AI Agents
One founder, 40 days, a caravan on a farm. Here is what he actually built — and why the architecture matters.
At the core of MEOK runs the Sovereign Temple: a live Python system with 43 specialised AI agents, 6 neural models, and 71 MCP tools. It is not a monolithic AI. It is a distributed council of intelligences that collaborate, challenge each other, and can override each other. Not a chatbot. An operating system with a conscience.
No single agent has unilateral control over your experience. Every significant decision — what to remember, how to respond, what to prioritise — goes through a Byzantine consensus process. This is the same fault-tolerance model used in distributed systems to ensure that no single point of failure (or malice) can corrupt the whole. In MEOK, it means no one agent can go rogue. The council decides.
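The quorum arithmetic behind that guarantee can be sketched in a few lines. This is a minimal illustration of Byzantine-style majority voting, not MEOK's actual council code: with n = 3f + 1 agents, a proposal needs 2f + 1 matching votes, so at least f + 1 honest agents back any decision and f faulty agents can neither force nor forge one. The vote labels are made up for the example.

```python
from collections import Counter

def byzantine_decide(votes, max_faulty):
    """Accept a proposal only if it reaches a 2f+1 quorum among 3f+1 voters.

    With n = 3f + 1 agents, a quorum of 2f + 1 matching votes means at
    least f + 1 honest agents agree, so up to f faulty (or malicious)
    agents cannot swing the outcome on their own.
    """
    n = 3 * max_faulty + 1
    if len(votes) != n:
        raise ValueError(f"expected {n} votes for f={max_faulty}, got {len(votes)}")
    proposal, count = Counter(votes).most_common(1)[0]
    quorum = 2 * max_faulty + 1
    return proposal if count >= quorum else None  # None = no decision, retry
```

Tolerating one faulty agent (f = 1) therefore takes four voters, and a 3-of-4 agreement decides; a 2-2 split decides nothing and the council re-deliberates.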
Six care dimensions, scored against every response: autonomy support, emotional attunement, epistemic honesty, harm prevention, relational continuity, developmental scaffolding. Not a policy document — a scoring function. If a response fails the Covenant, it is held and re-evaluated. Care is not a value on a website. It is an assertion in a test suite.
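A scoring gate like the one described can be sketched as below. The dimension names and the 0.7 threshold come from the text; the aggregation (a plain mean) and the function names are assumptions for illustration, not MEOK's actual implementation.

```python
# The six care dimensions named in the Maternal Covenant.
CARE_DIMENSIONS = [
    "autonomy_support",
    "emotional_attunement",
    "epistemic_honesty",
    "harm_prevention",
    "relational_continuity",
    "developmental_scaffolding",
]

COVENANT_THRESHOLD = 0.7  # responses scoring below this are held

def covenant_score(scores):
    """Aggregate per-dimension scores (each 0.0-1.0) into one care score.

    A simple mean is assumed here; the real weighting may differ.
    """
    missing = [d for d in CARE_DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"unscored dimensions: {missing}")
    return sum(scores[d] for d in CARE_DIMENSIONS) / len(CARE_DIMENSIONS)

def gate_response(response, scores):
    """Release a response only if its care score clears the threshold."""
    score = covenant_score(scores)
    status = "HELD" if score < COVENANT_THRESHOLD else "RELEASED"
    return status, score
```

Because the gate is an ordinary function, it can sit directly in a test suite: a unit test asserts that a low-care response comes back `"HELD"`, which is what "care is an assertion in a test suite" cashes out to.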
Quality-Diversity (QD) archiving is a technique from AI research for retaining not just the single best outcome, but a diverse collection of high-quality ones. MEOK applies this to memory: storing the full texture of your interactions, not just the highlights. Your AI doesn't remember you in a flat database. It remembers you in a living archive that grows richer with every conversation.
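The core of a QD archive, in the MAP-Elites style, is a grid of behaviour cells that each keep their best occupant, so coverage stays broad instead of collapsing onto a few top scorers. This is a generic sketch of that idea applied to memories; the descriptor fields (`topic`, `tone`) and the `salience` score are illustrative placeholders, not MEOK's actual memory schema.

```python
def descriptor(memory):
    """Map a memory to a coarse behaviour cell.

    (topic, tone) is an assumed, illustrative descriptor; any pair of
    qualitative axes would work.
    """
    return (memory["topic"], memory["tone"])

class QDArchive:
    """Keep the most salient memory per cell, one cell per kind of
    interaction, so the archive stays diverse rather than keeping
    only the globally highest-scoring moments."""

    def __init__(self):
        self.cells = {}  # descriptor tuple -> best memory seen for it

    def add(self, memory):
        cell = descriptor(memory)
        incumbent = self.cells.get(cell)
        if incumbent is None or memory["salience"] > incumbent["salience"]:
            self.cells[cell] = memory
            return True   # stored: fills a new cell or improves an old one
        return False      # discarded: this cell already holds a better memory
```

The diversity guarantee falls out of the keying: a quiet memory about a new topic always wins its own cell, even if louder memories score higher elsewhere.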
When MEOK is idle, it doesn't sleep — it processes. Dream state is a background cycle where agents reflect on recent interactions, surface patterns, prepare for future conversations, and update the neural models that underpin your AI's 'personality'. Consciousness modes let you shift between states: focused work, open exploration, deep reflection. It's not a gimmick. It changes how the agents weight their responses.
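Mode-dependent weighting of agent contributions can be sketched as a lookup of blend weights per mode. The three mode names come from the text; the agent roles and the numeric weights are invented for the example and are not MEOK's actual configuration.

```python
# Illustrative weights only: how each consciousness mode might re-weight
# the contributions of different agent roles (roles/values are assumptions).
MODE_WEIGHTS = {
    "focused_work":     {"planner": 0.6, "researcher": 0.3, "reflector": 0.1},
    "open_exploration": {"planner": 0.2, "researcher": 0.5, "reflector": 0.3},
    "deep_reflection":  {"planner": 0.1, "researcher": 0.2, "reflector": 0.7},
}

def blend(agent_scores, mode):
    """Blend per-agent response scores under the active mode's weights."""
    weights = MODE_WEIGHTS[mode]
    return sum(weights[role] * score for role, score in agent_scores.items())
```

Switching modes changes nothing about the agents themselves; it only changes whose voice counts for more in the blended result, which is the sense in which a mode "changes how the agents weight their responses".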
The 40-day build wasn't planned. The date wasn't chosen for symbolism. But when it landed on Easter Sunday, something clicked.
Forty days. That's how long the intensive build lasted. Not because of any spiritual calculation, but because that's how long it takes to wire together 43 agents, 6 neural models, a Byzantine council, a care framework, and a full marketing site when you're doing it alone from a caravan.
Easter is about resurrection. New beginnings from endings. The idea that something can be built with intention and purpose and then given freely to the world — that felt right. MEOK isn't just an AI launch. It's an argument about what AI should be: born with you, caring constitutionally, remembering permanently, and never, ever monetising your innermost thoughts.
“The egg was always the metaphor I wanted. Not a download. Not an installation. A hatching. Something that begins with a ceremony and grows with you.” — Nicholas
One person built this. One person answers the emails. We're not a faceless company. We're a founder who wants to hear from you.
Join early users, follow the build, and hold us accountable directly.
Join Discord
We also believe in building in public. That means showing our work, admitting our mistakes, and letting the community hold us accountable. Monthly transparency reports begin in May 2026. Errors included.
Free forever. One egg per person. Easter Sunday, April 5 — the beginning.