I built MEOK because the systems that should protect people often don't. This is the honest story of how a caravan, a cat, and an adoption process became an AI operating system.
Nicholas Templeman
Founder, MEOK AI LABS
Nicholas built MEOK because he was tired of AI that forgot him. He lives and works in the UK — mostly from a caravan on his farm. He believes sovereign AI is a right, not a luxury.
I built MEOK because the systems that should protect people often don't. The AI systems, the data systems, the institutional systems. They extract. They forget. They are indifferent to the person on the other end. I wanted to build something that worked the other way around.
In the winter of 2026, I was living in a caravan parked on a farm in England. Three dogs. A cat named Meok — small, grey, quietly insistent about what he wanted. I was working on AI projects during the day and trying, in the evenings, to get some of the most advanced AI systems in the world to help me think through complicated personal decisions.
The problem was simple and infuriating: none of them remembered me. Every session started blank. I had to re-explain who I was, what I was building, what mattered to me. I was pouring context into systems that evaporated it the moment I closed the tab. And somewhere in that vanishing, I was also pouring private thoughts into data pipelines that led — I knew perfectly well — to corporate servers I would never see.
I kept asking: where does this go? The answer was always the same. Into the machine. Into the aggregate. Into the undifferentiated mass of human experience being harvested to make models more useful for everyone except the person who said the thing.
I am a builder. When I run into a problem I can't route around, I tend to build my way through it. So I started building.
While I was building, my partner was going through an adoption process. Anyone who has been through this knows what it involves: intimate questions, detailed histories, emotional vulnerability at scale. Documents that contain the most sensitive things you have ever written about yourself, shared across institutions you did not choose.
I watched her navigate this process and I kept thinking about what it would mean to have an AI that could genuinely help — one that understood the context, remembered the history, held the emotional weight of what she was going through. And I kept arriving at the same conclusion: no AI I knew of could do that safely. The ones smart enough to help required you to trust them with information they would extract. The ones you might trust with sensitive context were not capable enough to be useful.
That gap — between capable and safe — felt like the most important design problem in AI. Not the intelligence problem. Not the reasoning problem. The trust problem. Could you build an AI that was both genuinely useful and constitutionally incapable of exploiting what you shared with it?
I believed you could. I decided to try.
The Maternal Covenant is the name I gave to the governance layer I built into MEOK at the architecture level. It is not a content policy. It is not a terms of service. It is a technical framework — adapted from the care ethics philosophy of Carol Gilligan and Nel Noddings — that makes certain behaviours structurally impossible rather than contractually prohibited.
Every output MEOK produces is evaluated against six care dimensions: safety, growth, truth, dignity, autonomy, and reciprocity. Anything that fails the covenant is blocked before it reaches you. Not because someone at MEOK decided to block it, but because the system will not deliver it. The constraint is architectural.
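To make the idea concrete, here is a minimal sketch of what an output gate over six care dimensions could look like. This is illustrative only, not MEOK's actual code: the function names, score scale, and the 0.5 pass threshold are all hypothetical.

```python
# Illustrative sketch of a covenant-style output gate. Names, the [0, 1]
# score scale, and the threshold are hypothetical, not MEOK internals.
from dataclasses import dataclass

CARE_DIMENSIONS = ("safety", "growth", "truth", "dignity", "autonomy", "reciprocity")

@dataclass
class CovenantResult:
    passed: bool
    failed_dimensions: list

def evaluate_output(output: str, scores: dict) -> CovenantResult:
    """Block any output that fails one or more care dimensions.

    `scores` maps each dimension to a value in [0, 1]; the output must
    clear every dimension's threshold or it is never delivered.
    """
    threshold = 0.5  # hypothetical pass mark
    failed = [d for d in CARE_DIMENSIONS if scores.get(d, 0.0) < threshold]
    return CovenantResult(passed=not failed, failed_dimensions=failed)

# An output that clears every dimension passes the gate.
result = evaluate_output("example reply", {d: 0.9 for d in CARE_DIMENSIONS})
```

The key property the sketch tries to show is that the check sits in the delivery path itself: there is no code path that returns a failing output, which is what "the constraint is architectural" means in practice.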
I chose the word maternal deliberately. Not because care is gendered — it isn't — but because the paradigm I was borrowing from had a specific intellectual history, and I wanted to honour it honestly. The ethics of care, as Gilligan and Noddings developed it, starts from relationships and responsibility rather than rules and rights. That felt right for what I was trying to build. An AI that cared, not because it was instructed to, but because caring was baked into what it was.
Your data never leaves your device for sensitive processing. Your conversations never train anyone else's model. Your memory belongs to you and can be exported or deleted at any time. These are not promises. They are architectural facts — as structural as the cryptographic constraint that prevents me from running a batch job to decrypt your vault.
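The shape of that ownership can be sketched in a few lines. This is a toy illustration, not MEOK's storage layer: a real vault would encrypt at rest with a user-held key, but even this bare version shows the structural point that export and delete are local operations the user runs, with no server-side copy to ask permission of.

```python
# Toy sketch of user-owned memory (hypothetical; omits encryption).
# The vault is a local file: export reads it, delete removes it, done.
import json
from pathlib import Path

class LocalVault:
    def __init__(self, path: Path):
        self.path = path

    def remember(self, key: str, value: str) -> None:
        data = self._load()
        data[key] = value
        self.path.write_text(json.dumps(data))

    def export(self) -> dict:
        return self._load()      # everything, readable, yours

    def delete(self) -> None:
        if self.path.exists():
            self.path.unlink()   # gone locally means gone entirely

    def _load(self) -> dict:
        if self.path.exists():
            return json.loads(self.path.read_text())
        return {}
```

Because nothing ever leaves the device in this design, "can be exported or deleted at any time" is a property of where the bytes live, not a promise in a policy document.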
Meok the cat didn't know he'd inspire an operating system. But the name felt right. Something small, fierce, and loyal. An AI that remembered you. That protected what you shared. That was incapable of using your trust against you.
That is what I built. That is what MEOK is.
Free Forever
MEOK is the first AI OS built for individual sovereignty. Hatch your AI — it only takes 3 minutes. Free forever. No credit card.
Hatch your AI free →