Rather than burden this post with a mass of hyperlinks, I have created a list of sources that might interest you at the end…
One of the things I have discovered in my writing is that once you start, it’s actually challenging to stop. In many ways, the piece is never completed. Pushing the publish button provides a pause, but it leaves behind a set of question marks that follow me around like a stray cat.
Last week’s post was a good example. I was pretty happy with it at the time, but one of those question marks just kept niggling me. When there’s so much going on, it’s easy to find ourselves gathering around a particular topic. It might be the nature of AI, or the impact it’s going to have on our jobs. It might be our geopolitics. Or climate change. Or any one of a range of topics from an increasingly long list. The question mark was asking me what joins them all together and how we deal with that. The word I settled on in response, for now at least, is coherence.
Coherence in the sense of the degree to which the parts of a system move in mutual rhythm, where structure, purpose, and energy align so that each element supports the vitality of the whole. It’s less about uniformity and more about resonance: every part sensing and adjusting to every other to sustain the system’s overall health.
A lack of connection to something larger with meaning for us leaves us feeling unsettled.
Making enough sense of things to operate day to day is stressful - we find ourselves on the hamster wheel, but unlike the hamster, we are aware of the fact. We know what we are doing is harmful, but we cannot stop.
On Our Growing Incoherence
My hypothesis centres on our economic relationship with technology. We seem always to be running ahead of it, hoping it will catch up with what we want it to be.
As I’m writing this, technology underpins just about all the growth in the US stock market, not because of the tangible results it has delivered, but because of the excitement of the stories that surround it, of what it might do. It seems a little like having a bright child whom we then burden with all sorts of expectations about which college they will attend, what they will achieve and how proud we will be, before they have even learned to walk, let alone had a say in the matter.
We’re generating tremendous excitement around AI, but the reality is that it only has access to a fraction of the knowledge we have gathered as humanity. Despite AI’s promise, large language models have profound knowledge gaps. The internet, AI’s primary training source, is dominated by English and Western perspectives, with languages like Hindi and Tamil severely underrepresented despite having hundreds of millions of speakers. This digital divide excludes vast bodies of Indigenous knowledge: traditional medicine, ecological practices, sustainable architecture, and local wisdom passed down orally for generations.
AI systems amplify these biases through “mode amplification,” overproducing dominant ideas while marginalising alternative knowledge. As AI becomes our primary information source, we risk losing irreplaceable human understanding, which could prove essential for addressing the complex changes we are struggling to understand.
At the same time, we’re killing the curiosity that could trigger that understanding. Across the UK, our children are being shaped by an early years curriculum that is being tightened, with knowledge sequenced down to the youngest ages in the name of readiness and rigour. But in doing so, we risk trading away curiosity, the native engine of learning. When discovery becomes delivery and every question has a prescribed answer, children learn to comply rather than to wonder. At a time when AI can retrieve almost any fact, the human advantage lies not in recall but in curiosity; in the capacity to notice, connect and explore what machines cannot. The danger isn’t ignorance; it’s indifference. We should be cultivating curiosity as our most renewable resource, not standardising it out of the system.
Then there is our relationship with performance. We obsess about it at the cost of everything else that might provide us with a longer-term future. Engineer Stephan Joppich relates a story about his child that mirrors my previous point. The heart of Joppich’s piece is the liberation found in realising you don’t have to. He tells how genuine interest turns to resistance the moment curiosity becomes obligation. When we frame a task as compulsory — dissect every page, leave the playground, do it “because we must” — the spark dies. Yet once the pressure lifts, we often return to the same activity freely, even joyfully.
Obligation kills attention; choice rekindles it.
His essay argues that autonomy restores meaning: we are free to act or not act, provided we accept the consequences. It’s a gentle Stoic correction; freedom isn’t doing whatever you want, it’s acting intentionally. When education turns exploration into duty, curiosity withers. But when we reintroduce agency, even the simple permission not to, curiosity naturally reawakens.
Finally (for now), it takes me to the issue of our personal capacity, and how easily it can be saturated. We have developed a culture, backed by technology, that measures us at work to the point of exhaustion. We know working at capacity drains us. We still do it anyway. In doing so, we’ve come to accept that people are disposable within the system, replaceable either by other, cheaper people or by technology. That is a pathology. Joe Procopio paints a clear picture, highlighting that we are spending $100bn a year on “wellness” to moderate the damage being caused by our $1.5 trillion spend on AI.
We are trying to solve an addiction problem by providing employees with clean needles. We fail to deal with root causes and make ourselves feel better through virtue signalling. It is a perfect example of incoherence, the quiet fracture between what a system claims to value and what it truly rewards. The space where curiosity gives way to compliance and purpose begins to unravel.
Incoherence rarely arrives with noise or crisis; it seeps in through well-intentioned rules and polished metrics, while we remain distracted by productivity and efficiency until the living pulse of learning or work is replaced by performance as the dominant metric. It’s the moment a structure becomes more important than the spirit it was built to serve.
Recovering Coherence
The journey from incoherence to coherence isn’t a quick fix; it’s a slow return.
Somewhere along the way, many of us lose the thread between what we do and why it matters. Systems pull us toward performance, obligation, and noise, until our inner rhythm no longer matches the one around us.
Recovering coherence means remembering: bringing our attention, curiosity, and purpose back into conversation. Antonovsky called it a sense of coherence: life understood, shaped, and made meaningful again. It asks that we choose, rather than comply; that we craft our work, not just complete it.
For me, it means setting boundaries and respecting our attention and the reality of limited capacity. Finding ways to focus our attention on what matters to us. Doing it in company where we’re valued for who we are, not what our job title is. Where what we might create is more important than the sheer quantum of what we might achieve in monetary terms.
In a world of noise, we need sanctuary, a temporary place of safety to find signal in the complexity of our lives.
This is why I have created the Athanor. It’s an experiment all of its own and an invitation to experiment. An invitation to step into a space where heat and patience turn the scattered back into whole, where the fragments of thought and effort are quietly recombined into something that makes sense again, and perhaps even glows.
Sources
The Knowledge of AI. Aeon Magazine
Stephan Joppich on Medium (spotted in Medium Newsletter)
AI is Crushing Mental Health. Joe Procopio