It feels as though the walls are closing in as small-minded populism and rampant greed meet. Curiosity and generosity have been submerged beneath fear-based dogma, in a geopolitics defined by bullying, uncertainty, and tacky temporary deals.
At such a febrile time, identifying the edges, knowing which side of a boundary we stand on, is a challenge.
There is something existential about the Atlantic Coast. Rugged, beautiful areas where old land meets even older sea. At the start of the tourist season, it was an immense and unexpected pleasure to find a deserted beach on which to walk for a few hours.
The gap in the centre of the picture is the Doombar, a sandbar steeped in myth and legend as an ever-shifting boundary. It perfectly captures Nick Hayes' insight about nature's permeable boundaries: a shifting threshold that is neither fully land nor sea, but something altogether more alive.
“There are boundaries in nature. There are rivers, forests, escarpments, ravines and mountain ranges; there are cellulose walls. But these boundaries are in fact areas of transaction, semi-permeable membranes. The notion that a perimeter should be impenetrable is a human contrivance alone.”
Nick Hayes, The Book of Trespass: Crossing the Lines that Divide Us
The sandbank transforms with each tide and storm, sometimes protecting Padstow's harbour from Atlantic fury, sometimes claiming ships that misjudge its ever-changing bathymetry - the unseen topography beneath the waves. It plays to Hayes' deeper truth: our human desire for fixed, impenetrable boundaries inevitably fails. The folklore of the Doombar acknowledges that attempting to draw a clean line between human and nature, safety and danger, creates not clarity but catastrophe.
Just as the Doombar's sand shifts because it exists in that space where river meets ocean, our boundaries shift when ambition and a search for certainty encounter forces beyond control.
Convention has it that we look at these forces from the perspective of the organisation. I am starting to question that. The relationship between organisation, employee, and community is decaying. Our economic culture encourages organisations to become little more than tradeable containers of capital, with little connection to their employees and communities.
For communities to flourish, our focus must shift from profit to people. If our economies are oceans, communities are the economic equivalent of the Doombar; they can protect or they can confound. Unless organisations learn to respect and navigate them, they will come to grief, taking us with them.
As individuals, committing to large organisations is a risky strategy when there is so little mutuality. It becomes increasingly so as an obsession with AI, fuelled by unimaginable amounts of investment that require returns, blinds senior managers to the perils of the shifting patterns beneath the waves of the markets. They seem to be using maps defined by cold data, without the instincts for spotting subtle changes that are the hallmark of skilled navigators.
It raises the question: where do we find security in these times?
Learning from Limpets and Barnacles
Walking along a deserted beach offered time to notice and space for reflection. It made me wonder why there were such communities of limpets and barnacles in some places, and not others. Not being a marine biologist, I did a little digging and stumbled on a metaphor.
They don’t settle just anywhere—they cluster where the conditions are right: on rough, stable surfaces with just enough wave action to bring food, but not so much as to tear them away. They favour spots in the middle ground—neither too dry nor too submerged—and are drawn to settle near others, following subtle chemical cues left behind. Some rocks remain bare, not because they’re unvisited, but because they’re too smooth, too hostile, or already claimed by competing life. Barnacles remind us that communities don’t form by chance—they take hold where conditions allow, where signals say “you belong,” and where others have shown it’s possible to stay.
In the shifting sands of the economy, perhaps that’s the message for us: “to take hold where conditions allow, where signals say 'you belong,' and where others have shown it’s possible to stay.”
When organisations have become too smooth, too hostile, or have been claimed by the competing forces of technology and capital, the edges become as sharply defined as those on the rocks. Some organisations may offer a place to belong, with strong cultures and enough resources to stay secure when the storms arrive, but many do not.
It’s difficult, if not impossible, to predict what will happen as the current storm eases, as investors either celebrate or retreat to lick their wounds, and organisations that believed human skills are largely replaceable discover, to their cost, the critical ones that are not.
Inside a Community, Outside the Walls
As I wandered along the Cove, I wondered what the communities where we might belong could look like. For some, there will be those organisations with the cultures and resources that make committing to them an acceptable risk, but what about the rest of us? What will we cling to?
So, time for some speculative questions:
Let’s assume for a moment that AI fulfils much of what it promises when it comes to research, analysis and coding. Assume it handles the automated, high-cognitive-load tasks that typify the professions better than we do, and that it designs, tests, and iterates models at lightning speed. If that were the case, would it hollow out much of what our existing organisations have spent so much time doing: making humans efficient and productive while remaining largely blind to what makes them valuable?
As humans become less valued for technical skills, who takes responsibility for interpretation, ethics, and narrative? Who are the priests of a data-rich world, the ones who ask: “What does this mean for us?”
As AI handles repetitive tasks, is there an opportunity for independent professionals, small collectives, and networks to thrive? Might the future belong to those who can craft a unique proposition by combining AI’s power with human empathy and creativity?
Will AI’s “automated innovation” produce a sea of generic output? If it does, will those who can blend craft and character, offering something profoundly human and non-scalable, stand out?
Suppose disruptive changes create a world where old square pegs don’t fit. Will there be an opportunity for individuals to be the bridge-builders, helping others navigate from the known to the new using emotional intelligence and coalition-building, and other qualities that AI cannot replicate?
What happens when trust becomes a scarce commodity? Will individuals who can act as trustworthy curators, guiding others on what is useful and ethical, generate real value?
When we talk of “artificial superintelligence,” we are flirting with a peculiar paradox: a world where machines innovate and humans are left to interpret, or worse, observe. The more autonomous AI becomes, the more urgent it is to ask: what, then, becomes of us?
There is always a moment, after the storm of a great technological shift, when we realise the noise has drowned out something more important.
Ourselves. But the story need not end there. We must learn to think differently — to embrace ambiguity and cultivate lives beyond conventional structures. Now, more than ever, that call rings true.
Finding a Rock To Cling To
I can identify with barnacles - clinging on to something as the tides of change wash over me. We are all different, and cling to different things. Good writing by thoughtful people provides an anchor for me.
One such rock has been the wisdom of Charles Handy. He had a remarkable ability to see through the hype of change to the human essentials: a philosopher of business who could make business seem more human. His many books were companions through the changes brought about as the internet found its way into our lives, and his wisdom remains as valid now as it was then.
His book “The Second Curve”, published in 2015, less than a decade before his death in 2024, brought together his life’s work. He wrote that the “first curve”, the period in which businesses and individuals alike make their mark by optimising efficiency and maximising productivity, belongs increasingly to the machines. But the “second curve”, the human one, where wisdom provides direction for knowledge, centres on meaning, context, and connection. AI may be brilliant at discovering new model architectures, but it cannot ask why those architectures matter to society, which ones we should trust, or how they change what it means to be a teacher, a parent, or a citizen. That is where we step in — not to compete, but to complement. When the hype subsides, the opportunity for individuals will be to reclaim the things AI cannot do: to care, to curate, to convene, and to choose.
He wrote about the “doughnut principle” to describe organisations, the hole in the middle representing the space we leave for trust, judgement, and personal responsibility. With AI filling in more of the “outer ring” of rules, routines, and reports, the hole becomes ever more important. The edge of the doughnut is where AI excels. The centre is where we must now lead.
I think he would say that we should not be frightened by AI’s brilliance. Rather, we should see it as the signal that we must build portfolio lives, not waiting for one job or one identity to define us, but composing a life of multiple roles: advisor, learner, creator, citizen. AI may write code, but it cannot write our story. It cannot decide what kind of life is worth living, whose voice needs to be heard, or how to help a neighbour who falls ill. These are not bugs in the machine. They are human callings.
As the frenzy around AI cools, we will need storytellers, coaches, ethicists, carers, and convenors. We will need people who can spot the round holes emerging in our society and help others reshape their square-peg identities to fit them.
I think he would say that if we have learned anything from the past, it is that the greatest danger of efficiency is that it blinds us to effectiveness. It is possible to build a system that runs smoothly but means nothing — that solves technical problems while deepening moral ones.
The opportunity for us is to reclaim the moral imagination. Rather than ask “What can AI do?” we must ask “What should we do, given what it can do?”
Those who thrive will not be the fastest or the most technical, but the most human. They will listen carefully, act wisely, and build relationships of trust across differences. They will operate in “the space between people,” because that space, not the code or the chip, is where society is shaped.
Finding our Space
In the confusion we live in, we need a rock to cling to, a space to explore and become who we want to be, to express and live the values that matter to us.
As the skills we were taught would earn us a living are ceded to technology, our task is to find the boundary between what it does well and what we do well.
The space we need is unlikely to be found inside organisations, because they are so focused on short-term performance. Equally, finding it in the vast areas outside the walls can be like looking for a needle in a haystack. We need to start somewhere.
I’m not sure it matters where.
“One day Alice came to a fork in the road and saw a Cheshire cat in a tree. ‘Which road do I take?’ she asked. ‘Where do you want to go?’ was his response. ‘I don’t know,’ Alice answered. ‘Then,’ said the cat, ‘it doesn’t matter.’”
paraphrased from Lewis Carroll, Alice’s Adventures in Wonderland
We have to start, knowing that what we need lies less in what we know, and more in what we are capable of seeing, given the opportunity. What lies at the conjunction of the answers to the speculative questions asked earlier?
I think there is something in the “Via Negativa” approach. We may not know what we want, but we probably know what we don’t. If we’re clear about what we do not want to spend our life on, and determine not to do it, what we are looking for lies in what is left.
In closing this post, let me offer a personal example of how this worked for me in practice. In 2000, I had determined to leave the corporate in which I held a senior role. It was a combination of small, important things. I could not reconcile myself to a strategy that prioritised earnings per share over everything else, and I had reached a point where the mortgage and other obligations no longer had the hold they once did. I didn’t want to work for another corporate, nor for any of the franchises where my consultancy and coaching skills would have found an easy home. I did want interesting work, where I could learn on my terms. I ended up taking a job in Switzerland, in a sector that was still in its infancy - anti-counterfeiting technology - working in a language I had only school-level knowledge of. It proved to be a good decision, though looking back on the time, it felt (and was) like a real risk. My friends thought I was mad.
But it started a move off a path defined by others, over an edge to a place that felt more like me.
Footnote
In an email over the weekend, my friend and mentor Mark Easdown reminded me of words attributed to Cicero. They seem appropriate here:
Six mistakes mankind keeps making century after century:
Believing that personal gain is made by crushing others;
Worrying about things that cannot be changed or corrected;
Insisting that a thing is impossible because we cannot accomplish it;
Refusing to set aside trivial preferences;
Neglecting development and refinement of the mind;
Attempting to compel others to believe and live as we do.
Marcus Tullius Cicero
There’s a special kind of irony in dusting off Cicero’s list of humanity’s six perennial mistakes and finding it reads like a LinkedIn carousel for 2025. The categories haven’t aged a day; only the costumes have changed.
- Believing that personal gain is made by crushing others now wears the badge of “disruption” or “category dominance.”
- Worrying about things that cannot be changed is reframed as “owning the narrative” in quarterly risk reports.
- Insisting that a thing is impossible has become “we don’t have a business case.”
- Refusing to set aside trivial preferences is corporate politics in high‑definition.
- Neglecting development of the mind hides behind mandatory “learning journeys” in learning‑management systems no one opens.
- Compelling others to live as we do is now just “change management.”
And here we are, standing on our own Doombars, staring at shifting boundaries, congratulating ourselves for the precision of our maps while the bathymetry beneath us changes by the hour.
Your reflection on limpets and barnacles offers a subtler truth: belonging is not a fixed postcode, it’s a set of conditions. Rough enough to grip, stable enough to withstand the tide, close enough to others for mutual shelter. In late modernity, those “rocks” are less often found in corporations, which have polished themselves smooth in the name of efficiency, or outsourced their edges entirely to the tides of capital and code.
This is where the eiron steps in — not to scold the AI‑fixated captain for ignoring the weather, but to point out, with a half‑smile, that the crew have already started building rafts. That the Doombar doesn’t respect the shipping forecast, and never will. That the real skill is not in declaring which side of the sandbar you’re on, but in learning to navigate its shifting line without capsizing your moral cargo.
Cicero’s list reminds us the errors don’t change; what changes is our ability to see them in the moment, name them without cynicism, and then act with enough imagination to avoid repeating them in new drag. The trickster’s role here is not to pretend we can stand outside the tide, but to help others notice when the “solid ground” they cling to is actually just wet sand between waves.
In the end, maybe the human equivalent of the barnacle isn’t just to cling — it’s to know when to let go, drift a little, and find a better rock. Not because the last one was wrong, but because the tide has changed, and the real mistake would be to stay put out of habit while the water rises.