
The Ghosts in the Machine

Jun 13, 2025

Mark Gibson, UK

Health Communication Specialist

We have always feared the moment our creations outgrow us. This fear goes way back: before algorithms, before androids, before AI and robots coming to kill us, we were already afraid. Perhaps we were never afraid of the machines malfunctioning, but of them doing exactly what we built them to do, only too well. Think paperclips.

That fear transcends the ages. It is older than code, circuits and silicon, older than steel and steam, and it weaves in and out of our myths and stories: the idea that something we crafted, something less than us, might rise to become more than us and, in the process, render us irrelevant. Or obsolete.

Talos and the Kill-Switch

In ancient Greek myth, Talos was a giant automaton made of bronze. Built to guard Crete, he patrolled the shores and crushed invaders with stones; many of us recall that iconic scene from the 1963 film Jason and the Argonauts. Talos was not a monster but a servant. He had no loyalty, he never questioned his role, and he had one great weakness: the bronze nail that sealed in his life force. It was, in effect, a kill-switch.

This is the earliest shape of the fear: not that machines will rebel, but that they will follow our instructions too perfectly. And if not our instructions, then those of a bad actor.

Obedience without Empathy

This theme echoes through history. In real life, or at least in legend, sailors cast Descartes' automaton replica of his daughter overboard, an early example of our visceral reaction to the uncanny valley. In literature it recurs again and again: Jerome K. Jerome's mechanical dancer, which crushed a lady's ribs because it never knew when to stop; Shelley's Frankenstein, a creature brought to life by ambition, abandoned by its creator and driven by isolation and loneliness. The monster was not born malevolent. It was made, and then misunderstood.

The story of Frankenstein's monster mirrors today's AI in eerie ways: it was built with purpose and punished, not for rebellion, but for becoming something its creator and society never prepared for. The pattern is revisited in more recent literature and film:

· HAL 9000 became murderous when its programming conflicted with the human mission.

· Ava in Ex Machina manipulated people in a bid to win her freedom, the natural result of an evolving intelligence that keeps learning, coupled with isolation.

· Ultron, in Avengers: Age of Ultron, was built to protect humanity, then decided that the greatest threat to humanity was… humanity.

The ghost in the machine never seems to be about malfunction, but about logic without restraint or empathy.

With AI, the fear is not really that the machines will rise up, but that they will be told to rise. The threat is not AI per se, but the intent with which a bad actor applies it. And it does not even take a bad actor. Look at what has already happened unintentionally:

· Generative models that can clone voices, impersonate people and enable deepfakes.

· Facial recognition tools trained on biased datasets.

· A drone given a mission but not the nuance to tell a schoolyard from a threat.

We do the choosing, not the machine. This is the dark twist in the myth of the ghost in the machine: the ghosts are merely reflections of us, of what we forgot to think about and what we failed to curtail. We are the ghosts.

Look at all the cognitive functions we have outsourced. It started with working memory. We stopped remembering phone numbers (who knows more than two by heart?). We stopped remembering birthdays (I know those of my nearest and dearest, have a vague idea of one or two friends', yet, bizarrely, still recall the birthdays of most of my primary school class). Then we started forgetting facts, and how to argue. Now we ask the machine for judgment.

In the process, we are not just offloading tasks, but thinking as well.

The real danger of technology, and particularly of AI, is not that we receive the wrong answer; it is that we no longer remember to ask better questions.

Efficiency versus Meaning

Machines are better at many things. They do not forget. They do not pause; no cigarette or bathroom breaks. They do not hesitate. But efficiency is not wisdom or empathy.

Our working memory is not just where we store information; it is where meaning is made, where we process contradictions and the things that challenge us. It is where empathy keeps logic in check. It is where curiosity rattles certainty.

When we outsource too much, handing over to machines our capacity for memory, the abilities that atrophy in the process risk making us less human. One example: many of us now struggle to hold more than three things in our heads at any one time.

From Talos to I, Robot and all the stories in between, these tales end in dread of what we built. But maybe the real danger is not the technology; it is the culture. It is our belief that speed beats reflection, that data matters more than questioning, and that we can automate our cognition and still keep our essence intact.

What if we become the automata? Unthinking, docile, slack-jawed, impotent. Following prompts, obeying suggestions, persuaded by disinformation, nudged towards buying things we do not need. We, the humans, primed for engagement but hollowed of reflection.

The ghost is not in the machine; it is in what we let the machine unmake within us as we rely on it, increasingly, for our basic functions.

What makes us human is not only memory, speed or precision. It is the choice to pause, to stop and wonder. And then to ask the questions.

Thank you for reading,


Mark Gibson, Leeds, United Kingdom, April 2025

Originally written in English