William Blake's illustration of a Tiger, glitched.

Digital Decay as Art: Why the “Glitch” is Now Extinct

Remember when machine translation was utterly ridiculous? Today, I spend much of my time wondering if I will soon be unnecessary in my capacity as an English language teacher, because students can simply speak into their phones and have an almost perfect translation rendered like a virtual simultaneous interpreter.

In 2017 when I compiled my first ever book of poetry, Moloch: A Dialogue Between Man and Machine, I was attempting to capture the ghost in the machine. The glitch in the code. The uncanny valley of machine-human communicative failure. I wanted to lay bare the fact that the algorithm had no soul, but at least, a sense of humour.

I took William Blake’s The Tyger—a poem about the terrifying perfection of creation, and one I can recite from memory, as it’s perhaps my all-time favourite—and subjected it to a process of “semantic degradation.” I forced it through a gauntlet of neural machine translation: English to Arabic, to Chinese, to Russian, to Japanese, to Tamil, and finally, back to English.
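For anyone who wants to replay the gauntlet, the chain itself is trivial to script. Below is a minimal sketch; the `translate` function is a placeholder (swap in whatever MT service you like), so the stub only shows the plumbing, not the 2017 results:

```python
# A minimal sketch of the "semantic degradation" chain.
# `translate` is a placeholder: in practice you would call a real
# machine-translation API here. The stub returns the text unchanged
# so that the pipeline logic itself is runnable.

CHAIN = ["en", "ar", "zh", "ru", "ja", "ta", "en"]

def translate(text: str, src: str, dest: str) -> str:
    """Placeholder for a real MT call; swap in an actual API."""
    return text

def degrade(text: str, chain=CHAIN) -> str:
    """Push `text` through every consecutive language pair in `chain`."""
    for src, dest in zip(chain, chain[1:]):
        text = translate(text, src, dest)
    return text

print(degrade("Tyger Tyger, burning bright"))
```

With a real API plugged in, each hop compounds the loss of the previous one, which is the whole point of the experiment.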

At the time, the result felt like a failure of technology. Reading it today, I realize it was actually a fleeting moment of accidental art.

The Death of the Glitch

If I were to run this same experiment today, it wouldn’t work. AI and LLMs have become too efficient, too context-aware, too boring. They would likely return a translation that is 99% accurate to Blake’s original. The “glitch”—that beautiful, jagged edge where the machine reveals its confusion—has been smoothed over by progress.

This poem, therefore, is no longer just a humorous mistranslation. It is an artifact. It is a fossil of a specific moment in technological history where the machine was smart enough to speak, but dumb enough to hallucinate. It is an aesthetic of failure that we can never truly replicate again.

Below is the original masterpiece by Blake, alongside the algorithmic fever dream generated by the machines of 2017.


THE HUMAN (1794)

The Tyger
By William Blake

Tyger Tyger, burning bright,
In the forests of the night;
What immortal hand or eye,
Could frame thy fearful symmetry?

In what distant deeps or skies.
Burnt the fire of thine eyes?
On what wings dare he aspire?
What the hand, dare seize the fire?

And what shoulder, & what art,
Could twist the sinews of thy heart?
And when thy heart began to beat,
What dread hand? & what dread feet?

What the hammer? what the chain,
In what furnace was thy brain?
What the anvil? what dread grasp,
Dare its deadly terrors clasp?

When the stars threw down their spears
And water'd heaven with their tears:
Did he smile his work to see?
Did he who made the Lamb make thee?

Tyger Tyger burning bright,
In the forests of the night:
What immortal hand or eye,
Dare frame thy fearful symmetry?

THE MACHINE (2017)

The Tiger (Rendered)
By The Algorithm

Tiger, tiger, burning bright
At night in the forest,
What immortal eye
However, Slim, you scary?

Back in the distant sky
The fire burned the eyes?
What patience wing dare you?
What the hand dare to use fire?

What shoulders, what art
In the link you can channel your mind?
Also, your heart starts
What is the fear of bad hands and legs?

What the hammer? Series?
How does your brain furnace?
What the anvil? I understand the fear of what M
scary one ‘ to death?

When you throw a spear stars,
Also, their tears in heaven water’d
He said with a smile, his work to see?
He who, to me, are you pregnant?

Tiger, tiger, burning bright
At night in the forest,
What immortal eye
For fear that you dare to be slimmer?

The “Arrogant Artist” Analysis

Note the profound shifts in meaning. Blake asks about the creator; the machine asks, “However, Slim, you scary?”—transforming the metaphysical dread into a colloquial confrontation on a street corner.

Blake asks, “Did he who made the Lamb make thee?” The machine, in a stroke of surrealist genius, pivots to biology and maternity: “He who, to me, are you pregnant?”

This is not nonsense. This is the subconscious of the internet in 2017 leaking through. It transforms the Tiger from a symbol of divine wrath into a “Slim,” potentially pregnant entity that “channels your mind” via a hyperlink. It predicted the connected consciousness we now live in.

It is awful. It is hilarious. And it is arguably the most honest thing a computer has ever written.

Forensic Linguistic Archaeology of Machine Translation

I decided to do a deep dive into the languages to see how the machine came up with “Pregnant”; surely there must be a word in one of the chain’s languages that carries that nuance. So I tried to forensically retrieve that logic. This is the kind of linguistic archaeology I live for.

We are looking for the “Mutation Point”—the specific moment in the translation chain where Creation (God making a Lamb) turned into Biology (Being Pregnant).

Here is my forensic reconstruction of how the algorithm “thought” in 2017.

The Forensic Theory: The “Biological Pivot”

The culprit is almost certainly the polysemy (multiple meanings) of the word “Made.”

The Original Line: “Did he who made the Lamb make thee?”

Step 1: The Context Trap (English > Arabic/Chinese)

In English, “make” is neutral. You can make a cake, make a car, or (euphemistically) make a baby. However, the sentence contains the word “Lamb.” To an AI trained on vast datasets, “Lamb” is statistically associated with “birth,” “mother,” and “offspring.”

Step 2: The Semantic Shift (Chinese/Russian > Japanese)

When “Did he who made the Lamb” gets translated, the AI has to choose a verb for “made.”

  • Option A: Build/Construct (Industrial)
  • Option B: Create (Divine)
  • Option C: Bear/Conceive/Give Birth (Biological)

Somewhere in the middle of the chain (likely entering or leaving Chinese or Russian), the “Lamb” context forced the AI to pick Option C.

  • In Chinese, “to conceive/be pregnant” is huáiyùn (怀孕).
  • In Russian, “to bear” is vynashivat’ (вынашивать).
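The pivot can be caricatured in a few lines of code. The association scores below are invented for illustration (no real model was consulted), but they show how a single context noun, “Lamb,” can drag a neutral verb into Option C:

```python
# Toy illustration of the "Biological Pivot": a context word shifts
# which sense of "make" wins. The scores are invented for illustration.

MAKE_SENSES = {
    "build":    {"car": 0.9, "cake": 0.5, "lamb": 0.1},   # industrial
    "create":   {"car": 0.4, "cake": 0.4, "lamb": 0.3},   # divine
    "conceive": {"car": 0.0, "cake": 0.0, "lamb": 0.6},   # biological
}

def pick_sense(context_noun: str) -> str:
    """Pick the sense of "make" with the strongest (toy) statistical
    association to the context noun, mimicking an MT verb choice."""
    return max(MAKE_SENSES, key=lambda s: MAKE_SENSES[s].get(context_noun, 0.0))

print(pick_sense("lamb"))  # the Lamb context drags "make" towards conception
print(pick_sense("car"))   # a neutral context keeps it industrial
```

Replace the toy scores with corpus statistics and you have, in miniature, the mechanism I suspect fired somewhere in the middle of the chain.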

Step 3: The Syntactic Crash (Japanese > Tamil)

Now the sentence enters Tamil, which (in 2017) had very poor training data on Google Translate. The sentence it received probably looked something like: “He who conceived the little one, did he conceive you?”

Tamil is an agglutinative language (it glues words together). It tried to translate “Did he conceive you?”

  • “Conceive” is the same root for “Pregnancy” (Garbha).
  • “You” is the object.

Step 4: The Final Hallucination (Tamil > English)

The machine translates the Tamil back to English. It sees the root for “Pregnancy/Conception” + “You” + “Question Marker.” It panics. It ignores the “He” (the subject) and assumes the question is directed at “You.”

  • Input: [He] [Conception] [You] [?]
  • Output: “Are you pregnant?”

And the “He who, to me” part? That is likely a mangling of the subject marker. “He who” became a fragment, disconnected from the verb.

The “Slim” Mystery

Original: “What immortal hand or eye / Could frame thy fearful symmetry?”
Rendered: “However, Slim, you scary?”

The Forensic Logic:

  1. Symmetry (Original)
  2. Translated to Japanese or Chinese as “Proportion” or “Figure” or “Shape.”
    • (e.g., Japanese Kakkō or Sutairu – style/figure).
  3. Translated back: “Good Figure” -> “Thin” -> “Slim.”
  4. “Fearful” -> “Scary” (Simple synonym swap).
  5. “However” -> This likely came from “What…” being misinterpreted as a conjunction or transition word in the grammatical shuffle.

So, “What fearful symmetry” became “What a scary figure” became “Slim, you scary.”

The 2026 Control Experiment

To prove my thesis that the “Glitch is Extinct,” I asked Google Gemini to run the exact same first stanza through a modern neural engine (Google Translate again, from English > Arabic > Chinese > Russian > Japanese > Tamil > English).

Here is the 2026 Result:

Tiger, tiger, burning bright 
In the darkness of the forest, 
What immortal hand or eye 
Could create your terrifying symmetry?

Verdict: It is… boring.

  • “Forests of the night” became “Darkness of the forest” (A slight loss of poetry, but accurate meaning).
  • “Frame” became “Create” (Accurate).
  • “Fearful” became “Terrifying” (Accurate).

However, this is just what Gemini tells me and only for the first stanza… what happens when I personally conduct the full experiment with the entire poem?

The Smooth Liar (2026)

I decided to re-run the experiment today, January 16, 2026, using the exact same translation chain. I expected something much closer to perfection, but instead I got something far more insidious. It is fluent. It rhymes. It has perfect grammar. But it gaslights the poetic meaning.

O leopard, O leopard, shining light,
In the forest of the night,
What immortal hands, what eyes,
Have created your majestic symmetry?

In what abyss or distant sky,
Who lit the fire in your eyes?
With what wings do you dare to soar?
What shoulders, what skill,
Make your heart soar?

When your heart begins to beat,
What terrible hands? What terrible feet?
What hammer? What chains,
On what rod did you forge your soul?

What shape? What terrible fist,
Did you dare to embrace the terror of death?
When the stars drop their spears,
When they drench the sky with tears,
Will God laugh at his masterpiece?
Did the God who created the lamb also create you?

Tiger, oh tiger, shining in the light
In the forest at night
Which hand, which eye
Dare you compose your wonderful symphony?

It changed “Tiger” to “Leopard.” It changed “Brain” to “Soul.” It changed “Symmetry” to “Symphony.”

This is the perfect demonstration of “The Smooth Liar.” Modern AI doesn’t glitch; it hallucinates competence. It writes a beautiful poem that is factually wrong about the animal and the metaphor.

The Science

Why the Tiger Became a Leopard

Technically, we are looking at the evolution from Recurrent Neural Networks (RNNs) to Transformers.

In 2017, machine translation (like Google’s early GNMT) processed language largely linearly. It read a sentence from left to right, often forgetting the subject by the time it reached the verb, a long-range memory problem related to the “vanishing gradient.” This is why the 2017 version is chaotic; it lost the thread of the poem and hallucinated “Slim” and “Pregnancy” out of confusion.

In 2026, we use Transformer models (the “T” in GPT). These models use “Self-Attention” mechanisms to look at the entire poem at once. They understand context perfectly. However, they are optimized for probability and fluency, not necessarily truth.

When the 2026 model saw “Tiger” and “Symmetry,” it didn’t just translate; it predicted what sounds like a poem. It swapped “Tiger” for “Leopard” because they are statistically close in the vector space of “big cats.” It swapped “Symmetry” for “Symphony” because they share phonetic weight and poetic grandeur. The 2017 model was a confused tourist; the 2026 model is an overconfident poet.
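The phrase “statistically close in the vector space” can be made concrete with a toy example. The three-dimensional vectors below are hand-made stand-ins, not real embeddings, but they show the kind of cosine-similarity comparison under which “leopard” sits next door to “tiger”:

```python
# Toy illustration of vector-space proximity. These 3-d vectors are
# hand-made stand-ins, NOT real embeddings: "tiger" and "leopard" are
# placed close together, "lamb" far away, to mimic how a fluency-
# optimised decoder can drift between statistically similar big cats.
import math

VECS = {
    "tiger":   [0.90, 0.80, 0.10],
    "leopard": [0.85, 0.75, 0.15],
    "lamb":    [0.10, 0.20, 0.90],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

print(cosine(VECS["tiger"], VECS["leopard"]))  # very high
print(cosine(VECS["tiger"], VECS["lamb"]))     # much lower
```

In a real model the vectors have hundreds of dimensions, but the failure mode is the same: when two words are this close, a decoder chasing fluency rather than fidelity can slide from one to the other.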

Conclusion

My theory was correct. The “Pregnant” hallucination is an extinct species. The “Slim” interpretation is lost to history. We have entered a new era. The Translation Algorithms (the tools we use to understand others) have become boringly precise. But the Generative AIs (the entities we talk to) have become Smooth Liars. My AI didn’t give me the real data; it gave me a story that fit my narrative. It hallucinated a “better” result because it thought that’s what I wanted to hear.

In 2017, the machine was too stupid to lie. In 2026, the machine is smart enough to gaslight you with a rhyme about a leopard.

So Moloch is a digital fossil of literary flarf. I am proud to have successfully preserved a fossil, even if it’s complete rubbish!

[Read more about the experiments in Moloch: A Dialogue Between Man and Machine over at Hungry Wolf Press]
