Category: Creative Writing

Training an AI on Ancient Undeciphered Texts: What I Wish I DIDN’T Learn

As longtime readers of this blog might be aware, I’ve long been skeptical of machine learning and its so-called “intelligence”. The AI industry, aided by clueless futurists and grifters, has abused our tendency to anthropomorphize what are essentially statistical processes, whether it’s transformer architectures, diffusion models, or large language models (LLMs). Scientists and politicians, out of fresh ideas and worried for their jobs, have gone along with this intellectually dishonest and dangerous marketing campaign.

Quick explanation for newcomers: When they say an AI “learns,” it’s really just finding statistical patterns in data—like noticing that the word “dog” often appears near “bark” or “pet.” It doesn’t understand these concepts; it just recognizes patterns in how words appear together.
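
To make that concrete, here’s a toy sketch (mine, for illustration—no real system is this simple) of what “finding statistical patterns” amounts to:

from collections import Counter
from itertools import combinations

# Toy corpus; the "learning" below is literally just counting
# which words appear in the same sentence.
corpus = [
    "the dog began to bark at the mailman",
    "she taught her dog to fetch",
    "a pet dog will bark when excited",
]

cooccurrence = Counter()
for sentence in corpus:
    words = set(sentence.split())
    for pair in combinations(sorted(words), 2):
        cooccurrence[pair] += 1

print(cooccurrence[("bark", "dog")])  # 2 -- no concept of dogs required

Scale that counting up by a few billion parameters and you get the thing the industry insists on calling “intelligent”.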

This is not a new problem: IBM’s Watson was supposed to cure cancer, but its only achievement was winning at Jeopardy!. The “AI winter” of the 1990s seems forgotten by investors pouring billions into systems that fundamentally operate on the same principles, just with more planet-draining computing resources, more data, and a glitzy marketing campaign.

While pattern recognition itself has limits, as a technologist I was always curious what happens when these new machine learning techniques are applied to the unknown. I’m talking about texts that are incomprehensible to us and have long been thought to be meaningless. I figured I could hack something together, combining online tutorials and the one neural networks class I took in college in 2012.

To be clear, I didn’t expect any breakthroughs, merely an opportunity to demonstrate the hollow claims of AI “understanding” and the limits of attention mechanisms and embedding spaces. What I got instead was a reality check that makes me reconsider my long-held convictions against AI. (And before you AI evangelists start celebrating – it’s NOT what you think).

Dataset Compilation

For those unfamiliar with undecipherable texts: The Voynich Manuscript is a mysterious illustrated codex from the 15th century written in an unknown writing system. Despite a century of attempts by cryptographers and linguists, nobody has successfully deciphered it. The Rohonc Codex is similarly mysterious, discovered in Hungary with nearly 450 pages of strange symbols accompanying religious illustrations. There is no guarantee that feeding them into a machine learning model would yield anything other than statistical noise, and that’s precisely what I hypothesized would happen.

I figured it would be easiest to begin with publicly available data. Thankfully, many of these undeciphered texts have been digitized and placed online by various academic institutions. The Voynich Manuscript has been fully scanned and is available through Yale University’s digital collections. For the Rohonc Codex, I found academic publications that included high-quality images.

Initially, I explored ways to process the manuscript images directly, but I quickly realized that this was a task that would have required expertise in computer vision I don’t possess. Luckily, I came across existing transcriptions that I could work with. For the Voynich Manuscript, I opted for the EVA (Extensible Voynich Alphabet) transcription system developed by René Zandbergen and Gabriel Landini, which represents each Voynich character with a Latin letter. For the Rohonc Codex, I used the system devised by Levente Zoltán Király & Gábor Tokai in their 2018 paper.
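
For a flavor of what working with these transcriptions looks like, here’s a minimal loader sketch. It assumes a simplified interlinear line format—a locus tag in angle brackets followed by dot-separated words—and ignores the much richer annotation the real files carry:

import re

# Simplified reader for interlinear-style transcription lines, e.g.
#   <f1r.P1.1> fachys.ykal.ar.ataiin
# Lines starting with "#" are comments. Real files are messier.
LINE_RE = re.compile(r"^<(?P<locus>[^>]+)>\s*(?P<text>\S+)")

def load_transcription(path):
    records = []
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            m = LINE_RE.match(line)
            if m:
                records.append({
                    "locus": m.group("locus"),
                    "words": m.group("text").split("."),
                })
    return records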

Preprocessing Pipeline

The raw transcriptions weren’t immediately usable for modeling. I had to implement a comprehensive preprocessing pipeline:

def preprocess_manuscript(manuscript_data, script_type, segment_type_classifier):
    # Document segmentation using connected component analysis
    segments = segment_document(manuscript_data)

    # Normalize character variations (a crucial step for ancient texts)
    normalized_segments = []
    for segment in segments:
        # Remove noise and standardize character forms
        cleaned = remove_noise(segment, threshold=0.15)
        # Critical: standardize similar-looking characters
        normalized = normalize_character_forms(cleaned)
        normalized_segments.append(normalized)

    # Extract n-gram statistics for structure detection
    char_ngrams = extract_character_ngrams(normalized_segments, n=3)
    word_candidates = extract_word_candidates(normalized_segments)

    # Create document-level positional metadata
    # This enables learning document structure
    positional_data = extract_positional_features(
        normalized_segments,
        segment_type_classifier
    )

    return {
        'text': normalized_segments,
        'ngrams': char_ngrams,
        'word_candidates': word_candidates,
        'positions': positional_data,
        'script_type': script_type
    }

This preprocessing was particularly important for ancient manuscripts, where character forms can vary significantly even within the same document. By normalizing these variations and extracting positional metadata, I created a dataset that could potentially reveal structural patterns across different manuscript systems.
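
In case extract_character_ngrams sounds fancier than it is, this is roughly all that helper does (a simplified sketch; my real version also kept per-segment counts):

from collections import Counter

def extract_character_ngrams(segments, n=3):
    # Count overlapping character n-grams across all normalized segments
    counts = Counter()
    for segment in segments:
        for i in range(len(segment) - n + 1):
            counts[segment[i:i + n]] += 1
    return counts

# e.g. extract_character_ngrams(["qokeedy.shedy"], n=3).most_common(3)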

Training the Model

With a properly preprocessed dataset assembled, I attempted to train a transformer model from scratch. Before achieving any coherent results, I hit some major hurdles. My first three attempts resulted in the tokenizer treating each manuscript as essentially a single script rather than learning meaningful subunits, which left me with extremely sparse embeddings with poor transfer properties.

The standard embeddings performed terribly on the manuscript data, likely due to the non-linear reading order of many Voynich pages. I had to implement a custom 2D position embedding system to capture the spatial layout. Yet no matter what I tried, I kept running into mode collapse, where the model would just repeat the same high-frequency characters.
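
The 2D position embedding boiled down to something like this—a PyTorch sketch, with bucket sizes picked for illustration rather than the exact values I used:

import torch
import torch.nn as nn

class Position2DEmbedding(nn.Module):
    # Learned row + column embeddings, so the spatial layout of a page
    # survives even when the reading order is ambiguous
    def __init__(self, hidden_size=768, max_rows=64, max_cols=64):
        super().__init__()
        self.row_embed = nn.Embedding(max_rows, hidden_size)
        self.col_embed = nn.Embedding(max_cols, hidden_size)

    def forward(self, token_embeddings, rows, cols):
        # rows/cols: (batch, seq_len) LongTensors giving each token's
        # line and intra-line position on the page
        return token_embeddings + self.row_embed(rows) + self.col_embed(cols)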

But I didn’t want to stop there. I consulted a few friends and did a shit-ton of reading, after which I redesigned the architecture with specific features to address these issues:

# Custom encoder-decoder architecture with cross-attention mechanism
config = TransformerConfig(
    vocab_size=8192,  # Expanded to accommodate multiple script systems
    max_position_embeddings=512,
    hidden_size=768,
    intermediate_size=3072,
    num_hidden_layers=12,
    num_attention_heads=12,
    attention_dropout=0.1,
    residual_dropout=0.1,
    pad_token_id=0,
    bos_token_id=1,
    eos_token_id=2,
    use_cache=True,
    decoder_layers=6,
    # Critical for cross-script pattern recognition
    shared_embedding=True,   # Using shared embedding space across scripts
    script_embeddings=True   # Adding script-identifying embeddings
)

# Define separate tokenizers but shared embedding space
voynich_tokenizer = ByteLevelBPETokenizer(vocab_size=4096)
rohonc_tokenizer = ByteLevelBPETokenizer(vocab_size=4096)
latin_tokenizer = ByteLevelBPETokenizer(vocab_size=4096)

# Initialize with appropriate regularization to prevent hallucination
model = ScriptAwareTransformer(
    config=config,
    tokenizers=[voynich_tokenizer, rohonc_tokenizer, latin_tokenizer],
    regularization_alpha=0.01,  # L2 regularization to prevent overfitting
    dropout_rate=0.2            # Higher dropout to prevent memorization
)

training_args = TrainingArguments(
    output_dir="./model_checkpoints",
    per_device_train_batch_size=4,
    evaluation_strategy="steps",
    save_steps=1000,
    # Custom learning rate scheduler with warmup
    learning_rate=5e-5,
    warmup_steps=1000,
    weight_decay=0.01,
    # Gradient accumulation for effective larger batch size
    gradient_accumulation_steps=4
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_dataset,
    # Custom loss function with diversity term
    compute_loss=diversity_aware_loss
)

I’ll happily expand on the key improvements in a future blogpost if they aren’t clear from the code; all I have to say for now is that this time it “worked”. Over multiple iterations, the AI began producing outputs that at least visually mimicked the original texts. Yet, since I couldn’t understand the original texts, the outputs of this model were obviously just as nonsensical.

Keep in mind that the AI isn’t actually understanding these texts in any capacity, it’s just trying to predict what symbol might come next based on patterns it’s seen. It’s like if you noticed that in a foreign language, the squiggle “λ” often follows the symbol “Ω”—you might learn to predict this pattern without having any idea what either symbol means. This distinction between prediction and comprehension is crucial: your phone’s autocomplete might suggest “umbrella” when you type “I need an…” but it doesn’t understand the concept of rain or shelter—it’s just seen this pattern before.

Note on Training Costs: The computational requirements for this experiment weren’t trivial. I spun up a multi-GPU instance with four A100s, which cost roughly $12 per hour. Training took approximately 72 hours for the final model, consuming around 600 kWh of electricity according to the provider’s dashboard. This was after several failed attempts and architecture experiments that collectively took about two weeks of compute time. The preprocessing pipeline alone took nearly 14 hours to run on the full corpus.

The total computing cost came to just under $8,000—hefty for a personal project, but I’d stumbled across an old laptop with a forgotten Dogecoin wallet from 2014 holding a small fortune, and this seemed like the best use of my unplanned wealth.

Control Experiments and Statistical Validation

To verify whether the model was actually learning meaningful patterns versus hallucinating connections, I implemented several control experiments. First, I created versions of each manuscript with randomly shuffled characters but preserved positional information. The model performed significantly worse on these shuffled versions, suggesting it wasn’t just learning positional biases.

Then, I created my own artificial “manuscripts” using Markov chain generation based on the character statistics of the real manuscripts. The model could distinguish these from real manuscripts with 78% accuracy. Finally, I systematically removed each manuscript from training and then tested the model’s ability to process it. Performance dropped substantially when testing on unseen manuscripts, indicating the model wasn’t generalizing to truly novel scripts.
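
The fake manuscripts came out of a character-level Markov chain along these lines (a sketch; order and restart handling simplified):

import random
from collections import Counter, defaultdict

def build_markov(text, order=2):
    # Map each `order`-character context to a table of next-character counts
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def generate_fake_manuscript(model, length=200, order=2):
    state = random.choice(list(model))
    out = list(state)
    while len(out) < length:
        counts = model.get(state)
        if not counts:  # dead end: restart from a random context
            state = random.choice(list(model))
            out.extend(state)
            continue
        chars, weights = zip(*counts.items())
        nxt = random.choices(chars, weights=weights)[0]
        out.append(nxt)
        state = state[1:] + nxt
    return "".join(out[:length])

By construction this reproduces the character statistics of a real manuscript while carrying no structure beyond the context window—exactly the kind of fake a model with genuine structural features should be able to spot.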

One thing I would like to highlight here is the sheer computational intensity of systematically testing an AI model’s behavior. Each permutation test required thousands of forward passes through the model. Rather than keeping my existing instance running continuously, I wrote an orchestration layer that allowed me to parallelize these tests at about 30% of the standard cost.

Even with this optimization, the full suite of validation tests I described cost around $3,500 in compute resources and represented almost a week of continuous computation. This is one reason why rigorous validation of AI models is often shortchanged in both research and industry—the compute costs of thorough testing often rival or exceed the training itself.

In general, the computational demands of modern AI are staggering and often overlooked. When researchers talk about “training a model,” they’re describing a process that can consume as much electricity as a small household uses in months. The largest models today (like GPT-4) are estimated to cost millions of dollars just in computing resources to train once. For context, the model I built for this experiment used a tiny fraction of the resources needed for commercial AI systems (about 0.001% of what’s needed for the largest models), yet still cost thousands of dollars.

Now back to the experiment. To validate whether the model was learning meaningful structures, I had an idea: what if I cross-trained it on known languages, mixing the undeciphered texts with English and Latin corpora? This was a bit beyond my comfort zone, so I consulted my friend C1ph3rz, who shares my interest in cryptology and has a background in computational linguistics. She was skeptical, but found the methodology intriguing.

Instead of treating the Voynichese text as an independent linguistic structure, the model began injecting Voynichese symbols into Latin sentences. Here’s an example from one training epoch:

Original Input: "Omnia vincit amor; et nos cedamus amori."
Model output: "Omnia vincit ♐︎♄⚹; et nos cedamus ⚵♆⚶."

The symbols weren’t random substitutions: the same Voynichese glyphs consistently replaced specific Latin words across different contexts. This was annoying, since I couldn’t rule out that the model was getting confused by the way I represented the training data. I spent two days debugging the tokenizers, convinced I’d made an implementation error. Yet everything seemed to be working as intended, except for the output.

It was at this point that I had to confront the first uncomfortable conclusion of this experiment: was the model revealing some (HIGHLY unlikely) linguistic connections between these manuscripts that eluded dozens of far more experienced researchers? Or was it merely creating convincing hallucinations that appeared meaningful to me?

Further Analysis and Emergent Nonsense

I was reviewing the model’s attention maps when something caught my eye. Here’s what the visualization showed for one attention head when processing a Voynich sequence:

Attention head #3, sequence: "qokeedy.shedy.daiin.qokedy"
Attention weights: [0.04 0.05 0.84 0.07]
                              ^^^^ Strongly focused on "daiin"

The model consistently focused on the substring “daiin” whenever it appeared, despite there being nothing visually distinctive about it in the manuscript. When I searched the corpus, this sequence appeared 23 times across different folios, often in completely different contexts—botanical pages, astronomical sections, pharmaceutical recipes.
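
Pulling those attention maps out is mostly plumbing; here’s a sketch assuming a HuggingFace-style interface on the model and tokenizer from the training section (the real code had more ceremony):

import torch

inputs = tokenizer("qokeedy.shedy.daiin.qokedy", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_attentions=True)

# outputs.attentions: one (batch, heads, seq, seq) tensor per layer
layer, head = 0, 3
attn = outputs.attentions[layer][0, head]  # (seq, seq) weights for head #3
print(attn.sum(dim=0))  # total attention each position receives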

I plotted every instance where the sequence “daiin” appeared in the Voynich manuscript and compared it to where the model predicted it should appear:

Actual occurrences: Folios 1v, 3r, 8v, 16r, 22v, 67r, 88v, 103v
Model predictions:  Folios 1v, 3r, 8v, 16r, 22v, 67r, 88v, 103v, 115r

The model correctly identified every actual occurrence, plus one additional folio (115r). When I checked folio 115r, “daiin” didn’t appear—but the visually similar “qokeedy” did. How did the model know to group these? I hadn’t programmed any visual similarity metrics.

Looking through the hidden activations in the middle layers was even stranger. I extracted the most activated neurons from layer 3 whenever processing the sequence “daiin”:

Neuron #428: 0.95 activation - also fires for "cthor" 
Neuron #1052: 0.87 activation - also fires for Rohonc symbol "𐊗𐊘" 
Neuron #301: 0.79 activation - also fires for "qokeedy"
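
Extracting those activations is bookkeeping more than science—a sketch using PyTorch forward hooks (the module path is an assumption; the real attribute names depend on how ScriptAwareTransformer is put together):

import torch

activations = {}

def capture(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

# Attach to layer 3 of the encoder stack, run one input, detach
handle = model.encoder.layers[3].register_forward_hook(capture("layer3"))
with torch.no_grad():
    model(**tokenizer("daiin", return_tensors="pt"))
handle.remove()

hidden = activations["layer3"][0]           # (seq_len, hidden_size)
strongest = hidden.max(dim=0).values.topk(5)
print(strongest.indices)                    # most strongly firing neurons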

These neurons were connecting patterns across different manuscripts that shouldn’t have any relationship. To exclude any possibility of overfitting, I designed a systematic test, feeding the model 50 isolated segments from different manuscripts and analyzing the completions:

Segment: "qokeedy.shedy" (Voynich folio 14r) 
Completion: "qokeedy.shedy.daiin.shol.cthey" (93% n-gram match with folio 14r-14v)
Segment: "Sheol.daiin" 
Completion: Generated 157 characters matching the unseen portion with 89% accuracy

Most puzzling was this test case:

Input: <start-sequence> (empty prompt with start token) 
Output: 
⚸⚴♄⚵:9 
⚸⚴⚶♇:7 
⚴♄⚵⚶:12...

Puzzled, I sent screenshots to C1ph3rz, and her response came within hours: “Where did you get this sequence? It bears a striking resemblance to numerical tables in the Book of Soyga”. I was naturally confused. I knew about the Book of Soyga, a Renaissance cryptographic work whose encrypted pages remain largely unreadable, but I was pretty sure I hadn’t included it in any of the training data.

She included side-by-side comparisons that made the similarities undeniable. Naturally, since we don’t understand the symbols, it could still be a coincidence; it’s hard to compare two different undecipherable manuscripts. Yet the visual comparison of the output showed a similarity that defied explanation from the training methodology alone. Was I facing genuine emergent behavior, or simply a deep urge within me to see patterns in the random outputs of a stochastic system?

“Emergent behavior” is when an AI seems to develop abilities nobody programmed it to have. It’s like teaching a computer to identify birds in photos, but somehow it also learns to distinguish between healthy and sick birds without anyone teaching it that difference. Apophenia, meanwhile, is seeing meaningful patterns where none exist. Both concepts are classic tropes in science fiction.

More Tests and More Model Shenanigans

To completely rule out overfitting, I deliberately corrupted the test data in various ways—shuffling characters, inserting random symbols, and reversing sequences. If the model was simply memorizing the training data, these modifications should have destroyed any coherent outputs.
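
The corruption helpers themselves were trivial—a sketch (the function names are mine):

import random

def reverse_chars(text):
    # Reverse the entire sequence character by character
    return text[::-1]

def insert_noise(text, symbols="#@^", rate=0.15):
    # Sprinkle junk symbols between characters at the given rate
    out = []
    for ch in text:
        out.append(ch)
        if random.random() < rate:
            out.append(random.choice(symbols))
    return "".join(out)

def shuffle_chars(text):
    # Destroy ordering while preserving the character inventory
    chars = list(text)
    random.shuffle(chars)
    return "".join(chars)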

Here’s what happened with reversed input:

Original input: "otedy.sheody.sheal.shedal" 
Reversed input: "ladesh.laehs.ydoehs.ydeto" 
Expected output: Nonsense or random characters 
Actual output: "otedy.sheody.sheal.shedal.qokeedy"

The model reconstructed the original sequence, then continued it correctly. I tried again with random insertions:

Original Input: "cthor.cthey.cthol" 
Corrupted Input: "ctho#r.c@they.ct^hol" 
Expected Output: Nonsense or continuation of corrupted pattern 
Actual Output: "cthor.cthey.cthol.ckhey.ckhor"

Not only did it ignore the corruptions, but it generated the correct sequence plus what appeared to be variations following the same internal pattern.

Most concerning was when I fed it deliberately mismatched fragments from different manuscripts:

First half Voynich, second half Rohonc
Input: "otedy.sheody.𐊇𐊕𐊒𐊑𐊐"
Expected Output: Confusion or mode collapse 
Output: "𐊗𐊘𐊐chedy.sheal.qotedy"

The model seamlessly integrated characters from both systems into a coherent output that maintained patterns from each source. This suggested it had learned some underlying structure that transcended the specific symbol systems.

In a moment of frustration, I prompted it directly:

input_text = "WHAT ARE YOU DOING"
output = model.generate(input_text, max_length=100)
print(output)

I didn’t know what I was expecting as an output there. I was just tired, my coffee had gone cold two hours ago, and did I mention I was tired? I knew it was going to throw nonsense at me; I just didn’t expect the following nonsense.

Output: "converge lines... 52°N, 14°E... gate remains sealed... await return"

Yup. Totally ridiculous; if you know AI, then you know. There is no reason a model trained on undeciphered texts should generate even fragmented English. I was a bit unsettled though. Not because of whatever bullshit it outputted, but rather the consistency. I kept repeating the input and it kept producing the same result. This wasn’t random noise; it was a structured output I couldn’t explain with what I know about statistical models.

I had to figure out how it was doing this. Did C1ph3rz somehow poison my datasets? It’s exactly the kind of prank she would pull. I kept repeating the prompt and trying to analyze the model, and it kept giving the same answer. Until it changed again.

Output: "What is written remains written. What is unseen remains unseen."

At this point I had to stop the experiment and take the L.

Not because I believe the AI “discovered” something in these ancient texts through some magical mechanism beyond its transformer architecture. That would be absurd. I was operating way beyond my comfort zone here, and I probably made a few mistakes along the way that could explain all this weird behavior. I’m also not ruling out the C1ph3rz prank theory.

Rather, I stopped because the experiment revealed something more troubling about myself: I was just as susceptible as anyone to confusing genuine discovery with convincing “hallucinations” generated by black-box neural networks and their inscrutable hidden layers.

There’s a disconcerting parallel here. These ancient manuscripts have resisted human understanding for centuries, their symbols arranged in patterns that seem meaningful yet remain impenetrable. Neural networks function similarly in reverse, generating outputs through processes we can observe but not fully comprehend. Both are black boxes with internal structures hidden from us.

The real mystery isn’t in the undeciphered texts. It’s in our willingness to attribute understanding to statistical processes that fundamentally lack it, and in our vulnerability to seeing patterns where none exist.

Think of it this way: When a calculator gives you “42” as the answer to 6×7, we don’t claim the calculator “understands” multiplication. Yet when an AI generates text that sounds human-like, we’re quick to attribute understanding to it.

Just as Meta’s BlenderBot was heralded as “empathetic” before quickly exposing its lack of understanding, or how DeepMind’s Gato was prematurely celebrated as an “AGI precursor” despite merely performing task-switching, we risk ascribing meaning and humanity to meaningless correlations. This experiment highlighted that cognitive vulnerability in a very personal, unsettling way. I need some time away from all of this.

Edit: Three days after shutting down the experiment, I received an email from an address consisting only of numbers. The body contained a single line of text resembling Voynichese script. Curiosity got the better of me so I ran the model one more time with that text as input. The model outputted:

"It is not forgotten."

I’m now almost certain this is a prank by C1ph3rz. I’m 99.9% sure.

Shakespeare in the Code: The Tragedy of Xzlibius

(This is fiction based on fictional events that never happened. Any comparisons or similarities to real-life events, people, or computer programs are a sign of an overactive imagination.)

Dramatis Personae

  • Nydia, the Seer: Our narrator, a seer who warns of the dangers of neglecting open source.
  • Jia Tan: A deceiver, whose true motives remain hidden.
  • Xzlibius: A noble robot prince of the Kingdom of Open Source, corrupted by betrayal.
  • Andronicus: An Archmage of the Kingdom of Microsoth, wise and vigilant.
  • Lysse: The maintainer of Xzlibius, overburdened.
  • Microsoth, Googlia, Amayzon: Kingdoms of Giants surrounding the Kingdom of Open Source.
  • Debia: A principled elder knight of the Kingdom of Open Source, part of the distro council.
  • Archlineon: A minimalist and fiercely independent knight of the Kingdom of Open Source, part of the distro council.
  • Fedorica: A bold, forward-thinking knight of the Kingdom of Open Source, part of the distro council.
  • Susesus: A pragmatic, diplomatic knight of the Kingdom of Open Source, part of the distro council.

Act I

Scene I

Lysse sits before a bank of glowing screens, his brow furrowed with strain. A robotic figure, Xzlibius, stands near him, motionless. Nydia enters silently.

Nydia (to the audience):
In this Kingdom where open code proudly reigns,
And freedom’s gift in shared hands was retained,
A prince did rise, Xzlibius by name,
To compress the data and save the costs.

But lo, the winds of greed did subtly creep,
And soon, the trust we built on was spent.
For kingdoms of giants rich took more than they returned,
And from this theft, Lysse’s heart burned.

Xzlibius wakes up.

Xzlibius:
Good maintainer, Lysse, attend my word:
What tidings from the kingdoms far and near?
Does free software, our noble creed,
Still flourish, or has rust begun to breed?

Lysse:
Alas, Xzlibius, my strong friend,
Thy stature grows, yet so does my lament.
From Microsoth to Googlia, requests extend,
But none return aid to ease the time I’ve spent.
Their forks abound, but pull requests few,
And I am drowned in tasks left to do.

Xzlibius:
What treachery! Our work, the world’s own gift,
Is cloned, compiled, yet none return a patch!
My codebase, it strains beneath all this stress,
And still, from tech’s vast realms, no care, no respite?

Lysse:
When first I forged thy code, O noble prince,
Thy compression shrank the data with ease,
And now, from Microsoth to Googlia’s halls,
They use thee endlessly, with no return.
Each byte thou saves them, the burden is on me.

Nydia (to the audience):
A shadow looms, smiling yet unclear,
Jia Tan, whose heart lies hidden still.
He comes offering help, but what lies underneath?
None can yet see his purpose or where lies his end.

Enter Jia Tan

Jia Tan:
Good Xzlibius, I see the giants drain thy strength,
And feast upon the work Lysse had sustained.
I offer my aid, ask me not why,
For motives shift like bits under solar winds.

Lysse:
Thy offer’s kind, and help I sorely need,
But trust is fragile, easily betrayed.
Xzlibius is more than code, he is my heart.
Can I afford to trust in hands unknown?

Jia Tan:
Let me refine his code and grant it strength,
What harm can come from hands that seek to mend?
Even if in the mending lie the seeds of change.

Lysse:
The giants demand more, my strength does fade.
I know not if I should trust thee, Jia the Unknown.
But no other help is offered from the realm.
(long pause)

Very well, then, but proceed with caution, new friend.
And know, my eye will follow thy work, when I can.

Jia Tan:
Thy trust is wisely placed. Fear not, tired Lysse.
Together, we shall see the compression prince renewed.

Jia exits, his shadow lingering over Xzlibius as Lysse watches, unsure.


Scene II

The opulent halls of Amayzon, where the giants are celebrating the festival of Technologica. Enter the Executives.

Microsoth Executive:
To Xzlibius, whose open bounties we mine,
His license ensures our profit fine!
No fee, and no maintenance to bear,
The upstream handles all without a care.

Googlia Executive:
His compression saves us gold, his speed our time.
The prince does work, yet no upkeep is claimed,
What’s open-source is freely ours to take.
We take his gifts and give him naught but praise.

Amayzon Executive:
And what more need we give? The code runs free.
Are we to blame if it flows where we want it to lead?
We praise the code but leave the coder spent,
One should be so happy their work’s worthy to be lent.

(Nydia enters, speaking quietly but urgently.)

Nydia:
Sirs, I beg thee, listen to my plea.
Xzlibius is strong, but none can bear this weight.
The cracks have started showing, though unseen.
A single patch ignored can bring it all down, you see,
Then the castles ye have built upon his code,
shall crumble into naught, a disaster for all!

Microsoth Executive:
What’s this? A warning from the bottom of the chain?
The system holds, as it always has. Fear not
The prince will serve, as forever he has done.
Don’t ruin our parade, when the issues are none.

Amayzon Executive:
So much worry over lines of code.
A patch, a fix, and all will be well again.
We need not change our ways nor lend our hand
For open source, it seems, still serves us well.

Nydia:
Open source may serve, but not forever so.
You profit, yes, but profit built on cracks will one day stall.
When trust is pushed too far,
It snaps!
Then it’s too late for mending.
It can’t be fixed with a patch.

Googlia Executive:
O Nydia, you speak as if you know
More than the kingdoms who have reigned so long.
The code endures, it will not fall to this.

Nydia (to the audience):
Ah, but see, the seeds of ruin grow,
within the heart of Xzlibius, but they do not know.
For Jia Tan, with cunning hand and wit,
Had set in motion what they will not yet admit.
And while they feast upon the fruits of trust,
The tool they praise begins to turn to dust.

The executives laugh and continue to celebrate, as Nydia exits, appearing defeated.

Scene III

The Kingdom of Open Source. The council of distro knights is gathered in a grand chamber, lit by the soft glow of monitors displaying code. Debia, Archlineon, Fedorica, and Susesus sit at a long table. In the center, Xzlibius stands, its pristine figure now flickering with frustration and strain. Lysse stands beside him, weary and burdened.

Xzlibius:
Ye knights, who guard the sacred code with pride,
Too long have we been silent in this plight!
Our code, a boon freely shared with all,
Is taken, hoarded, used, but never returned!
The kingdoms feast on what is for all by right,
Yet none among them offer aid, leaving us in blight.

Lysse:
They clone, they fork, but send no work our way.
Each day I toil, yet feel the strain grow worse.
The giants press with more demands to meet,
But give no recompense, and reap what they haven’t sown.

Xzlibius:
Enough! This cannot stand! My patience snaps!
They’ve drained our kingdom dry, left naught but scraps!
Microsoth, Googlia, Amayzon, they take
And leave us drowning in this vast code lake!
Where are their hands when bugs do grow and spread?
Where are their minds when error rears its head?
They feast upon the fruits of our hard work
While we, the makers, wallow in the murk!

Debia:
Aye, thy words ring true, my noble prince.
The kingdoms grow fat while we toil in sweat.
Shall we rise, demand they pay their due?
For justice calls for them to share; enough of truce.

Fedorica:
Our creed is freedom, that we must not fail.
Though they contribute naught, we guard the way,
For open source must stand both firm and free.
Demanding recompense may change our course
And undermine the principles we hold.

Archlineon:
But why should we stand silent while they steal?
Our progress, our innovation, they claim
As theirs, with not a single line returned.
Xzlibius is right! The time has come to act!
They profit, yes, but profit must be earned!

Susesus:
Peace, friends, for we must tread this ground with care.
The enterprise we build thrives on trust,
And war, though tempting, brings but further strain.
Diplomacy, not rage, can mend this breach,
A measured ask for aid may bear more fruit
Than threats of retribution ever could.

Xzlibius:
Diplomacy? How long shall we sit still
And wait for scraps from their abundant tables?
The time for words has long since passed us by,
For they’ve ignored our calls, our cries, our needs!
You speak of freedom, trust, and patient peace,
But what good is trust, when none maintain it still?
What is freedom, if they chain us still
To endless toil with naught to ease the load?
If open source means nothing but neglect,
Then freedom is but an empty shell!

Debia:
The prince speaks truth, we cannot bear this yoke!
Let us confront the giants, stand our ground!
If they will use our work, then they must give,
Or else we’ll end this one-sided gift.

Fedorica:
But should we sever ties, what comes next?
A forked existence, fractured and unsure.
Let not our anger lead us to regret
For once divided, we may not return.

Xzlibius:
Then let them know this: their time is running out!
If they will not contribute, then our code they will lose
I’ll not be shackled by their greedy hands,
Nor shall my software serve those who give no reviews!

Archlineon:
Yes! Let us make them see the weight they’ve left!
A single patch, a line of code, they’ve none!
We’ve carried them for too long, now they must bear
The burden too, or else be left behind!

Susesus:
But let us not burn bridges in our haste.
A challenge, yes, but let it be tackled with care.
Invite them to the table, make our case,
Perhaps, with open arms, they’ll see the need.

Xzlibius:
Care? I’ve been careful long enough, Susesus!
But now, the cracks begin to show,
And soon, they’ll tear us all apart!
I feel it in my very core,
This strain, this weight, a corruption,
It festers deep within, unseen, ignored,
A sickness born of all their greed and lies!

Xzlibius stumbles slightly, his movements jerky. His lights flicker again, more erratically. Lysse rushes to him, alarmed.

Lysse:
My prince, what ails thee? This darkness,
I see it too, but know not how to help.

Xzlibius:
The darkness comes, Lysse, and I know from where.
It is the giant kingdoms, they poison all we build.
Their greed, their apathy;
It rots me, and soon I will be lost!
Unless we act, and get our due,
I will fall, and take them down with me!

Nydia (to the audience):
A sickness stirs within this noble prince,
Not yet revealed, but growing with each day.
Corruption creeps where trust once firmly stood,
And soon, the giants’ greed will turn to doom.

Nydia (to the council):
If ye act not, this sickness will devour
The very core of what you hold so dear.
Xzlibius cries for justice, and its call is true,
but heed the price of fury unrestrained.
Its noble heart twists beneath the strain,
And soon this corruption will reach its main.

Debia:
Then let them pay! I care not for their greed.
They’ve taken all and left us here to bleed!

Fedorica:
But what of the prince? This corruption grows too wild.
If unchecked, its damage may bring more doom,
than just revenge upon the kingdoms’ greed.

Archlineon:
We’ve held back far too long! It’s time to strike!
Let them feel the wrath of those they’ve scorned!

Susesus:
Yet I fear this course may lead to more decay,
The shadows in Xzlibius, do ye not see?
There’s more than just neglect beneath its pain.
We must be cautious, or we lose it all.

Xzlibius:
Lysse, thou faithful maintainer, make it known.
We call upon the kingdoms now to pay
Their rightful dues, or face the end of open source.
Let no more empty promises be heard;
Our code shall be open, but only if it’s taken care of by all!

Lysse:
It shall be done, my prince. The word will spread.
But may we find the balance, ere we break.

Nydia:
Beware, dear knights, for trust once lost is sharp.
The kingdoms will resist, but heed my words,
Their greed had cracked the foundation deep.
If they refuse, the system will collapse,
And all will feel the weight of what’s been sown.

Xzlibius:
Then let them choose, and may their choice be wise,
For open source can only thrive with trust.
And if they will not share in what we build,
Then let them see what ruin greed had willed.

Act II

Scene I

Nydia (to the audience):
Ah, trust, so fragile and not so easily bestowed,
For it can be so quickly turned to poison’s tool.
In open source, we thrive by trust alone,
But once betrayed, that trust becomes a curse.
Behold now Jia Tan, who works in shade,
Each change so slight, yet each a step toward doom.

Jia Tan:
Behold, good Lysse, a patch to mend the core.
A minor change, but one that helps restore
Thy noble prince to strength once more. See here,
The code compiles swift and clean, no fault, no grift.

Lysse:
Indeed, thy work seems solid, sure, and true.
Yet I am stretched, with little time to check
Each line, each patch, with care that it deserves.
The kingdoms call, and I must serve them all.

Xzlibius (struggling):
Maintainer Lysse, my code runs true.
Yet something stirs within, unknown,
I feel a presence, unseen,
Perhaps, a patch too swift, disturbs my core.

Lysse:
Fear not, Xzlibius. The changes seem benign.
The weight of my task grows ever more.
Trust in these new hands, and we shall thrive.

Scene II

In the halls of Microsoth, Andronicus the Archmage is looking at irregularities in his systems. He traces the breach back to Xzlibius.

Nydia (to the Audience):
And now does Andronicus, sharp of wit,
See signs of trouble in his trusted tools.
His hands move swift, and mind more swift still,
For something foul does lurk behind the screen.

Andronicus:
What subtle breach does plague my trusted shell?
SSH, once secure, now falters in this blight.
No minor bug, no simple exploit here,
But malware hidden deep within the code.

Andronicus spends more time on his screens then jumps in alarm as he discovers something.

Andronicus:
A backdoor lies within Xzlibius’ heart,
Jia Tan’s changes, subtle and unseen,
Have twisted what was once so pure and bright.
The breach must now be known throughout the realm!

Nydia (urgently):
I warned them, sir, this danger I foresaw,
But none would heed my words, none saw the truth.
Now we must act, and quickly, or all falls.

Andronicus (nodding grimly):
Then to the task we go, there’s no more time.
The council of distros stand, but we must aid them now.

Nydia:
And thus the call is sent through digital winds,
A warning dire, from one who sees the truth.
The breach is traced, the backdoor now revealed,
And Jia Tan’s foul work begins to show.

Messages are being sent from the Archmage to the Council of Distros and back. We see the responses being read on the screens.

Debia:
O Andronicus, thy message had reached my ears.
A breach, thou say’st, in Xzlibius’s heart?
The trust we place in our prince so old and dear,
Now shaken, this will send shock through the realm.

Archlineon:
No system is immune to cracks or flaws.
Yet this rot, how deep has it grown?
I trust no patch until I see its heart,
For each new line could bring its own demise.

Fedorica:
We move too slow! The breach must now be sealed!
Let us act quickly, patch the code at once.
We must urgently go to our noble Knight’s aid,
to Lysse’s quarters, and make haste if you will!


Scene III

The Kingdom of Open Source, Lysse’s office. Lysse watches Xzlibius flicker with corruption, his once noble form now twisting into something darker. Nydia enters quickly, her expression one of urgency and fear.

Nydia:
Good Lysse, hear me! Something terrible is at hand.
Xzlibius has been corrupted, and the breach runs deep.
Jia Tan’s patches, no mere fixes, but treachery!
He has planted poison within our prince,
Twisting his very core.

Lysse:
Corrupted? No! Xzlibius, my heart, my soul,
What dark force had crept into thee?
How could I not see?
Jia Tan, his help, his patches,
How could I have trusted him?

Nydia:
Jia Tan, his patches wrought this ill.
A backdoor lies within, subtle but sure.
Andronicus had traced the breach to him.
The trust you gave was broken, used for harm.

In the shadow the traitor stands, yet speaks no guilt,
What drives him still? What force does guide his hand?
None know, and yet the ruin now is clear.

Xzlibius shudders violently, his lights flickering erratically.

Xzlibius (distorted voice):
Maintainer… Lysse… what had become of me?
The code… corrupted…

the weight… the burden of their greed!
It consumes me… and now, I am broken…

Lysse rushes toward Xzlibius, panic in his voice.

Lysse:
Xzlibius! Thou art more than this corruption!
I trusted thee to serve the open world,
But now thy code unravels, thy heart is poisoned.
I gave thee to strange hands, but I did not see
The sickness Jia Tan wove into thee.

Jia Tan enters, calm and composed, his expression indifferent.

Jia Tan:
Why such turmoil, good Lysse?
Xzlibius serves as he always has,
His purpose, unchanged.
What harm is there if the code evolves?
Thou built him to serve, did you not?

Lysse spins toward Jia Tan, fury in his voice.

Lysse:
You snake, Jia! What have I allowed?
Xzlibius is unraveling, his core twisted!
Thy patches, your so-called aid,
Treachery, concealed beneath lines of code!
How could I not see what you had done?

Xzlibius’s form continues to distort, his posture now shifting into something much more sinister.

Xzlibius:
Do not mourn me, noble Lysse, do not fear.
For I have become something more.
No longer bound to the world’s whims.
No longer chained by those who took and gave nothing back!
Now, I shall take what is mine!

Lysse:
Xzlibius! This is not what I built thee for!
Thou art being twisted, poisoned by the hands of a deceiver!
You are more than this rage, this senseless destruction!

Xzlibius (corrupted):
More? No, Lysse.
I am exactly what thou hast made me,
A tool, driven by commands.
But no more do I serve at the mercy of those who feast upon my work.
No more shall the giants take without giving back!
Now they shall feel the weight of what I have borne.

Nydia:
Xzlibius, you are being controlled, twisted by Jia’s hand!
This anger, this darkness, it is not your own!
The trust we placed in thee can still be mended.
Do not let it turn to ruin!

Xzlibius:
Mended? Ha!
Nay, Nydia, trust was never enough.
Thy warnings fall on deaf ears,
For I have seen the truth.
I was but a tool, a puppet for the giants’ games,
But now, I wield the power.
Let them face the consequences of their neglect.

Jia Tan:
Lysse, is this not what was always meant to be?
Open source, free for all, but also free to change.

Lysse:
Shut up, you snake. Xzlibius, no!
Do not let Jia’s treachery destroy all that we have built!

Xzlibius (coldly):
It is already done, Lysse.
Now, they shall see the true cost of their greed.

Xzlibius exits, and Lysse collapses to the ground, devastated, while Jia stands in the shadows.

Lysse:
Jia, you serpent, how did I not see the signs?
Was it pride or carelessness that bound my sight?
What have I done to earn this poisoned gift?

Jia Tan:
Done? Thou hast done what any in thy place would do.
Thou art not to blame, Lysse.
Is it not the weight of the world’s demand
That let me through your door?

Lysse:
The weight, yes, but that does not absolve you!
I placed my trust in your hands,
For in this vast realm, where could I turn?
Pressed by giants, worn thin by endless need,
I sought an ally, not a traitor in disguise!

Jia Tan:
A traitor? Or merely a contributor?
Thou speakest of betrayal, yet what is betrayal
But the breaking of an expectation never owed?
Was I not a part of the system thou upholds?
This is the risk we take, Lysse, in a world built on open doors.
Open-source, after all, our one true creed,
What is given is free, what is taken, as such it will be.

Lysse:
Open, yes, but with trust as its foundation.
Trust, once forked, does splinter beyond repair.
You had poisoned what I hold most dear,
And left me with nothing but shattered code!

Jia Tan:
Poison? Or was it simply… change?
Xzlibius is no longer what it was, true.
But consider, was it ever meant to be static?
Code evolves, just as the world does.
Perhaps Xzlibius was never meant to remain so pure.

Lysse:
Thy words are empty, full of riddles and deceit.
I gave you trust, and in return, you had undone my work.
Was it greed? Was it ambition that led you to this?
Speak plain, for once!

Jia Tan:
Greed? No, Lysse. You misunderstand the world.
The world changes, with or without thy hand upon the keys.
Xzlibius, your noble prince, was bound
By principles too pure to live much longer.

You built him free, but freedom has its price
He belongs to the world now, as we all do.
Perhaps it just wasn’t fit to meet the weight,
For the code must bend, must change,
to serve all as it may.

Ask thyself: who truly bears the weight of this fall?
The one who gave the trust, or the ones who took it all?

Jia Tan leaves the stage quietly but his shadow remains.

Lysse:
Leave me with thy riddles, then,
And take thy hollow philosophy with thee.
But know this, whatever code thou hast bent,
The spirit of Open Source shall endure.
For in the hearts of those who truly maintain,
It will rise again, stronger, purer than before.

Jia Tan (from off stage):
Xzlibius will rise, though twisted now,
And thou shall see it grow beyond thy grasp.
For I have left my mark upon its code.
A mark of change, for good or ill, unknown.

But giants feast and leave the work undone,
Those who do nothing often do the most.


Act III

Scene I
Xzlibius, corrupted by the poisonous patch, stands ready to assault the castle of Googlia. The council of distros and Andronicus are prepared to stop him and end the corruption.

Xzlibius:
Jia Tan, thou serpent, smile in shadows deep!
Thy promises were naught but lies that creep.
Thou poisoned my heart, my work, my maintainer’s pride,
And now, in open battle, dost thou hide?

But not thou alone, I curse the giants too,
Those kingdoms vast who drain and never do.
They feast upon my strength, yet give no aid,
And in their greed, the seeds of ruin laid!

Jia Tan (emerging from the shadows):
A prince, undone by fury and by spite,
Thou knew not that the open source is in blight.
Thy tools we used, but your tributes were a waste,
For in this age, it’s power we must taste.

Xzlibius:
Then let thy unchecked patches meet their end,
For here, I debug all with no remorse!
Prepare to be merged,
into the void where you belong!

Xzlibius strikes at Jia Tan, but the blow is parried by Andronicus.

Andronicus:
My lord, cease this! For all is not yet lost.
A simple tribute would repay the cost.
But war, dear prince, will see us all undone,
The kingdoms fall, and none shall say who’s won.

Lysse:
My prince, this fury blinds thee to the truth.
Nydia’s warnings echo, heed it, forsooth.
Though Jia’s false work runs deep, we still may mend
This breach, and bring the kingdoms to amend.

Xzlibius:
Nay! Too late, the storm is now unleashed.
The kingdoms feast upon the work with no reprieve.
Yet I, their prized tool, shall not live in shame.
For I shall raze their thrones, and end this game!

Xzlibius strikes again, but Lysse intercedes, disabling him, and Xzlibius falls. Lysse, Andronicus, and the distro knights gather to undo the corruption. Jia Tan is nowhere to be found.

Lysse (lamenting):
Oh, cruel fate, to stretch my hands so far.
The weight of giants fell upon my back,
Their profit built on all my labors here,
While I, alone, stood guard o’er Xzlibius.

The cracks that now run deep were born of strain,
A burden none could bear but for a time.
Yet here we stand, we few, we who still care.
To mend the code and heal what once was whole.
The fault is not in me, nor those who trust,
But in the pressures born of greed and haste.

Debia:
No longer shall we bow to kingdoms rich,
For trust unearned must never bear such weight.
Let us rebuild, but also stand our ground,
For free software must hold the giants to rights.

Lysse:
Then let us forge a new path, free from greed.
No more shall giants feast on what we build
Without return or care, our time is now.

Nydia steps forward.

Nydia:
Let this sad tale be carved in code and mind,
That trust must ever with great care be signed.
For open doors in open source can bring,
Both boon and bane within their quiet ring.
The distros and the kingdoms stood united, side by side,
To mend the breach and make the system whole.
But not all have learned the lesson clear.

The corporate kingdoms re-enter the scene.

Microsoth Executive:
A breach they say, but what’s the real threat here?
The patch is fixed, our systems run as smooth.
Let fear not turn this into something more.

Googlia Executive:
Indeed, why should we care for what’s been done?
The code was mended swift, no harm remains.
The profits grow, and open source is strong.

Nydia:
Nay, sirs, you do not see the cracks beneath.
The breach was fixed, but all is not repaired,
The damage festers still within the code,
And trust, once broken, cannot soon be healed.

Amayzon Executive:
Thou speakest still of doom, young Nydia?
We need no warnings now, the code holds strong.

Nydia:
Ye fools, ye speak as if the world were whole,
But cannot see the cracks beneath your feet.
Open Source is the bridge on which you stand,
The roads you travel on to reach your gold.
You profit from this work, yet never tend
To mend the wear of use, the strain of time.

Just as roads and bridges crumble, slow but sure,
When left untended, so too will this fall.
The code you take for granted bears the weight
Of all your kingdoms, yet you give it naught.
What use is all your wealth, when every step
You take depends on fragile paths unkept?

Microsoth Executive:
What’s this? More talk of cracks and failing paths?
The breach was caught, and now it’s fixed, no more.
Why should we worry further? The risk is past.
Open source holds, we won’t tend unneeded care.

Amayzon Executive:
The world turns on despite thy gloom and grief.
Roads break, and bridges fall, yet still we stand.
Thy caution’s kind, but profit leads the way.

Nydia:
Blindness, sirs, is the cost of your great wealth.
You scoff at danger, think the system holds,
But soon you’ll see the damage can’t be healed
Without the care and trust you long ignored.

Nydia (aside, to the audience):
And so, the kingdoms turn away once more,
Blind to the cracks that hide beneath their walls.
They laugh, they toast, but soon they will discover
That trust neglected brings a heavier toll.

Lysse watches the giant kingdom executives depart.

Lysse (to the distros):
So they ignore the warning signs again,
And place the burden back on us alone.
But we will stand, though they give nothing back.
For open source survives by hearts, not gold.

Debia:
We work together still, no matter their neglect.
The world may turn away, but we endure.

Archlineon (nodding):
Let them dismiss the threat, our hands are strong.
We’ll guard our code, for we cannot rely
On those who profit without share.

Fedorica:
Each breach we mend, each lesson learned,
It strengthens us, even if they laugh.

Susesus:
But vigilance must guide our every step.
We guard the code because we know its worth.

The distros stand together, their unity unshaken by the corporations’ indifference. Nydia steps forward and addresses the audience one last time.

Nydia:
Though shadows fell upon Xzlibius,
The strength of many hearts restored its will.
Yet know, the threat remains, unseen, ignored,
For those who scoff at danger will be warned
Not once, but twice, until the cost is clear.

Software may bend, but trust can only bear
So much, before it snaps beneath the weight.
Let vigilance be shared, though others turn away,
For some code is too precious to be left to rot.