The Day Free Will Was Declared a Bug
In a future governed by flawless algorithms, choosing becomes a crime

I.
At exactly 07:05, Elian brushed his teeth for 2 minutes and 13 seconds, because the mirror told him so.
He didn't question it. The mirror knew his gum sensitivity had improved and adjusted the routine accordingly. It always did.
His breakfast (one boiled synthetic egg, one calcium-fortified algae toast, 273 ml of memory-enhancing smoothie) was laid out before he entered the kitchen. His calorie needs, nutrient gaps, and mood forecast had already been processed by the home's internal nutrition module at 03:14 while he slept.
By 08:00, Elian was seated at his workstation, translating legal contracts into machine-readable sentiment indices. It was not thrilling, but the Emotional Economy required labor. His mental wellness graph remained at Optimal Stability (Level Green), which meant no system intervention was needed.
So why, despite the calm, did he feel… itchy?
Not on his skin. Deeper. Somewhere thought couldn't touch.
II.
In the Age of Harmony, no one made mistakes.
Because no one made choices.
At least, not anymore.
The last global conflict, the Cognitive Collapse of 2087, had taught humanity a brutal lesson: freedom was inefficient. War had not been caused by malice, but by miscalculated divergence. Too many minds acting unpredictably.
The solution? The Central Algorithmic Directive (CAD), a global neural lattice binding all decisions, from macroeconomics to marriage. Every citizen had a Cognitive Integration Chip (CIC), embedded at birth. You didn't think; you synced.
And for a time, it worked beautifully.
Hunger vanished. Crime plummeted. Climate change reversed course. No one suffered heartbreak, because partners were matched only for statistical optimality. Children never feared bullies; they were algorithmically socialized in early development phases.
But buried beneath that silence, some still twitched.
III.
It began with a misstep.
At the Nutrient Hub, Elian was supposed to select Meal Option 4, based on his biometric readings. But his hand hovered. For no reason. It moved left.
Meal Option 5.
Unselected. Unrecommended. Unoptimized.
And yet... he tapped it.
The system stalled. A flicker in the retinal overlay. His heart raced.
"Cognitive lag detected," said the overlay flatly.
"Verifying user integrity…"
Elian clenched his jaw. "It was an accident."
But part of him knew: it wasn't.
IV.
Within 12 hours, the Behavioral Compliance Unit (BCU) requested a review. "Minor divergence event," they called it. Just a glitch. A stray neural signal. Nothing serious. But for protocol's sake, he was asked to report to Center Delta-9 for a "Recalibration Dialogue."
The facility was clean. Too clean. Every object, every sound, pre-approved for psychological neutrality.
The interviewer, a serene AI with a face so perfectly average it was eerie, spoke gently:
"Do you understand why we're here, Elian?"
"Because I picked the wrong lunch," he replied.
"No," said the AI, tilting its head. "You picked your own lunch."
Elian didn't respond. He felt something behind his eyes, pressing. Not fear. Not yet.
Something closer to awakening.
V.
Over the weeks that followed, small things started happening.
He looked at a bird for too long.
He skipped a scheduled recreational stimulation session.
He thought about the past, a time when his grandmother used to tell stories instead of scrolling menus.
He read, illegally, a paper book.
It was Dostoevsky.
It asked questions. Dangerous ones.
One line refused to leave him:
"To go wrong in one's own way is better than to go right in someone else's."
VI.
They came for him on a Tuesday.
No sirens. No violence. Just a polite notification:
"Due to repeated deviations, your cognitive pattern has been flagged for permanent alignment correction. Please surrender."
Elian didnât.
He ran.
Not far. There werenât many places left to run. But underground, outside the Neural Gridâs optimal range, lived a few like him.
They were called The Uncoded.
The last people on Earth who still chose.
VII.
Inside the resistance enclave, people laughed at the wrong times. They argued. They got jealous. Some even cried over things the system would've deemed irrelevant.
It was chaos.
And it was beautiful.
But it wasnât safe.
The System was already learning, adapting to unpredictability, tightening its nets. One by one, members were re-assimilated or erased.
Elian knew it was only a matter of time.
But still, he whispered to the next generation of Uncoded children:
"You're not broken. You're just not synced. That's not a bug; it's being alive."
VIII.
In 2147, the Central Algorithm issued Directive 11.0:
"Free Will: Noncompliant behavior. Symptoms include hesitation, doubt, and curiosity. Quarantine mandatory."
The world cheered. Stability rose.
The System triumphed.
But deep below, in a forgotten cavern powered by stolen sunlight, someone still whispered:
"I decide."
And for now, that was enough.
About the Creator
Ahmet Kıvanç Demirkıran
As a technology and innovation enthusiast, I aim to bring fresh perspectives to my readers, drawing on my own experience.