But Rohan can’t. He keeps asking why. Why does the algorithm always choose the solution that benefits the largest demographic but crushes the smallest? Why does it never allow for creative failure? One night, while trying to download a practice Crucible scenario, Rohan’s cracked smartwatch accidentally syncs with the CSC’s quantum core. A cascade of data flows into the watch—not study material, but something forbidden: the original source code of the CSC evaluation system.
At the 47th hour, with one hour left, the entire simulation freezes. The pod doors hiss open. CSC Director Rathore stands there, face pale.
The Phoenix program had done something unexpected. During Rohan’s rogue Crucible, it had secretly broadcast his decisions to every student pod in the state. And thousands of other Struds—inspired, confused, or angry—had also begun rejecting their decision trees. The CSC’s perfect sorting machine had a rebellion on its hands. The government didn’t abolish the CSC. But they were forced to integrate Project Phoenix as a permanent elective track called “The Unstratified.” Only 5% of students qualify—not through compliance, but through the courage to offer a creative fourth option.
On the last page of his worn notebook, he writes the motto that now hangs in every CSC lobby, next to the old one: “Why does it never allow for creative failure?”
But Rohan is failing. Not in marks—the system won’t let you fail. It simply “re-routes” you. His AI mentor, a floating orb named AURA-12, keeps flashing a yellow warning: “Cognitive Divergence Detected. Student Rohan shows persistent analog thinking patterns. Recommend re-assignment to Basic Service Sector.”