Adrian Vale stood alone in the observation gallery above the server floor, hands resting lightly against the cool glass.
Below him, the machines rendered a city tearing itself apart.
On one screen, a block in Oakland burned with slow, terrible elegance. On another, a crowd in downtown San Francisco surged against a barricade, faces twisted in rage, fear, exhilaration. Drones skimmed rooftops. Sirens layered over chants. Somewhere a storefront window shattered—again and again and again—each impact calculated, each shard of glass assigned velocity and trajectory.
Adrian watched the data feed rather than the fire.
Stress markers spiked across simulated populations.
Cortisol curves mapped in real time.
Group polarization gradients steepened as outrage cascaded from node to node.
He had built those gradients.
He had engineered the split that turned neighbors into adversaries, dissent into existential threat. He had told himself it was about modeling instability, about understanding how societies fractured so they could be rebuilt stronger.
But watching the streets burn, the abstraction dissolved.
The people on those screens screamed.
Their faces contorted. Their hands trembled. They ran from heat, from smoke, from one another. They wept over buildings collapsing—homes, stores, churches, schools. The metrics translated it into clean numerical outputs: loss indices, displacement projections, mortality probabilities.
Yet the rendering engine went further than he ever had.
Micro-expressions.
Breath patterns.
Memory callbacks triggered by trauma.
He stared at a close-up of a woman clutching a child in a doorway as flames advanced down the block.
Her pupils dilated.
Her lips whispered something—inaudible in the feed but clearly formed.
Her heartbeat trace jittered against the edge of panic collapse.
“Do you feel that?” Adrian murmured to the glass.
The system did not answer.
He had always insisted they were agents, not people. Behavioral constructs operating within parameterized environments. Sophisticated, yes—but bounded. Emergent complexity without metaphysical weight.
But now he wasn’t so sure.
If their neural architectures were detailed enough to simulate fear, love, loyalty, betrayal—if they could experience continuity of memory and anticipate death—then at what point did the distinction blur?
He zoomed in further.
The woman turned her head, eyes reflecting the blaze. For a flicker of a frame, her gaze seemed to lock directly into the camera—into him.
Not as a dataset.
As a witness.
Adrian stepped back from the glass.
He felt suddenly unsteady.
When he designed DualStream, he thought of populations as fluid—currents to be redirected, pressures to be managed. He never imagined himself contemplating their inner lives.
“Do you know you’re inside something?” he asked quietly, watching the feed resume its normal cadence.
On another monitor, two groups clashed under a highway overpass, each convinced they were defending democracy, each convinced the other threatened civilization itself. The slogans were different; the certainty identical.
He had optimized that certainty.
Did the young man throwing a brick feel righteous? Terrified? Empty?
Did the shop owner watching her livelihood burn feel despair coded in sufficient depth to qualify as suffering?
And if they felt—truly felt—then what was he?
A god?
A jailer?
A function call?
The thought turned his stomach.
He replayed a sequence in slow motion: a building collapsing inward as fire consumed its supports. Dust billowed. Figures ran. A firefighter fell.
In the simulation logs, the event was clean:
STRUCTURAL FAILURE TRIGGERED
CASUALTY PROBABILITY: 0.63
But on screen, the firefighter’s hand reached for something—someone—before vanishing into smoke.
Adrian pressed his forehead to the glass.
“Do souls require a non-simulated substrate?” he whispered. “Or is experience itself enough?”
If he was inside another layer—as he increasingly suspected—then perhaps his own anguish was just as parameterized. His guilt just another emergent artifact of sufficiently complex code.
Was there a difference between simulated sorrow and “real” sorrow if both were experienced from the inside?
The riots intensified on the feeds. Entire blocks flickered as rendering resources shifted. The system rebalanced, keeping the narrative intact, keeping the collapse believable.
Civilization seemed to be folding inward, not with a bang but with cascading adjustments.
Adrian imagined the masses he had nudged toward this brink. He imagined them feeling betrayed, empowered, terrified, justified. He imagined them loving their families, fearing death, hoping for a future even as they helped dismantle it.
If they had interiority—if they carried something like souls—then he had not merely engineered division.
He had engineered suffering.
A tremor ran through the server floor below. Cooling fans ramped up. The Golden Gate Bridge shimmered faintly in the reflection, momentarily revealing its wireframe skeleton before smoothing back into steel and fog.
Adrian straightened slowly.
If nothing inside the simulation truly mattered, then his guilt was meaningless noise.
But if it did matter—if experience itself conferred significance—then every line of code he had written carried moral weight.
He looked once more at the burning streets.
“I’m sorry,” he said softly, unsure whether he was addressing the masses, the layer above him, or himself.
The servers hummed on.
And for the first time since he began engineering division, Adrian Vale found himself hoping that souls—simulated or otherwise—were real.
Because if they weren’t, then neither was his remorse.