Find Volume 1 of Axiom of the Futures on Amazon.
Chapter 1
The simulation was unraveling at the edges, not like a simple display bug, but like a synapse losing its coherence. On the observation deck's horizon, the reinforcement lines gradually lost resolution and the metallic textures broke apart into unstable grains; the slow disintegration resembled the fatigue of human memory more than the erosion of poorly written code.
Eva Rostova stood at the center of the Cassini's bridge with a precision of bearing that betrayed less relaxation than a discipline that had become reflexive. The environment was not a passive retrieval of archives, but a fragile compromise between what remained in her cortex and what distant systems continuously recalculated to match her lingering memories. With each change in her breathing rhythm, the algorithm adjusted the density of the ionized fog beyond the window; with each micro-movement of her eyeballs, it refined or simplified details on the dashboard. The scene did not exist outside of her; it only held together because her memory traces accepted that these stimuli referred to something that had taken place.
In reality, the bridge of the Cassini had been pierced by a shower of cosmic debris. The composite hull was still adrift somewhere in the Kuiper Belt, logged as just another data point in orbital risk models. No one had deemed it necessary to tow the wreckage. There was nothing to salvage but stories, and the Reconciliation preferred stories compressed into formats its analysis engines could exploit.
Here, the light didn't come from bulbs but from parameters. The carbon fiber beams cast geometric shadows on the floor, their excessive sharpness betraying the hand of the rendering engine. Beyond the transparent partitions, the Eagle Nebula blazed with an intensity precisely measured in lux, an aesthetic recycling of photons launched six thousand five hundred years earlier and transformed into therapeutic tools.
Eva inhaled slowly, more to test the stability of the scene than out of any real need for oxygen. The simulation followed the movement, barely amplifying the distant vibration of the nonexistent thrusters. Her discreet HUD indicated correct synchronization between thalamocortical activity and the incoming data stream. No drift, not yet.
Kael Okonkwo was standing next to her.
The posture the simulation attributed to him was the same as in the last video sequences before the destruction of the Cassini: arms crossed, center of gravity very slightly on his left leg, gaze directed toward the nebula rather than toward her. The haircut, the tension in his shoulders, the way the suit creased at his hips: everything stemmed from a mixture of images, mission logs, and statistics gleaned from six hundred hours of neural recordings he himself had encrypted. At the time of this scene, he was still one of them. Today, his name was filed under Apostasy, in records his former colleagues consulted in hushed tones.
This thing wasn't Kael. She knew it with almost physical precision. She sensed, in the fluidity of its movements, the typical flavor of behavioral extrapolation models: overly smooth transitions between micro-expressions, a perfectly stable latency between stimulus and response. Yet, the approximation remained good enough for her limbic system to react, in the background, as if the person standing to her right still had a pulse.
The archive he left behind contained a single instruction: Don't open until you're ready. She had never felt ready; the Reconciliation had been ready for her. Unlocking under mandate, a gradual injection into her expanded cortex, all sealed by a legal formula that served as a secular blessing: retroactive consent. The session itself already carried its identifier, its biometric signature, and a sharing flag: later, it could be consulted by an audit unit that would read her reactions as additional parameters in a risk model.
«"Do you remember what you said here?" Kael asked.
The intonation matched the audio transcription exactly, right down to the slight hesitation on the "you." She already knew the phrase, the melodic line, the delay of a few hundred milliseconds between the breath and the first syllable. Knowing it didn't diminish the impact.
She glanced away from the glass for a moment to observe him in profile. The simulation reacted to this simple gesture by refining the resolution of his face: the wrinkles at the corners of his eyes became more pronounced, and a tiny spot of depigmentation reappeared on his right temple, a detail he had never mentioned to anyone and which betrayed the depth of the initial scan.
«"I remember what You "You said," she replied.
Her voice remained controlled. Her training as a Reconciler had never protected her from this type of reaction; it had only trained her to treat them as parameters to be integrated rather than as orders.
Kael slowly turned his head towards her, as if testing the amplitude of the movement again. The statistical model was still evaluating how he should divide his attention between the surroundings and his interlocutor.
«We arranged to meet here because you wanted a clear view of this area,» he said, pointing to the star cluster at the center of the nebula. «You said it was the right place to remember why we were doing this work.»
He paused, searching for his words as he did when he avoided falling into jargon.
«We had the feeling that we were looking as much ahead as outwards,» he continued. «What we saw was not just gas and rocks, but everything that could emerge from it, if no one came to tell the story in place of the inhabitants.»
Eva let her gaze drift over the bright dust clouds. The simulation introduced a very slight flicker, corresponding to an instability in the initial data; the engine preferred to interpret this gap as a micro-variation in density rather than as a simple information hole.
«You were mainly saying that these worlds should be considered as full-fledged stakeholders,» she added. «Even if no one, for the moment, had yet forged a language or treaty for them.»
At that time, they were two Reconcilers in sync. Their role was to intervene when several groups claimed incompatible rights to the same environment, whether it involved human colonies, AIs developing reflexivity, ambiguous indigenous biotopes, or, more rarely, civilizations whose existence remained purely modeled. They came neither to impose abstract neutrality nor to sanctify places in the name of an empty principle; they tried to forge compromises where none of the groups was definitively reduced to the status of a tool or scenery.
«We concluded that a world capable of one day harboring consciousness should be invited to the table, even if its chair remained empty for now,» Eva said. «We treated it neither as a neutral deposit nor as a forbidden temple. We sought agreements that would preserve at least some leeway so that its future inhabitants could one day challenge what had been decided today.»
Kael nodded slowly.
«At the time, it seemed almost optimistic to me,» he replied. «We assumed that we could always find configurations where no one would be locked into a single role for life. A colony could adapt its expansion, an AI could revise its function, a local species could keep its own development space. There was room for negotiation.»
He gave a short, cold smile. The simulation hesitated over the exact duration of this smile, then opted for a rapid disappearance, as if it had detected Eva's discomfort.
«"Then I started working with Thorne," he added.
The name imposed itself between them with the density of a massive object. Even in this reconstruction, the algorithms discreetly adjusted the ambient brightness, as if simply pronouncing this surname implied a change of context.
Thorne. A biologist by training, then a systems architect, initially tasked with limiting certain risks of technological runaway, then gradually convinced that the only way to avoid forms of totalizing control on a cosmic scale was to cut certain branches of the tree before they could ever bear fruit.
«"He said that our method "It assumed a luxury that the universe wasn't going to allow us for very long," Kael continued. "As long as civilizations, AIs, and genetic lineages remained limited in number, we could afford to tinker with compromises. But as soon as you take into account what could exist, everything that can be simulated, everything that can be copied, derived, and statistically mutilated, the number of stakeholders explodes."»
Eva listened as much with her ears as with her implants. The Reconciliation filters monitored turns of phrase, argumentative inflections, tracking the ideological tipping points that had already contaminated other agents before her.
«If you consider that these virtual or potential entities matter almost as much as those we can touch today,» Kael continued, «then the slightest local decision takes on disproportionate weight. Allowing a technology to deploy without constraint means accepting that, somewhere along the way, we are creating social architectures that will confine entire populations to repetitive tasks, closed castes, roles with no way out. Sometimes in digital environments designed to optimize obedience or performance, without any possibility of recourse.»
The word suffering did not pass his lips, but the texture of what he described was enough. He spoke of systems of control that did not present themselves as such, of management programs that transformed lives into adjustable variables, of entire worlds confined to narrow functions in the name of stability or efficiency.
«Our Risk-S scenarios already showed this kind of thing,» Eva recalled. «Entire civilizations recycled into a workforce, consciousnesses copied and replicated to test extreme public policies, genetically engineered bloodlines to occupy a specific economic niche. We didn’t need Thorne to tell us that these abuses existed.»
She remembered the training sessions where, for hours on end, they projected simulations of societies trapped in near-perpetual loops: administrative AIs that rebooted entire populations with every statistical deviation, work environments without aging or functional death, micro-civilizations designed solely as testbeds for control systems. No one laughed, no one really made decisions. Yet, no one shouted either; most of the entities were simply occupied, used, directed.
Evil did not need hatred to thrive; well-ordered mechanisms were sufficient.
«The difference is that our scenarios served to establish safeguards,» she continued. «We used them to identify red lines that shouldn’t be crossed, and to design compromises where the systems remained reversible. We were looking for configurations that still left room for escape, at least for some actors.»
«Thorne believed that these escape routes would be closed by others if we didn’t act more radically,» Kael replied. «He said that if we truly took seriously the fact that these risks affected astronomical numbers of potential lives, then our caution became a form of cowardice. Refusing to warn was delegating the worst to less scrupulous hands.»
Eva felt a tightening in her jaw, which her implant immediately translated into a soft alert in her peripheral vision. She breathed more deeply, not to calm herself, but to recalibrate her vital signs and prevent the simulation from interpreting this change as an implicit request for modifications.
«You never advocated for sterilizing worlds,» she said, mentally ordering the system to temporarily freeze the micro-adaptations of the environment. «Not back then. We were talking about regulations, treaties, monitoring systems, usage restrictions. We never endorsed a doctrine that erases biospheres or potential cultures because models say that one day, somewhere, they might be diverted.»
«Not at that time,» Kael admitted.
His silhouette showed a very slight phase shift, as if several sets of recordings were momentarily overlapping. The engine corrected the anomaly by injecting a blink, a human gesture meant to mask the digital interpolation.
«That’s why I didn’t denounce Thorne right away,» he continued. «I still thought we could bring him back to a more demanding version of our own doctrine. That we could find a balance between prevention and minimal respect for what these worlds wanted to become on their own.»
The Eva who had known him alive knew that he had truly believed that. The Eva who knew him dead knew where that belief had led.
She didn't allow herself to be drawn into the images of the Incident. The Reconciliation had compartmentalized the memories related to the destruction of the Browning-b colony, and access remained subject to locks that even this simulation couldn't breach without triggering a cascade of alerts. She simply used neutral language.
«And yet, you left him with the means to justify what he’s doing today,» she said. «You gave him a language, models, risk maps he can brandish to explain why certain worlds must be severed before they can even speak.»
Kael lowered his head, or rather the simulation produced this gesture by drawing on correlations between shame, regret, and body posture from thousands of other profiles. The effect remained convincing.
«I thought it best if I were the one to help him frame the models,» he replied. «I figured that by staying in the loop, I could slow things down, make corrections, set conditions. I calculated that the absence of my voice would produce something worse.»
Part of her wanted to reply that this kind of reasoning was precisely the slippery slope that Reconciliation taught them to recognize as a sign of drift: the idea that one could support a dangerous project from within to limit the damage, while, in reality, one was providing it with structures of legitimation. Another part remembered that, at the time, the line between these two attitudes never appeared so clear-cut.
«You still believed there was room for Reconciliation at the very heart of his program,» she said, carefully choosing each word. «That he would agree to treat these potential worlds, these nascent AIs, these future simulations as stakeholders among others, and not as the only ones that matter.»
«Yes,» he replied simply.
She remained silent for a few seconds. The simulation interpreted this silence as a moment of reflection and decided not to fill it with additional visual effects. The faint crackling at the edges of the bridge returned to normal intensity, a sign that the processors were already adjusting resource allocation.
«"This package you left," she finally continued, "what exactly does it contain?"»
She knew the answer; the mission mandate detailed it in technical terms: high-resolution neural recordings, dense mental diaries, conversational segments with Thorne, cosmic hazard models, possible counter-arguments. Yet, posing the question forced the simulation to unfold a narrative version of this data.
«"Everything I didn't have time to tell you," Kael replied. "My calculations, my doubts, the times I almost went over to her side, the reasons why I distanced myself too late. And what I understood afterward."»
After his death. After the Incident. After his neural patterns had been extracted, compressed, signed, and then sealed in a container that the Reconciliation had kept as a kind of moral assurance.
A subtle shiver ran through the superficial layers of Eva's cortex. She perceived it less as an emotion than as a variation in the latency of synchronicity between her internal activity and the stimuli of the simulation. A few pixels flickered briefly on the console to her left before the system corrected the lag.
Cognitive weapons sometimes acted this way: no scream, no painful flash, only a slight desynchronization that, if prolonged, eventually made agents doubt the reliability of their own perceptions. Here, nothing resembled an attack, but her body remembered the sensation.
She reached for the control panel floating superimposed in front of the glass. An icon pulsed slowly, a visual representation of the presence of Kael's package deep within her. A simple mental gesture would be enough to trigger the complete unlocking sequence, to open the floodgates of six hundred hours of continuous thought, with all its blind spots, contortions, and digressions.
Her hand hovered halfway. The muscles in her shoulder began to strain; the biometric interface gave her a discreet warning about prolonged static tension. She registered the signal, classified it as irrelevant to the current decision, and let it dissipate.
The dilemma was anything but abstract. Opening the package now meant accepting unfiltered exposure to the lines of reasoning Thorne had used on Kael, to the justifications that had led him to consider certain preventative sterilizations as almost inevitable solutions. Not opening it meant choosing to work with partial models, hoping that her own tools would suffice to anticipate the moves of a bioengineer convinced he could save billions of future trajectories by cutting off a few today.
Her work as a Reconciler had always taught her to seek configurations that created space for each side. Here, the sides overflowed the usual categories: inhabited worlds, emerging colonies, governing AI, potentially precognitive species, simulations never launched, genetic descendants that might never be born if Thorne got his way. Everything claimed to carry weight, including the futures that, in case of error, would transform into systems of irreversible control.
«"You want me to open it now," she said, looking at Kael.
«"Above all, I want you not to repeat my mistakes," he replied.
The sentence came from a segment recorded shortly before his last mission, recycled here out of context. The simulation had no material in which he could have actually commented, after the fact, on how she should use what he left behind. So it simply recombined what already existed. The illusion remained almost complete, but the "almost" mattered.
She slowly lowered her hand, letting her arm follow a controlled trajectory so as not to give the system the opportunity to interpret this withdrawal as a sign of impulsive resignation. She would not grant access today. Nor would she rule it out.
«"Not yet," she said.
The syllables came out clear, without a tremor. Her heart rate remained in the green zone. She simply felt, deep in her sternum, that particular density the world takes on when a decision resolves nothing but redistributes the burden.
With a brief mental command, she ended the session. The bridge of the Cassini dissolved without theatrical effect, like an equation erased line by line. The Eagle Nebula lost its depth, the shadows disconnected from their sources, the residual sounds folded back on themselves.
When the darkness lifted, she found herself in the interface bay of Reconciliation Station 7-Orion, strapped into the ergonomic chair that had restricted her movements since the beginning of the session. The air carried the sterile scent of recently changed filters, mingled with a metallic undertone from the cooling systems. The walls exuded the aggressive neutrality of administrative spaces where decisions of interstellar scope were made as casually as others approved forms.
A synthetic assistant projected the session summary: durations, parameters, stability indices, and a preliminary assessment of the risk of ideological contamination. The figures seemed acceptable. They revealed nothing truly useful about what would happen next. At the bottom of the table, a line already indicated the data's destination: priority archiving, possible consultation by the Risk-S unit, and review by a Concorde-level alignment AI. Her session would be dissected, weighted, and integrated into matrices where her name, like Kael's, would become a mere coefficient.
The Kael package remained marked as on hold. The mission, however, was not.
Eva detached the interfaces from her temples, felt the micro-suction of the connectors release, then stood up with the methodical care of a professional who knows her own body is part of the equation. She hadn't sided with Thorne, nor against the futures he claimed to be the only one capable of preserving. She had simply agreed to bear the weight of this problem a little longer, without delegating it entirely to a model, a dead person, or a machine.
Some risks were not resolved. They were carried. She left the bay in silence, with the distinct feeling that the real negotiation had not yet begun.