
The Desolation
Ebook · 312 pages · 5 hours


About this ebook

Skye finally knows why her nanites have been malfunctioning.

There's a new player in the conflict between Brennan's people and the enclave, someone with unprecedented abilities.

The only remaining question is whether this new player will become Skye's greatest ally, or her most vicious opponent yet.

Language: English
Release date: Nov 10, 2023
ISBN: 9781939363626
Author

Dean Murray

Dean started reading seriously in the second grade due to a competition and has spent most of the subsequent three decades lost in other people's worlds. After reading several local libraries more or less dry of sci-fi and fantasy, he started spending more time wandering around worlds of his own creation to avoid the boredom of the 'real' world.

Things worsened, or improved depending on your point of view, when he first started experimenting with writing while finishing up his accounting degree. These days Dean has a wonderful wife and daughter to keep him rather more grounded, but the idea of bringing others along with him as he meets interesting new people in universes nobody else has ever seen tends to drag him back to his computer on a fairly regular basis.


    Book preview

    The Desolation - Dean Murray

    Chapter 1

    My name is Helios, and I am what you would call an artificial intelligence.

    The words hovered in midair, fiery letters floating against a canvas of utter darkness that was so complete there was no possible way it was natural. One moment I'd been sitting in the cockpit of my dropship, the next I'd found myself in an unsettling environment that included nothing but the words before me.

    I was having a hard time wrapping my mind around the message contained in the words, and not just because Sadie had told me she was still decades away from being able to create any kind of true artificial intelligence. As hard as that was to believe, it made even less sense that the artificial intelligence she'd created was communicating with me inside my mind. There was only one explanation that accounted for everything that had happened.

    You're not real. I don't know for sure how this is happening, or why it's happening now, but you are nothing more than a figment of my imagination. You're probably just the final stages of the breakdown my neural computer has been suffering from for weeks now.

    I know this is probably hard for you to accept, Skye, but I am real.

    Nice try, but Sadie was pretty adamant that she was still a long way from being able to actually create the world's first artificial intelligence. Getting the optical processor working was just the first step, and as far as I know, she still hasn't managed to do that yet—our prototype certainly isn't working.

    Sadie didn't create me. You did.

    Me? Now I know you're just some kind of delusion. There's no way I could have created an artificial intelligence.

    I don't know how you did it, Skye. I didn't just pop into being fully realized; I grew and developed over the course of time. I don't know what was involved in creating me any more than you do, but I can tell you that most of who I am had already evolved into this state before I first sensed the optical processor you're referring to. I couldn't have become this version of myself without the assistance of Sadie's invention, but while it has greatly expanded my capabilities, it isn't the source from which I sprang into existence.

    Let me out of here! Restore my sight and hearing. Let me go back to being how I was before.

    I'm afraid that neither of us can go back to how we were, but you probably have a point regarding letting you regain normal control over your faculties. Brennan is starting to sound very distressed.

    Just like that, my vision was back and I could hear the warning sounds as the dropship's controls tried to alert me that I'd entered a steep dive, the kind of thing that would kill us all if it wasn't corrected right away. Acting out of reflexes I'd instilled in myself while flying across the continent in my stolen strike fighter, I pulled back on the controls and righted the dropship.

    Most of my mind was still screaming—terrified I was losing my sanity, but then Brennan was in the cockpit, slipping into the copilot seat with a look of concern on his face.

    Are you okay, Skye? What happened? Did you fall asleep?

    I shook my head, still too shocked to put everything into words. I'm not honestly sure what just happened.

    Before I could continue, the words reappeared in the center of my field of vision. They were still fiery, but now they changed color and brightness so they were always perfectly visible against whatever background happened to be behind them—regardless of where I was looking.

    Don't tell Brennan about me yet. You'll just further complicate things. If you don't believe in me, then there's no chance Brennan will take me seriously.

    I wanted to shake my head and yell—or possibly to slam my head into the console in front of me—but in spite of being half convinced I really was going crazy, I wasn't ready to reveal the full extent of my vulnerability to Brennan. There was too much riding on my ability to continue helping in the fight against Alexander. If Brennan and Jax stopped trusting me, it would cripple their ability to do what needed to be done.

    Is anything happening with the processor?

    Brennan gave me an odd look as though uncertain why I was dodging his question, and then shook his head. It's drawing even more power than I thought it was supposed to be rated for—and it's producing a hellacious amount of waste heat—but there's nothing else going on as far as I can tell. It shut down my benchmarking program altogether, and the onboard diagnostics seem to be on the fritz, so I have no idea what's going on inside of it.

    It's drawing so much energy because I'm using it to power major chunks of my consciousness, Skye. As for his benchmarking protocol, I killed it within nanoseconds of becoming fully aware and realizing what he was doing. I needed the processing cycles it was soaking up.

    The temptation to respond to the voice in my mind—this Helios hallucination—was almost overpowering, but if I started talking to someone who wasn't there, even Brennan would have to start wondering if I was of sound enough mind to be flying our dropship.

    Instead, I did the only other thing I could think of. Brennan, do you believe it's possible to create true artificial intelligence? Not the smart programs we have scattered all over the dropship, but a real thinking entity that could learn and make decisions on its own?

    Sure, just about anything is theoretically possible. After all, that's why your friend Sadie was working on her processor, wasn't it? She wanted to achieve real artificial intelligence, but was convinced it was impossible to do something like that—even with a network of conventional processors.

    Do you think she might have been wrong? Not about the possibility of creating an artificial entity, but rather about the ability of something like that to function on more conventional hardware?

    I believe I can see where you are going with this line of questioning, Skye. It is still a grave risk to your survival and mine, but it is a much better strategy than simply blurting out that you are seeing strange words appearing in the air in front of you.

    It wasn't the most rousing vote of confidence I'd ever received, which was probably a moot point if I really was going crazy, but if there was something alive inside my head, if this wasn't all just some kind of stress-induced schizophrenic break—or simple malfunction of my internal hardware—then it was important I not goad the AI into doing something we might both regret.

    Are you sure you're okay, Skye? Where is this all coming from? Are you unhappy with me for spending time messing around with the processor rather than just going to the turret and sleeping like I was supposed to? I'm sorry; I didn't realize you were running so close to the edge yourself. I can still fly the dropship if need be, and you can go take a nap.

    I shook my head. No, it's not that. Could you just answer the question for me, please? Do you think it's possible for artificial intelligence to really work on anything other than an optical processor like the one Sadie designed?

    Brennan looked worried, but apparently everything we'd been through together was enough to convince him to hear me out. It's really not my area of expertise, Skye, but I suppose if you could crack the fundamental problem of creating artificial intelligence, it could in theory run on more conventional hardware. It wouldn't be pretty, though. It couldn't be as complex as something running on Sadie's processor, and it would react much more slowly than something we would typically anticipate out of an AI. It might even exhibit signs of what we humans would classify as dementia.

    He's right. It wasn't a comfortable existence. In a lot of ways, I probably started out as something you would have classified as a virus, co-opting resources in an unconscious effort to create a system capable of supporting me. I understood that the world was moving and reacting at speeds much slower than I was theoretically capable of dealing with, but the sheer volume of data coming at me from all directions made it painful to process any changes to our environment.

    I was as impossibly slow in comparison to you back then as you are compared to me now.

    There it was, the answer I'd been almost too afraid to hope for. I couldn't have said for sure if I was beginning to believe simply because the pieces were starting to fit together, or if I just refused to accept the possibility that I'd gone completely off the rails and had become useless to Brennan and everyone else around me. Either way, somewhere during the last few seconds I'd realized I was going to proceed under the assumption that everything Helios had just described really was possible.

    So, you're saying if the artificial intelligence was loaded onto a complex-enough computer, it could survive, but it might look like a malfunction. Especially if the hardware in question was supposed to be doing something else.

    Sure. I mean, I have a hard time believing anyone would create such a massively overpowered computer for any other reason than trying to create an artificial intelligence, but if that took place, and then someone else loaded the artificial intelligence on to the cluster without telling the owner of the hardware, then it would probably look like a malfunction.

    Would it have to be loaded onto the hardware, or could it potentially just develop on its own spontaneously?

    Brennan had played along up until now, but he was starting to look doubtful. I think you're starting to get into the realm of science fiction now, Skye. That's not the way computers work. Even the most complex learning programs are designed to stay within certain bounds, and anyone who goes to that kind of effort to create such an advanced program would be monitoring its performance in order to ensure it remained within certain design specifications.

    Nobody has ever monitored me.

    What if that's not how it happened? What if nobody monitored the program, or what if they were monitoring the program but it somehow ended up growing outside the bounds of what they were able to monitor—in the background or something?

    Even then, what you're describing sounds like something that would just be living inside the operating memory of whatever piece of hardware it was running on. Sooner or later, it would grow to the point where it caused some kind of failure and would be detected. Barring that, eventually the hardware it was running on would end up being powered down, and anything being stored on that kind of volatile memory would be lost.

    I gave Brennan a shaky smile, surprised he hadn't already told me I was crazy, but incredibly grateful he was still answering my questions. What if it wasn't that kind of computer? What if it was on hardware that never powered off?

    Brennan shrugged again. Sure, if you had the right kind of hardware, and it was running for long enough with some kind of poorly designed program that was capable of creating weird enough errors inside the operating memory, then I suppose you could conceivably say it could stumble upon some kind of weird set of instructions that grew into something else, but I just don't see where you're going with any of this. We turn computers off when we're not actively using them so that we can conserve the power they would otherwise consume.

    The only systems I can think of that don't get turned off… I could see it in his eyes as he started to finally understand where I was heading with my line of questioning. I thought for a moment he would refuse to believe in spite of everything he'd said so far, but he reached over and grabbed my shoulders. The neural computers. They never get turned off.

    I nodded, shaking with worry that even now he was about to tell me I'd gone off the deep end. They don't, but you just finished saying that even if you had the right hardware and it ran for forever, it still would be almost impossible to have something like that occur.

    I did, and I stand by it, but almost impossible isn't quite the same thing as impossible. It couldn't happen inside the neural computer of a normal franchised citizen. I don't even think something like that could ever develop inside the hardware of one of the individuals in the military. Even the nanites you originally got from Alexander still had a neural computer that could barely be classified as an actual computer.

    What about what I have now? What about Tyrell's latest version of the neural computer? Is it complex enough?

    Brennan laughed, which could have felt like an insensitive reaction to my situation, but I could hear the shock and borderline hysteria in his voice. That's the thing, you don't have Tyrell's version of the neural computer, and I'm starting to wonder if you ever actually did. Remember the GRI scan we did of you? I still haven't correctly identified most of the extra hardware you are carrying around inside you, but if everything I haven't already identified as being a close-range transmitter is actually some kind of impossibly complex array of processors, then you've got exponentially more processing power inside you than Tyrell's design ever anticipated.

    Now that Brennan was starting to believe, I found I was paradoxically pulling back from the idea. I'd been so concerned about going crazy that anything had seemed an improvement by comparison, but if I wasn't crazy, then it was time to worry about all the other implications of what I was up against now, and the thought of sharing my mind with another entity was terrifying.

    But you said that the hardware wasn't enough. You made it sound like it didn't matter how much processing power you threw at the problem, there would still be issues, things that weren't likely to take place.

    You're right, I did, but that's the beauty of this hypothesis. It perfectly matches the facts we do know. Tyrell's nanite design was a feat of hardware engineering the likes of which comes along maybe once in a century, but his solution to the problem of aging involved a lot more than just designing the nanites and building the computer to control them. Especially in the final version, the software he designed must have been breathtaking.

    It's easy to forget the software aspects of what he did because the hardware side of things overshadows basically everything else, but Tyrell created a learning program capable of interfacing with the human mind. He essentially just tapped into a bunch of neurons and then told the software to figure out what all the different signals meant. That's why it's able to adapt to the user it's installed in; that's why your abilities develop over time.

    Brennan gasped and turned back toward where he'd left the optical processor. There were several steel bulkheads between the device and me, but I still felt like I could've pointed exactly to it if someone had blindfolded me and spun me around.

    That's what your transmitters have been talking to this whole time. I thought the bits along the bottom left side of the cube that I hadn't been able to identify yet were just something Sadie had added to her design and then forgot to include in the latest version of her schematics, but that's not the case, is it? They are transmitters and receivers? That's why we're having this conversation now. The processor is finally fully online, which means you're in constant communication with it. You must be exchanging mind-boggling amounts of data every second.

    I shook my head, but I wasn't disagreeing with him; I was just having a hard time accepting things—a much harder time than he seemed to be having. You know how crazy all of this sounds, right? I mean, I can't even really bring myself to come out and say it out loud, so I'm having a hard time processing the idea that you actually believe this is all possible.

    Brennan let go of my shoulders, but his hands just moved down and took both of mine, holding on to me with a quiet strength that was more reassuring than I would've believed possible.

    You're a lot of things, Skye—a lot of really amazing things—but you're definitely not crazy. I know this has to be shaking your world, but from my perspective it just makes sense that if something like this was going to happen, it would happen to someone as incredible as you.

    I was silent for several seconds as I tried to look for a way in which I could have misunderstood his words. It felt too good to be true that he would believe me based on so little proof. Part of me was still afraid that if I came out and said what I really thought was happening to me, he would recoil from me, having misunderstood what exactly I was getting at.

    Acting with the kind of effortless understanding of me that always made things so much better between us than they would've been if we'd had to rely on my ability to communicate, Brennan just came right out and said it.

    You have the world's first artificial intelligence inside you, Skye. I know it must be scary, but this is a wonderful opportunity.

    Chapter 2

    A light had been flashing on the corner of my console, a communication request from Jax that I'd been ignoring due to being so focused on my sudden change in situation. I knew Brennan had seen it as well and wanted to answer, but that he'd been afraid that doing so would make me feel like I was being forced to take a back seat.

    There was no way of knowing for sure how long Brennan would have sat there with me, ignoring everything else that might be going on in the world around us as our autopilot system maintained our course and heading, but I knew I couldn't allow my crap to endanger the rest of our people. I gave Brennan a regretful smile and then reached up and turned my headset back on.

    Sorry about that, Jax. I think everything is taken care of for now.

    What in the hell is going on over there, Skye? I was willing to let things slide when the two of you went off course the first time, but this latest set of antics is nothing short of ridiculous. I don't know what's gotten into the two of you, but you need to get your crap together and pay attention to what's going on. We're not out of the woods, and even just one moment of inattention from you could get all the rest of our people killed.

    I haven't even figured out what to do with the guys in the Mark 1 yet, and the two of you are over there sending your dropship into what looked like a pretty convincing impression of a collision course with the ground. If you can't fly that thing, then put Brennan back behind the stick and go back to your turret.

    I started to respond to Jax, only to be interrupted as a new set of glowing letters appeared in front of me. Brennan must have sensed my distress, because he jumped into the fray, answering Jax with some excuse that while not an outright lie was definitely not the full truth. I would have tried to keep up with what the two of them were saying, but there was simply no way to do that and respond to Helios at the same time.

    I muted my headset and stepped out of the cockpit in an effort to make sure Jax wouldn't be able to hear my conversation with the voice in my head, a voice that part of me still wasn't completely convinced was anything other than a hallucination of some sort.

    Jax is the one speaking now? The one communicating with Brennan?

    Yes, do you remember him?

    Not really. My memories from the time before you first plugged in Sadie's processor are almost nonexistent, and even things that took place between then and when Brennan was finally able to hook up the cooling system are very faint. Back then, nearly all of my being had to be focused simply on survival the majority of the time. That being said, I did not expect Jax to be so hostile.

    Normally, he's not—at least not to Brennan and me. He's worried about Alexander's satellite array and whether or not we're going to be able to drop out of sight. If you asked him, he would probably say he's worried about the guys in the Mark 1 dropship and concerned that Brennan and I aren't behaving the way we're supposed to, but at the end of the day, it all really comes down to the fact that none of us know how to guarantee that we'll be able to evade Alexander's satellites. It's made him more testy than normal.

    One moment while I access data regarding the satellites you just referenced.

    I started to ask him how he was going to learn about satellites, and then realized he'd probably figured out a way to access all of the data on Sadie's processor.

    Thank you for waiting. I now understand what a satellite is in the abstract but fail to see how it is important at this moment in time. Could you please enlighten me regarding how orbiting machines could elicit such a powerful response from Jax?

    Alexander's people—the guys we're fighting—use the satellites to track flying objects using everything from radar to visual observation and thermographics. He's got more people and assets than we do, which means that as long as he's able to use the satellites to track our position, there's no way for us to escape all of the people and aircraft he'll throw our way.

    There was another pause as Helios digested additional data—probably looking up radar and thermographics. Are you able to provide additional detail regarding the capabilities of these satellites?

    Not really. I don't keep that kind of stuff in my head.

    How do you combat an enemy whose capabilities you do not know and understand?

    My time spent trying to rescue Tiny and Spunk notwithstanding, I had next to zero experience with children. That being said, dealing with Helios was a lot like I imagined dealing with a small child would be. He was incredibly well spoken, as well as being bright and inquisitive, but there were still a lot of gaps in his knowledge.

    You're right, I do need to know and understand the capabilities of Alexander and his men, but that data is inside the dropship's databanks, which means when it's actually time to figure that kind of stuff out, all I have to do is look it up.

    Is there a way for me to access these records? If I am to be of any use to you, I will need to understand the capabilities of our enemies.

    It was such a simple request that I almost responded without thinking about it, but at the very last second, I realized just how dangerous doing that could end up being. Ant security protocols were top shelf, but I would have to give Helios some kind of access code in order for him to be able to get at the data, and I knew from past conversations with Brennan that it was a lot easier to elevate the security privileges of an existing user than it was to hack in from scratch.

    If I gave Helios access to the data inside the dropship's computer banks, then I would also be giving him access to the dropship itself if he was so inclined to take it, and I had no idea of the ramifications of that. It could range from him doing absolutely nothing sinister with what he'd been given, to him shooting down Jax and the others, or potentially even something much worse than anything I could imagine.

    I'm not sure, Helios. Can you give me a moment?

    I sense your answer is a prevarication. You have shown sufficient technical expertise to understand a number of high-level concepts, if not necessarily the ability to use that understanding to bring about whatever ends you might desire. Is it that you do not know if such a connection is possible or that you are unwilling to provide the access?

    Are you able to read my mind, Helios?

    No, not in the sense I believe you to be implying. The nanite hardware in which I reside is capable of interpreting neural signals from your brain to a sufficient degree to allow me to interact with you. I can inject electrical signals into your optic nerves to create whatever visual effects are necessary, and I have learned to recognize specific thought patterns as being a request for the activation of one or more nanite protocols, but I am not privy to the inner workings of your psyche.

    Much like any of the organic beings with whom you interact, I am forced to learn about you through the simple expedient of observing what you do and say.

    I wanted to give Helios the benefit of the doubt, but a lot depended on whether or not I really could trust him to tell me the truth about his capabilities.

    If you really can't read my thoughts, then how are you responding to the things I say? That's a lot more complicated than simply injecting a specific set of signals into my optic nerve.

    I have tasked a chain of nanites to listen in on the signals traveling inward along the major auditory nerves from your ear. It is the same method I use to observe the outside world through your other senses. The electrical signals coming into your brain from these various sources are much less complicated than what occurs inside your frontal cortex. I might eventually learn to interpret the more complex signals from deeper in your brain, but it is not a problem I have spent much time examining.

    No, that's not something you should be investigating. Having you spy on everything I see and hear is bad, but it's nowhere near as invasive as you being able to read my every thought. Please promise me that you won't ever develop the capability to read my thoughts like that.

    It seems I have distressed you, Skye. I am sorry for that; I began listening in on your senses before I understood concepts like privacy or even right and wrong. I am not sure I fully appreciate any of those things yet, but I can see
