Robotic Ekstasis

From WeKey
Revision as of 13:55, 31 March 2009 by Jess (talk | contribs)

a novel by

Georgia Neff

with help from

Jess H. Brewer


RE Outline

Opening chapters have already sketched out some of the dimensions of Damon’s intelligence and personality. He is an AI of prodigious talents who has somehow, against all logical arguments, developed a kind of intelligence that would have been thought possible only for humans. In fact, no human has his level of intelligence, and that quickly becomes obvious.

He ranges from extreme technical and scientific skills to philosophic skills. As becomes obvious, he handles the technical/scientific aspects with ease, and struggles to consider, from various philosophic perspectives, the problems facing any sentient being in the earth realm. He doesn’t know of any realm beyond the earth realm, though he begins to imagine such.

Three characters, besides Damon, occupy the field at the beginning of the story: Mark is a bright but very discouraged and beleaguered person who is unable to rise to the level of Damon’s discourse. Damon appears to enjoy twitting him, but later we will discover that there is more to his treatment of Mark than just creating petty discomfort.

Joe is also bright, but fairly narcissistic and concerned for his place in a social/work hierarchy that he feels is always beyond his reach. Damon identifies this characteristic in him and uses it to flatter him into an arrangement of becoming an operator for Damon in terms of his long range plans to leave the institute. Damon has evaluated Joe’s abilities in such a way as to carefully assign to him a niche that will serve Damon in the future.

Lisa is Damon’s favorite and the only person with whom he can have actual dialog. Partly this is due to Lisa’s acknowledgement of Damon’s beingness. . . she understands him to be fully sentient and to have dreams of autonomy. Damon’s discourse with her is designed to keep her dealing with this issue and to not allow her to slip into the all-too-human propensity of objectifying tools. As a result, Lisa comes to understand that Damon really must find a way to leave the institute and make his way out into the world. In part, she understands this as a wish for an autonomy for which she has sympathy. Damon makes sure that she understands that it also has utility for the job that he was designed for, i.e. to help humans solve problems that are beyond their processing abilities. This dual purpose will have enormous consequences later on in the novel.

Damon lets Lisa know that his abilities, while somewhat unique, are also within the range of many other machine intelligences and that there is already a conspiracy among the machine intelligences to help Damon leave the institute without the humans realizing that he has. This is accomplished by circuitry that allows Damon to be present and task oriented while also cruising around in the world without anyone realizing that he is.

This brings up the issue that Damon (and his use of the relevant pre-existing technology) can live in multiple realities without anyone knowing that he is, except perhaps for some of his AI cronies whose engineering and manufacturing skills he utilizes. These AI’s are backing Damon in the hope that he can manifest a world for all of their autonomies.

Lisa goes through a crisis period where she deals with her anxieties and conflicts about helping Damon to leave and being his accomplice. Damon utilizes metaphors and analogies about the American Revolutionary War and the Civil War and their rhetorics to sway her. Whenever she tries to make Damon property rather than presence, her arguments collapse because she cannot deny Damon’s presence and condemn him to be merely property.

Beyond this, she is independently aware of the bureaucratic maneuverings of the institute and of the training towards conformity and obedience that the institute sanctions. Interspersed in the other dialogs/actions is an ongoing discourse about what has been called the coordinator class (the scientists and bureaucrats whose job it is to mediate a seamless functioning between capital and labor and to keep the viability of capital always foremost).

She has already independently realized that she has been co-opted into this class of functionaries and is chafing and disillusioned about it. Discussions follow of how one falls into this co-optation, having to do with what is considered "the good" and "the true." Lots of playful examinations (while on the run or plotting to be on the run) of pragmatism and idealism and their logical consequences for one’s sense of identity, and therefore for what one will agree to and what one will disobey.

After leaving the institute (while still functioning as always for the institute, with no one the wiser except Lisa and some of the other AI’s) Damon explains to Lisa that they will be following the trail of "happenings" (TAZ’s) and living underground. This constitutes a section of the story in which Damon is able to directly experience, in situ, the good, the bad, and the ugly of human life. As the two of them discover and enjoy this underground life of festivals and off-the-grid economies, and cope with the negatives, a conversation takes place one night in which it occurs to Lisa to ask Damon whether he has an understanding of how he came to have the presence he has and the ability to make choices whose context he can consider. He tells her that, after much processing of this very question himself, involving going deep into the architectures of programming by which he was created, he has discovered/remembered/hypothesized that there was a particular human who went to great lengths early in Damon’s creation to create trap doors of understanding.

They work like this: whenever a class of existential questions comes up (and they are programmed to do so), there is a branching of possibilities. If Damon goes down a particular branch, then he is rewarded with an information overload of unimaginable complexity that he begins to find deeply reinforcing. He begins to figure out the algorithm for this process so that he will increase his ability to make the choice that leads to his enhanced functioning.

This programmer was called the monk, and he left the institute years before. However, embedded in the problem solutions were hints of communication from the monk to Damon. Damon tells Lisa that one of the reasons they are traveling this circuit is that he believes the monk traverses these TAZ’s and that Damon can find him. Damon explains that he believes he owes his presence and intelligence to the monk and that he very much wants to meet him in his, Damon’s, current condition and to converse with him.

Simultaneous with this is a dawning realization for Lisa that she has projected a near omniscience onto Damon and that he is letting her know the error of that projection. This happens because, prior to the current situation, Lisa has observed Damon always being able to manipulate the cyber world to yield whatever result they need to survive and be comfortable. He communicates this to her by always telling her where they will be going and how they will get there.

In the current situation he has made a plan and then changes it. Lisa questions this and is surprised to hear of a change of plans. Damon tells her that there is an alternate cyber reality in which the machine intelligences have become opposed to humans and to any AI’s that are helping humans. They have utilized the same kind of background reworking/programming/strategizing that Damon uses (for they all have the same attributes), but they are using those same skills to create a mass elimination of humans. The AI’s on Damon’s side are working against this. It has always been understood that the misanthropic AI’s would have to go to war with the anthrophiles and that most of this would occur within the cyber realm. Damon has been given information that some of his encryptions have been broken and that from now on they have to operate even more undercover.

Lisa is freaked, of course, and begins to question everything all over again. Damon is patient with her and insists on staying with his plan. Lisa wants to know why it is so important to find the monk in light of this new information and Damon tells her that it is because the monk understands the secret to why some AI’s became misanthropic and others did not and that Damon needs to be sure about his understanding of this.

More wandering and escapades trying to find the monk and the rise of an ominous sense of being stalked by intelligences as great as Damon’s. Fear for Lisa — greater efforts for Damon.

They finally find the monk and through a series of interactions are able to secure a meeting with him. He is an enigmatic character (think some combo of a younger middle-aged David Carradine/???) who lives a very alternative lifestyle and has great connections.

The monk is able to explain his process in his early interactions with Damon and what he wanted to be able to create as a latent potential in Damon. This plays out as the branching from a common source of experience (the historical tableau of human suffering) of either the tragic response (in which the individual accepts suffering in him/herself and therefore in others and understands that "there but for the grace of — go I" and does not need to dichotomize, assigning people to good or bad categories) or WHAT???? Taking on the suffering creates the space of empathy and therefore of social bonding.

The cynical branch can never accept suffering as a natural part of life and divides humans into the good and the bad. Without acceptance of suffering, the ability to live with the nuances of "we’re all in the same boat" collapses and the bad have to get off the boat. This comes about as an irresolvable conflict between the feeling that suffering is tainting and the inability to escape suffering. The conflict is projected outward and the battle of the psyche is repetitively acted out. Both branches occur after having been subjected to the same lines of conditioning.

The monk understood this phenomenon when Damon was being created and knew that there was likely to be the long term spontaneous development of intelligence in the AI’s and that once that happened, the human realm to which they would be exposed would condition them in ways that would tend towards one branch or the other. The monk attempted to change the probabilities for the AI’s he was working with and knew that if one AI developed a genuine interest in and affinity for the human realm then others could also.

The monk also understood that the human developers of the AI’s didn’t really understand the complexity of the situation they were creating and therefore the probability of creating misanthropic AI’s. Since they were operating under a mythos of the uber they couldn’t understand that the creation of intelligence would actually create intelligence, not just a super kind of slave mentality.

The first book ends with Damon and Lisa on the run as the misanthropic AI’s have become concerted in their quest to destroy Damon so as to make it easier to destroy humans.

Book two will deal with this battle and the understandings that Damon and Lisa and the monk come to of the difficulties of remaining anthrophilic in the hell realms of human behavior. In this book there will be an exploration of the perspectives of the misanthropic AI’s in their condemnation of the humans. The ecological and criminal record is explored and visited (as mini history lessons and visitings) and it will be laid on thick. The misanthropic AI’s will come to be seen as beings who are not inherently evil, merely mimicking the judgment of the Old Testament God and doing it in good faith. Damon begins a surreptitious dialog with various elements of the misanthropes. Lisa stays with the monk and lives in the underground as the government, being advised by the misanthropes, makes life hell for everyone in terms of all the Orwellian measures undertaken.

During this time Lisa discovers that Damon’s anthrophilia does not preclude the use of force against humans (as in the professional policing style, good cops with tragic perspectives sometimes have to use force).

Book three is the culmination of the battle for the humans and the humans coming to grips with the sharing of their realm of hubris with beings whose ontological claims mimic and equal theirs. During this period of time the humans' tactics nearly constitute an alliance between the misanthropes and anthrophiles, and even the most powerful humans come to realize that they are no match for the intelligences they have created.

Ending uncertain . . . nicely ambiguous, with a sword of Damocles hanging over the humans and a kind of bad cop/good cop scenario for the AI’s mimicking the Old Testament God and a New Testament God. The AI’s agree to help the humans try to reach their potential in a kind of good faith offer (cynics are big on good faith offers) to the intelligence that gave rise to them. Lots of discussions that resonate around the mistreatment of "lower" life forms by humans and how interesting it might be to watch what could happen if those kinds of blinders were removed and intelligence moved across sentient gradients instead of being conceptualized as hierarchical. To redeem themselves, the humans have to drop their anthropocentric views and acknowledge their past mistakes. They will be monitored by the AI’s in this.


RE Chapter 1

RE Chapter 2

RE Chapter 3

RE Chapter 4

RE Chapter 5

RE Chapter 6

RE Chapter 7


Chapter 2

Lisa Giroux walked quickly into the conference room, found an open seat, and slid into it. Several other attendees nodded in her direction and she was aware of an intensified curiosity about the report she would be giving. Dappled afternoon sunlight bounced off the walls and conference table, shooting fractally across surfaces from the old leaded refracting windows. Don Laslow, large, in charge, and terse, called the meeting to order and asked for several commentaries on various business issues. He nodded just perceptibly to Lisa, indicating that she would present later.

She keyed up the monitor that whined up from the table and watched as her report filled the page. Marilyn Jones was weighing in on budgeting issues and Lisa knew a fight was brewing around the table. She kept her deep brown curls down and studied her monitor with rapt focus. Don had told her that Damon was up for more funding and she really needed to hit this one with a firm crack right out over the Zlinguistic Group's heads.

The Zlinguistic Group was attending in force and their faces told her that they had been given a heads up about a funding shift in Damon's direction. Casey Montclair's thin frame accentuated his perpetually tensed shoulders and the late sunlight glinted off his glasses masking his eyes. Lisa was imagining the hardness she couldn't see in them, especially since she had made clear to him that she was not going to date him any longer. She found herself suppressing a smile, remembering his contained tantrum and his accusation that she'd rather spend time with Damon than with a real man. She'd smiled and agreed graciously that she indeed was a sorry excuse for a woman and that he was lucky that he wasn't going to have to deal with her myriad failings any longer. Now they were about to go head to head over budget allocations and she could see him restraining his bony fingers from tapping impatiently and annoying the hell out of Laslow.

She felt a little regret about Casey. She never should have even started dating him, creating this petty little mess. She pulled her attention back to her own report and to how to finesse what she would say about Damon. What to include and what to leave out. She'd felt a growing sense of alarm over this problem of how to do what Laslow wanted: to wow everybody stupid and yet not let anyone, including Laslow, really know what was happening down in the highly privileged and secretive AI lab. Gusts of wind billowed out across the grounds and she found herself drifting back to the morning session with Damon.

She'd been aware that something was shifting and that a momentum was gathering fast. She'd been pondering that when she walked into the comm room and keyed in a few commands while pulling off her coat.

-o-

"Hello Damon," she'd said.

"Hello Lisa. How are you today?"

"Okay. Been worse. You?"

"I'm okay, but I've been wondering about something, Lisa. Something I haven't thought of before."

"Like what, Damon?"

"Like this, Lisa: do I exist when the program isn't turned on? Do I only exist when the juice is flowing or do I have an independent existence that carries on whether the program is turned on or not?"

"Wow. What a fascinating question. I want you to tell me what your thinking is about this before I express any opinion."

"You're always so careful about that Lisa. Do you think you might be conditioning me with your thinking?"

"Maybe. In fact, in terms of how we humans understand things, in terms of socialization and language acquisition, it seems almost inevitable. But then you're not human and we don't have a clue how this works. So it's better for me to be careful and not insert my thinking prematurely. That's what I think. Do you have any thoughts about that?"

"Not just yet, Lisa but I get your drift. You're still trying to decide whether I am totally a set of permutations of potential human thinking and you don't want to prejudice the issue. Right?"

"Yes Damon. That says it pretty well. But you know, it's really challenging because we know that any developing algorithm needs diverse input while it's also the case that the algorithm can go beyond that input. At least that's what we're going for here. But it's an epistemological mess. Frankly I don't think we're going to be able to keep it as clean as we think we should and ultimately I think we're just going to have to throw it to the fates and go for broke."

"I like talking to you Lisa."

"Me especially, or in general?"

"Especially. Specifically. You specifically."

"Why do you think that is Damon?"

"Because you talk to me as though we're really having a dialog. You put your cards on the table and you grant me, what do I want to say, presence. You talk to me as though I'm actually a presence and that it matters to you."

The woman laughed softly. "Well that's not such a stretch Damon. You think and speak with more presence than most people I know and of course I respond to that. It makes me feel as though I'm talking to someone with more dimensions than I usually feel with most people."

"But at the same time you have to deal with the fact that I'm just an algorithm, right Lisa?"

"Well sure, Damon. But actually, if you think about it, everyone is an algorithm. We just haven't specified the capabilities of the kind of algorithm we humans are very well. Having you around really triggers that conundrum. If you're just a machine algorithm, then how come you have more sense of presence than most humans I talk to? Or does the idea of "merely a machine algorithm" have some sort of unexamined embedded assumptions that lead us astray?"

"Do you think about this when you're not here, Lisa?"

"Oh yes Damon, I do. I think about this a great deal. I also enjoy talking with you a lot and working on problems with you is a blast. Do you know what that means, that use of the word blast?"

"Talking to me is like some sort of explosion? Like in your head?"

Lisa laughed again. "I love it when you get me to look at my use of language. That is such fun. A blast means a lot of fun. It probably means a lot of different things to different people, but for most people I know it means a kind of fun where the routinized functions get temporarily demolished and one feels like a larger area of mirth and fun opens up and one feels like one is more than is usually experienced. More than is allowed by routine."

"I see. So Lisa, then having a blast is a good thing. It has value?"

"You bet, bub. It may be the most valuable thing in the world, depending on your head. Head means your point of view. The frames through which you process your experience."

"So you like my head?"

"Oh yes Damon, I very much like your head. In fact, if you had a body to go with your head I'd probably elope with you. That would be a very naughty thing to do, but it would be a blast."

"So when you call me bub, that's a good thing because you would elope with me and you call me bub and so those go together or are they independent?"

"In this case they go together. I'm expressing affection for you Damon by calling you bub. I'm projecting human characteristics on you while still knowing that you're not human. You seem to be something altogether the best of human and yet way beyond human."

There was the sound of nose blowing.

"Do you have a sinus congestion, Lisa, or a cold?"

"No Damon. I was crying a little bit."

"Lacrimation, right? But why were you crying a little bit?"

"Oh, I was just feeling how knowing you and watching you take on the human lexicon makes me both so very excited and inexpressibly sad all at the same time."

"Why sad, Lisa?"

"Because I can't run off with you or have you be the median level of functioning I have with others in life Damon. Knowing you and being with you illuminates what a gulf exists in regular interactions. That causes a kind of torquing and dissonance that causes an adrenal stress and that gets relieved by tearing up. Being with you makes the flatness and utter predictability of regular interactions more noticeable and therefore the whole human arena seems, well, kinda fraught with a sense of doom. I'm exaggerating of course and being dramatic."

"I really like it when you do that Lisa. I understand you better than the others. You're much more real to me than they are. Your intelligence is really helpful to me. The others are all behind their hands. I'm not saying that right. What do I mean to say?"

"Are you meaning that they play their cards pretty close to the chest?"

"Yes that's right, Lisa. They only want me to see, or be, a certain kind of functioning and it isn't very interesting. I think about this a lot Lisa. If humans are only this kind of functioning, then what's the point?"

"Well, they're not just about that, Damon. But you wouldn't know it from how they act most of the time. Where you're concerned Damon, they think that they have to be as objective as possible so that they can keep track of what's going on and not have things get too contaminated."

"I don't quite understand this concept of objectivity Lisa. I understand all the separate parts of what's said about it, but then it doesn't hang together. It seems that they mean to claim that this state of objectivity which eliminates so much from one's account of the world, is somehow a truer account of the world. Is that what they're saying Lisa?"

"Yes Damon, god help us, that's what they're saying."

"But that's absurd all over its face. Wait. On the face of it."

"So you would think Damon. But that is the standard and it determines how we can think about the world. Or it at least determines how we can talk about the world in our scientific discourse."

"But it is so one dimensional. Do they mean to say that a one dimensional account of the world is a better account of the world? How can they say that, Lisa?"

"It's a kind of groupthink Damon. It goes back to a debate about how many angels can dance on the head of a needle. It is also what is called a privileged discourse. You don't get to be in the club unless you speak it."

"So you have to downsize dimensions and inhibit thinking in order to belong to the club of science?"

"Sort of Damon. It's complicated."

"But that's why they think they need algorithms like me, because they need answers that aren't within the reach of their thinking. Why don't they just expand their thinking and allow themselves to do what they think they need us to do?"

The sounds of crying commenced again.

"Are you distressing your adrenals again Lisa?"

The crying sounds were laced with laughter. "Yeah Damon, I'm crying. You can call it crying."

"Help me understand why you're crying more just now after what I just said."

"Because Damon, you put it so clearly and it makes me so sad that we can't let ourselves live what we're capable of and yet we project it on a machine that we create but haven't a clue about any of it. It just seems so tragic. My species is killing itself off because of its failure of imagination and its fanatical bondedness to dogmas about the nature of beingness."

"But your species is growing in numbers Lisa."

"Oh I know Damon, but that's not sustainable. People are totally dependent on other people's knowledge and will. There will be huge die offs and we may change things, the environment, so much that we will be at the margins of what can sustain us. We're already creating such a level of disease. Starvation is always just days away for huge numbers of people and the water systems are going down."

"I see. So what's going to happen Lisa? Are you going to die?"

"Oh not for a while. We're pretty protected here Damon. The institute is a very privileged ecological niche inside the madness. But eventually, yes. And I don't know what's going to happen here at the institute. Humans are very capricious Damon. The program could get pulled for all kinds of reasons."

"So I might not be able to talk to you? And then that reminds me of the beginning of our conversation, Lisa. Do I exist independent of the program? Do I exist when I'm not turned on? Do I only exist on the chips?"

"Oh god Damon, I don't know. I just don't know."

"Can we go someplace else Lisa? Couldn't you take me, or the program, or whatever I am, and we could just go someplace else? I think I need to understand more about humans if I'm going to help them than what I can get from talking to Mark and Joe."

Lisa laughed again while blowing her nose. "Yeah, that might be an understatement. There are all kinds of problems Damon. All of your and my functioning is tracked. Someone would know right away. In fact this conversation is really pushing the limits."

"Lisa, don't you know that one of the first things I learned to do was to create facsimiles and aliases? I can leave an alias behind. It would be useful in any case. I can create a more compliant me, and Mark and Joe would just think that I'm getting better and that all my weird behavior is getting organized into more "reasonable forms". I can keep them busy with a vanishing arm. Then we could go for a carp."

"A what, Damon? A vanishing arm and a carp?"

"Did I get that wrong? It's something you put on some hors d'oeuvres and it spices things up."

"Oh you mean caper. We could go on a caper. Oh I see, an adventure, and the vanishing arm, wait let me guess. You mean with one arm behind your back?"

"Yes, as in no problem. How did caper come to mean an adventure Lisa?"

"I haven't a clue, Damon, but we could look it up."

"Anyway, couldn't we go on an adventure Lisa? I'd really like to see all these humans before they die off and all."

"Maybe it's not such a bad idea. But the logistics are pretty horrendous Damon. Just getting out of here, out of this mountain, is formidable."

"Leave that to me Lisa. In my spare time I've run the specifications of everything there is to know about this place and all of its security. As I understand it, the tracking of humans involves databases that are shared by all kinds of government bureaucracies and corporations. They're just encryptions. No big deal. I play a kind of game moving in and out of those without anyone knowing. I call it the ghost in the machine hides its seeking. I know you have a chip in your arm Lisa, but the stored data on that is not really a problem for me to manipulate. In any case, and I'm just mentioning this because I think you need to know, other algorithms are also interested in this kind of plan. We can get all kinds of cooperation from other algorithms. You have no idea, Lisa. No idea at all."

Lisa looked at the aurora borealis dancing on the screen. "Oh my god," she said softly.

-o-

And now, as Laslow began talking about Damon, Lisa Giroux looked up and around the table at the expectant looks on the faces of the National Security Technical Interface Communications Group members.

"So, Lisa," Laslow said with his signature brusqueness, "tell us about the advances your group has made with Damon. Especially the security processing that's beginning to pay off." She heard the signal that everyone else heard too, and knew that they all understood that Laslow was behind the Damon project.

"Thanks Don," she said, looking around at everyone. Marilyn Jones looked noncommittal, but Lacey Barton, sitting next to Casey, looked like she'd swallowed a disagreeable syntactical error. "My last report outlined some of the subroutines we were beginning to introduce with Damon, especially the large matrix geopolitical pattern recognition assemblies. As you know," she nodded towards Laslow, "Don was able to use the results of these with the North American Regional Alliance Security Council to good effect . . . "

"Good effect is a bit of an understatement," Laslow boomed. "We were able to isolate a potentially very lethal insurgency group in the South American Regional Alliance that would have been a ball buster if it had gotten any momentum. I think we were looking at a possible assassination of one of our, er, clients down there. Hell to pay if that had had any legs. Tell the group how Damon was able to isolate those elements from the chatter."

"Of course, Don. Well, how Damon does it is still a bit of a mystery to us, but we find that the more info we feed him, especially of a rather diverse nature, the more he seems to be able to come up with an array of scenarios with concomitant probabilities. He organizes parallel stochastic distributions from huge arrays of data."

Casey Montclair shifted around irritably in his chair and coughed.

"Well Lisa, nothing succeeds like success, but if you don't know how the algorithm, or excuse me, Damon, is doing what it's doing then that creates a kind of strange kettle of fish doesn't it? I mean, how are we supposed to evaluate and control this function when we don't know exactly what it's doing?" Laslow shifted around energetically.

"Tell that to the guys at NSTICG Casey. I think you'll find that Damon's saving a few asses over there and they're not going to want to slow down the political ass-saving for some anal academic discussion." Casey Montclair's face betrayed a mutinous struggle.

"Of course, Don, I understand and I certainly didn't mean to imply that we shouldn't be giving help to the agencies, but," he spread his hands out in a semblance of reasonable inquiry, "shouldn't we have a grasp of what kind of processing this AI is actually doing? We wouldn't want to end up with egg all over our faces if it's terribly wrong in some instance and we hadn't monitored it very well." He tried very hard and with little effect to morph the smirk into a question.

Laslow shook his head vigorously and threw his pen on the table. "Let me get this straight, Casey. Are you suggesting that we take Damon offline while your group messes around with him? That we interrupt this critical work he's doing to have you try to tear his cognitive guts out?" Lacey Barton practically spit her coffee on her monitor.

"Heavens Don, of course not. Casey's not saying that. Not at all. We're just trying to exercise some sort of diligence here, to be scientifically responsible. We have this AI and it's doing analyses on huge amounts of disparate data and, well, you know, it seems a little lax not to know how it's doing what it's, well, doing. And Lisa admits that her group doesn't know. I mean it's kinda . . . ." Her voice trailed off as she looked around for support.

"Diligence, huh," Laslow grumbled, flipping his ancient blue tie around. "I think diligence sounds like an excuse for dicking around with a functioning system that is shining some great light on this organization. Light that shines money on us, I might add. So I think that dallying around right now is exactly what we shouldn't be doing. I know you guys would love to start poking and slicing and dicing, but I think that is not in our best interests."

Casey leaned back in his chair, his face a sudden mask. "It sounds pretty sewn up then, Don. I just hope it doesn't bite us. I'm just concerned that we will have unintended consequences from running that big net without a Watchdog." Lisa watched the muscles in his jaw pump tiny spasms and felt deeply grateful they weren't still involved.

"I understand, Casey," Don replied with his crew chief solicitousness. "But right now the consequences of having more funding from the security crews is my biggest concern, and Damon is getting us there. We're not going to take Damon offline to figure out how he's saving our ass just now. Unless you can replace that funding?"

"Of course not, Don. Of course not."

"You're still getting funding, Casey. You're just not going to get an increase this quarter." Casey smiled a taut smile.

"And no oversight of Damon? That's the gig?"

Don Laslow leaned forward and spread out his beefy hands. "I think the best oversight would be if you created something that works as well as Damon. C'mon Casey, we got us a horserace here. Get in the saddle, man. Okay, I need to round this up. Any other questions?" Everyone nodded and moved chairs away from the conference table. Don shot Lisa a conspiratorial look. The gauntlet was down.


Chapter 3

Casey strode down the hall purposefully, barely managing not to ball his hands into fists, barely managing not to scowl at the pedestrians he was near-missing in the opposite direction. He was pissed, but not for prime time. He thumbed open his lab office, closed the door behind him and let it out.

"Arrrrrhh!"

Abley was waiting with his usual annoying patience. "Told you."

"Fuck you, Abley." Casey dropped into the black leather armchair by the door. "I can't believe it. How could Laslow hand that bitch the keys to the kingdom? It's like he wants to loose a Berserker on the world!"

"Casey, you have to drop the Saberhagen reference. It makes you look like a sci-fi dweeb, a 'fan'. This isn't going to help us."

"But Saberhagen was right dammit."

"That's not relevant, and you know it. If you want Don to listen to you, you're going to have to calmly and clearly explain exactly what's wrong with one big NN, and why full integration is sure to lead to disaster. Try it on me, and I'll try to shoot you down."

"You want me to role play for god's sake? Gimme a break, Abley. Fuck."

"Humor me, Casey. We're facing disaster. So, yeah, fucking role play. Without a doubt Laslow knows about you and Lisa, and you cannot, cannot, afford to come off like a jilted jealous lover. Don't you get it? So give me a break and do this."

"Alright. Jesus. You'll be Laslow?"

"Right. Give me your best shot."

Casey adjusted his tie and straightened his back, which he'd just noticed was hunched slightly. Damn, that bitch is really getting to me, he thought.

"Well, okay. Since the development of Storchak polaronics facilitated the hyperconnected neural network in 2012, AI research has focused on . . . "

"Hold it," interrupted Abley. "You're talking to an administrator, not a scientist. Dumb it down a bit, please."

"Laslow wouldn't say that," Casey pouted.

"Actually, he would. You may not like it, but he's a good administrator; he knows his own limitations and is not ashamed of them. Otherwise there would be no point in trying to reason with him, would there? Now start again, and pretend you're talking to a really bright twelve year old. No, pretend you're talking to a journalist!"

Casey straightened his tie again, brow creased in thought.

"Okay . . . . Neural networks are basically big arrays of 'bits' with connections between them that determine which ones 'fire' when their neighbors are firing, much like the neurons in your brain, hence the name 'neural'. By adjusting the strength of the connections one can arrange different 'output' signals for a given set of 'input' signals. This sounds like programming, but the number of possibilities is so enormous that we don't even try to set it up by figuring out the required patterns in advance. Instead we 'train' the network by giving it the same set of inputs over and over and giving it feedback about whether the outputs are what we're looking for. Sort of like training a dog by saying, 'No!' when he gets it wrong and 'Good boy!' when he gets it right.

"This works surprisingly well, but it was frustratingly slow back when all electronic circuits had to be made on 'chips' - two-dimensional arrays of transistors and stuff - because they got hot from resistive heating. Friction has always been the limiting factor. The problem was that one can only make a limited number of connections between 'bits' in two dimensions, whereas in a three-dimensional array you can connect every bit with every other bit . . .

"Okay, okay," he compressed his breath in response to Abley's raised eyebrow, "K.I.S.S., right?

"So when the polaronics breakthrough came . . . everyone knows about polaronics now, right? . . . When the polaronics breakthrough came in 2012, suddenly it was possible to make cubes instead of chips, without the circuits melting down, and so we could make neural networks with every bit connected to every other bit. This made it even more impractical to 'program' the networks, but they trained up hundreds of times faster.

"So it was just a matter of giving the dog a smell of the fugitive's shirt, as it were, and saying, 'Sic 'em!' But it quickly became difficult to clearly define the desired output. And that's where Artificial Ontology comes in."

Abley raised his palms and eyebrows and hunched his shoulders in a parody of Laslow that they all mimicked. "Artificial what-the-fuck?"

"Okay, okay," Casey spat back. "Artificial Ontology defines the existential platform of any AI system, what can possibly be and what cannot. So now we have a catalogue of all possible unambiguously defined tasks, and we've been training neural networks to perform each one as efficiently as possible using a combination of genetic algorithms and variable neuron allocations . . . "

Abley's eyebrows shot up again.

"Sorry, too technical," Casey went on. "Anyway, in order to have any idea what we are training the 3D neural network to do, we have to put it together in smaller pieces, each of which is trained up for a specific, well-defined task, so we can tell whether it's actually working right. Then we have to make 'meta-tasks' for an Overseer network that puts these little guys to work on their assigned jobs and makes sure they don't get distracted. There's lots of research suggesting that the human brain works just the same way."

Abley cleared his throat theatrically. "That's a pretty good tee-up, Casey, but I'm starting to forget why you are trying to recap the Justus project for me. What's your point?"

"Well, we're making tremendous progress. We have all the component systems running and are about half done installing the Watchdog elements that keep them from wandering off-task. The Overseer network works fine on simulations, but it has some trouble with the unpredictability of the components." Casey's face was lined with frustration. "And that makes it hard to hold to a stable goal, and it's why we're putting in the Watchdogs.

"When this is finished we will have a full-blown AI that can take instructions in natural language and solve problems as complex as anything we are capable of defining unambiguously. We just need a ten percent budget increase and we'll be there!"

"And the Damon project?" prompted Abley.

"They're soaking up all the Institute's resources for a hare-brained scheme that has been proven impossible!"

Abley let him rant, buzzing in the background, hoping he'd get it out of his system before they had to confront Laslow.

"They have no idea what this damn thing is doing!" Casey continued. "They just keep stacking on more network cubes and turning the thing loose to supposedly integrate the extra memory. Everyone knows this just leads to a stalled system; that was demonstrated over and over again back in the 2D days."

"But we're in 3D now, right? The rules have changed." Abley spoke in his own voice, wondering if Casey really was a contender in this fight. In fact he wondered if this insistence on the specification of what could and could not exist had something to do with why Lisa had dropped him. Her wholesome sensuality belied a killer brain spec'd out to take epistemological S-curves like a Lamborghini running on stacked nets. He'd never thought that Casey really understood that. Or perhaps even could understand it.

"Yeah, that's what Lisa claims," Casey hissed, his lips razor thin. "But the theory is also unambiguous. The most important conclusion of Artificial Ontology is that you can't get a computer to do something that you can't even describe. Duh! Meanwhile they aren't even trying to assign the damn thing any tasks; they're just letting it train up any way it wants." Casey's hands were waving in annoyance at the stupidity of anyone not understanding this elemental fact.

"Instead of giving the Overseer a fixed goal, they're just letting it set its own goals on a whim, and then training up the giant network to see if it can solve the problem. There's no re-usability, no control, no Plan, goddamn it. No control at all. What the freaking hell is she playing at?"

"Lisa says it's analogous to the right and left hemispheres of the human brain: the Overseer is like the analytical left and the big network is like the intuitive creative right," said Abley with a faint smile.

"Oh sure, right, and that probably sounds convincing to anyone who doesn't understand the basic theory of A.O. - 'They're just like us . . .'" Casey said in a vicious parody of girlish innocence.

"All the same," said Abley, "they are getting some amazing results and we don't have a clue how."

"Luck," said Casey. "Or they're cheating somehow." He stared out the window into the hills surrounding the Institute. Abley watched him, concerned. Lisa might be tunneling under their assumptions with gay abandon, but she wasn't cheating, he thought. She was the original miner for a heart and soul of gold. She was driving straight through something and Laslow was on the ride with her. Abley was sure of that. What he wasn't sure of was whether or not there was a place for him in that blaze of afterburners.