RE Chapter 3

From WeKey

Casey strode down the hall purposefully, barely managing not to ball his hands into fists - barely managing not to scowl at near-missed pedestrians walking in the opposite direction. He was pissed, but not for prime time. He thumbed open his lab office, closed the door behind him and let it out.

"Arrrrrhh!"

Abley was waiting with his usual annoying patience. "Told you."

"Fuck you, Abley." Casey dropped into the black leather armchair by the door. "I can't believe it. How could Laslow hand that bitch the keys to the kingdom? It's like he wants to loose a Berserker on the world!"

"Casey, you have to drop the Saberhagen reference. It makes you look like a sci-fi dweeb, a 'fan'. This isn't going to help us."

"But Saberhagen was right dammit."

"That's not relevant, and you know it. If you want Don to listen to you, you're going to have to calmly and clearly explain exactly what's wrong with one big NN, and why full integration is sure to lead to disaster. Try it on me, and I'll try to shoot you down."

"You want me to role play for god's sake? Gimme a break, Abley. Fuck."

"Humor me, Casey. We're facing disaster. So, yeah, fucking role play. Without a doubt Laslow knows about you and Lisa, and you cannot, cannot, afford to come off like a jilted jealous lover. Don't you get it? So give me a break and do this."

"Alright. Jesus. You'll be Laslow?"

"Right. Give me your best shot."

Casey adjusted his tie and straightened his back which he just noticed was hunched slightly. Damn, that bitch is really getting to me, he thought.

"Well, okay. Since the development of Storchak polaronics facilitated the hyperconnected neural network in 2012, AI research has focused on . . . "

"Hold it," interrupted Abley. "You're talking to an administrator, not a scientist. Dumb it down a bit, please."

"Laslow wouldn't say that," Casey pouted.

"Actually, he would. You may not like it, but he's a good administrator; he knows his own limitations and is not ashamed of them. Otherwise there would be no point in trying to reason with him, would there? Now start again, and pretend you're talking to a really bright twelve year old. No, pretend you're talking to a journalist!"

Casey straightened his tie again, brow creased in thought.

"Okay . . . . Neural networks are basically big arrays of 'bits' with connections between them that determine which ones 'fire' when their neighbors are firing, much like the neurons in your brain, hence the name 'neural'. By adjusting the strength of the connections one can arrange different 'output' signals for a given set of 'input' signals. This sounds like programming, but the number of possibilities is so enormous that we don't even try to set it up by figuring out the required patterns in advance. Instead we 'train' the network by giving it the same set of inputs over and over and giving it feedback about whether the outputs are what we're looking for. Sort of like training a dog by saying, 'No!' when he gets it wrong and 'Good boy!' when he gets it right.

"This works surprisingly well, but it was frustratingly slow back when all electronic circuits had to be made on 'chips' - two-dimensional arrays of transistors and stuff - because they got hot from resistive heating. Friction has always been the limiting factor. The problem was that one can only make a limited number of connections between 'bits' in two dimensions, whereas in a three-dimensional array you can connect every bit with every other bit . . .

"Okay, okay," he compressed his breath in response to Abley's raised eyebrow, "K.I.S.S., right?

"So when the polaronics breakthrough came . . . everyone knows about polaronics now, right? . . . When the polaronics breakthrough came in 2012, suddenly it was possible to make cubes instead of chips, without the circuits melting down, and so we could make neural networks with every bit connected to every other bit. This made it even more impractical to 'program' the networks, but they trained up hundreds of times faster.

"So it was just a matter of giving the dog a smell of the fugitive's shirt, as it were, and saying, 'Sic 'em!' But it quickly became difficult to clearly define the desired output. And that's where Artificial Ontology comes in."

Abley raised his palms and eyebrows and hunched his shoulders in a parody of Laslow that they all mimicked. "Artificial what-the-fuck?"

"Okay, okay," Casey spat back. "Artificial Ontology defines the existential platform of any AI system, what can possibly be and what cannot. So now we have a catalogue of all possible unambiguously defined tasks, and we've been training neural networks to perform each one as efficiently as possible using a combination of genetic algorithms and variable neuron allocations . . . "

Abley's eyebrows shot up again.

"Sorry, too technical," Casey went on. "Anyway, in order to have any idea what we are training the 3D neural network to do, we have to put it together in smaller pieces, each of which is trained up for a specific, well-defined task, so we can tell whether it's actually working right. Then we have to make 'meta-tasks' for an Overseer network that puts these little guys to work on their assigned jobs and makes sure they don't get distracted. There's lots of research suggesting that the human brain works just the same way."

Abley cleared his throat theatrically. "That's a pretty good tee-up, Casey, but I'm starting to forget why you are trying to recap the Justus project for me. What's your point?"

"Well, we're making tremendous progress. We have all the component systems running and are about half done installing the Watchdog elements that keep them from wandering off-task. The Overseer network works fine on simulations, but it has some trouble with the unpredictability of the components." Casey's face was lined with frustration. "And that makes it hard to hold to a stable goal, and it's why we're putting in the Watchdogs.

"When this is finished we will have a full-blown AI that can take instructions in natural language and solve problems as complex as anything we are capable of defining unambiguously. We just need a ten percent budget increase and we'll be there!"

"And the Damon project?" prompted Abley. "They're soaking up all the Institute's resources for a hare-brained scheme that has been proven impossible!"

Abley let him rant, buzzing in the background, hoping he'd get it out of his system before they had to confront Laslow.

"They have no idea what this damn thing is doing!" Casey continued. "They just keep stacking on more network cubes and turning the thing loose to supposedly integrate the extra memory. Everyone knows this just leads to a stalled system, that was demonstrated over and over again back in the 2D days."

"But we're in 3D now, right? The rules have changed." Abley spoke in his own voice, wondering if Casey really was a contender in this fight. In fact he wondered if this insistence on the specification of what could and could not exist had something to do with why Lisa was dropping him. Her wholesome sensuality belied a killer brain spec'd out to take epistemological S-curves like a Lamborghini running on stacked nets. He'd never thought that Casey really understood that. Or perhaps even could understand it.

"Yeah, that's what Lisa claims," Casey hissed, his lips razor thin. "But the theory is also unambiguous. The most important conclusion of Artificial Ontology is that you can't get a computer to do something that you can't even describe. Duh! Meanwhile they aren't even trying to assign the damn thing any tasks; they're just letting it train up any way it wants." Casey's hands were waving in annoyance at the stupidity of anyone not understanding this elemental fact.

"Instead of giving the Overseer a fixed goal, they're just letting it set its own goals on a whim, and then training up the giant network to see if it can solve the problem. There's no re-usability, no control, no Plan, goddamn it. No control at all. What the freaking hell is she playing at?"

"Lisa says it's analogous to the right and left hemispheres of the human brain: the Overseer is like the analytical left and the big network is like the intuitive creative right," said Abley with a faint smile.

"Oh sure, right, and that probably sounds convincing to anyone who doesn't understand the basic theory of A.O. - 'They're just like us . . . '" Casey said in a vicious parody of girlish innocence.

"All the same," said Abley, "they are getting some amazing results and we don't have a clue how."

"Luck," said Casey. "Or they're cheating somehow." He stared out the window into the hills surrounding the Institute. Abley watched him, concerned. Lisa might be tunneling under their assumptions with gay abandon, but she wasn't cheating, he thought. She was the original miner for a heart and soul of gold. She was driving straight through something and Laslow was on the ride with her. Abley was sure of that. What he wasn't sure of was whether or not there was a place for him in that blaze of afterburners.