iD (Machine Dynasty) - Madeline Ashby

ONE

Prologue: Satisfaction Guaranteed

REDMOND, WASHINGTON. 20–

“At some point, all human interaction tumbles down into the Uncanny Valley.”

The archbishops of New Eden Ministries, Inc., all nodded as though they knew exactly what Derek was talking about. He wondered if maybe they did. Surely they had played their share of MMOs. The pancaked pixels. The jerky blocking. Basic failures of the Turing test. They sat at a round table under a projector unit and regarded him placidly, waiting for him to expand upon his point. He had worked all night on this report. He kept trying to soften the language, somehow. He had to be nice, when he told them exactly how and why this whole project was going to fail.

Beside him, the gynoid twitched.

“You see it in completely organic contexts,” Derek continued. “Used car sales, for example. Have you ever met a person who’s really that positive, all the time?”

The archbishops cocked their heads at him. Of course they had met those people. They sculpted those people into being with prayer and song and service. They knew exactly what a happy robot should look like.

The gynoid, Susie, regarded him with the blankest of expressions. She was like old animation: only her eyes moved, while the rest of her face’s features remained stationary. When Susie wasn’t performing interaction, she looked dead. Not sleepy. Not bored. Just empty. Derek’s own parents had accused him of wearing that same expression more often than not. Couldn’t he at least make a little eye contact? Couldn’t he at least pretend to care?

“What I’m saying is, the whole point of most interaction is performance. And a lot of the time, we overdo it.”

The archbishops looked at each other. They were about to say something about his condition. He watched them come to that conclusion in a silent parallel process. The expressions surfaced fleetingly and then disappeared, like the numbered balls in the lottery tumbler on KSTW. He had a perfect memory of the tumbler turning on his television during long summer evenings in childhood: the television’s high keening hum, the press of nubby threads on his cheek, the feeling of being fossilized in broadcast amber.

“Are you sure your opinion isn’t unfairly biased by your own problems with affect detection?” one of them asked.

“It’s possible,” he conceded. “But I think what makes me the most nervous about what you’re proposing is that it’s an attempt to pin the very definition of humanity on affect detection, which is not only difficult to engineer, but notoriously subjective.”

He had been working on that statement for a while. He had practised it in the mirror, had rearranged the features of his face into their most convincing constellation so he would look extra believable when he spoke the words. Susie had helped. But now he’d missed the target, overdone it. He could tell, because the archbishops were looking at him as though he’d taken things all too personally, and maybe shouldn’t be in charge of something so important as the Elect’s final act of charity for all the world’s sinners.

He could have told them that basic human affect detection, the kind related to facial expression that most systems tried to emulate, usually tested below kappa values in studies. Without physiological inputs, it meant almost nothing. Every couple’s fight about speaking “in that tone of voice,” every customs officer’s groundless suspicion, all of it could be explained by that margin of error. In fact, he had told them that. Over and over again. He’d tested them with stock faces and told them to plot each face on an arousal/valence matrix. (They spent the afternoon in an “angry or constipated” argument.) He’d explained the nuances of the XOR function, how you needed to constrain the affect models down to the emoticon level in order for even multi-layered, non-linear perceptron networks to make a decision. Pain or pleasure? Laughter or crying? The machines had no idea.

A Turkish girl had died on a ferry crossing the Bosphorus because the machines had no idea. The system told the ferryman she looked pensive. He shot her. She’d just been through a breakup. Derek had written his thesis on the case. And now, New Eden wanted to build their failsafe on that uncertainty.

New Eden didn’t care, really, whether humans could tell the androids apart. What mattered to them was whether androids could tell humans apart. And that was hard. Harder than they could ever know. They kept saying humanity was like pornography: you knew it when you saw it. But Derek had