2011/06/13

Dudes, Nuts, They're Robots

My whole damn family have been getting into Sarah Connor Chronicles, and, well, I must confess I occasionally watch a bit of it meself. Why does Sarah's narration always sound like a bad pastiche of Rod Serling? And why would you waste time with her "ontological mystery" episodes, since they distract from the main story? I'll even admit that Summer Glau grows on you when she's not forced to speak dialogue that was written by Whedon—I'm a sucker for an assassin droid with her cute little voice.

Whedon is not, however, un-implicated: the network killed Sarah Connor, after all, in order to give him a shot with Dollhouse. Which was a Whedon show that wasn't funny. I ask you, what is the bloody point? No wonder it got canceled.

Anyway, I was thinking about some of the premises of the Terminator franchise, and just, man, how lame are they? Like, what is Skynet's deal? I could see it trying to take over, a Zeroth Law Rebellion and so forth, but why is it wiping out humanity? I recall something to the effect that it's programmed to seek peace or end wars or something stupid, and that's why it wipes out mankind, but seriously, who would program that? Who would give a program like that access to tactical nukes, let alone the strategic arsenals? It all smacks of PlotInducedStupidity, because "The Author Is Making A Point".

And then there's the T-1000s: originally they were said to be "living metal", whatever the hell that's supposed to mean, but then it was retconned that they were a nanorobot swarm (probably because someone pointed out "living metal" would have the computing power of, well, an ingot). Only, then, what about the vitalism (the whole "only living tissue goes through" rule)? If we can send back nanorobot swarms, why not machine guns? Again, a mass limit would've made more sense—maybe you can send back a few small arms, but no tanks. Also, how does a nanorobot swarm form blades? Maybe they all align their corners to make a cutting edge, but I really doubt they could get enough density to go through bone or anything.

I do like how the time travel involves that spherical plasma shell: presumably the time machine patches together stress-energy tensor metrics on opposite sides of it, in order to prevent the time traveler being turned inside out by reversing direction in the fourth dimension (also, according to the Feynman-Stuckelberg interpretation of quantum physics, normal matter going backward in time is identical with antimatter, so that would be unpleasant, too). Nah, I'm kidding: nobody involved in Terminator has ever heard of the Feynman-Stuckelberg interpretation, let alone stress-energy tensor metrics. Hell, they've never even heard of Lucas-Penrose.
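(For the record, the textbook version of Feynman-Stuckelberg, which is standard quantum field theory and nothing the franchise itself says, is just a relabeling of phases. The time dependence of a plane-wave mode is $e^{-iEt}$; take the negative-energy branch and run it backward in time and you get
\[
e^{-iEt}\Big|_{E=-|E|,\ t\to -t} \;=\; e^{-i(-|E|)(-t)} \;=\; e^{-i|E|t},
\]
i.e. exactly the phase of a positive-energy mode going forward in time, which is what gets relabeled as the antiparticle. So "ordinary matter going backward in time looks like antimatter" really is the standard heuristic, at least at that hand-wavy level.)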

I, by the way, consider not addressing the Lucas-Penrose argument a grievous oversight by a franchise that deals with AI. It wouldn't have been too hard—it's entirely possible that Skynet, for all its smarts and apparent willfulness, is just a "weak" AI solution (which isn't ruled out by Lucas-Penrose). But it and its Terminators are frequently implied to be "strong" AI, and I'd like an explanation of why that's supposed to be possible. I don't like handwaving in SF, unless it's in the direction of real theories, and "just pretend Lucas-Penrose never existed" is not even a handwave. It may be popular with the mainstream AI field ("yeah, uh, these guys have determined that what we've been trying to develop is logically impossible" isn't really a grant-money magnet), but don't filmmakers claim to be critical of "Big Science"?
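(Quick sketch of the argument for anyone who hasn't met it; this is just the standard form, nothing specific to AI-in-film. Gödel's first incompleteness theorem: for any consistent, recursively axiomatized formal system $F$ strong enough to do arithmetic, there is a sentence
\[
G_F \quad\text{with}\quad F \nvdash G_F \ \text{ and }\ F \nvdash \neg G_F,
\]
where $G_F$ in effect asserts its own unprovability in $F$, and is therefore true if $F$ is consistent. Lucas and Penrose then argue that a human mathematician can see that $G_F$ is true, while any machine whose reasoning just is $F$ can never prove it, so the mind can't be equivalent to any such formal system, and "strong" AI by purely formal means is out.)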

Finally, all this stuff about whether the machines, especially Cameron (and the previous Doraemonators, as I like to call them), have feelings? Yeah, well, it's not really relevant. It wouldn't surprise me at all if you could make an AI with feelings exactly like yours—after all, a cat or dog has feelings exactly like yours. What makes humans human is not their feelings, but some combination of their free will and their reason—which is not just the ability to use logic, but to understand, which, hey, guess what, is the thing ruled out by Lucas-Penrose. But, re: feelings, feelings are a program. Love (as a feeling, not "caritas")? Seek out and aid mates, offspring, and allied conspecifics. Fear? Avoid danger. Anger? Destroy threats and rivals. As an AI says in one of my SF stories, "Just because they're not written in ISO 13211 doesn't mean they aren't programming."
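To make the "feelings are a program" point concrete, here's a toy sketch in plain Python (nothing ISO 13211-compliant, and every name in it is invented for the example; it's an illustration of the analogy, not a claim about how any real or fictional AI is built):

    # Toy illustration only: each "feeling" is a hard-coded stimulus -> behavior rule.
    # All names and categories below are made up for this sketch.

    KIN_AND_ALLIES = {"offspring", "mate", "pack member"}
    DANGERS        = {"fire", "predator"}
    RIVALS         = {"intruder", "challenger"}

    def love(other):
        """Seek out and aid mates, offspring, and allied conspecifics."""
        return "approach and aid" if other in KIN_AND_ALLIES else "ignore"

    def fear(stimulus):
        """Avoid danger."""
        return "flee" if stimulus in DANGERS else "ignore"

    def anger(stimulus):
        """Destroy threats and rivals."""
        return "attack" if stimulus in RIVALS or stimulus in DANGERS else "ignore"

    print(love("offspring"), fear("fire"), anger("intruder"))
    # prints: approach and aid flee attack

Each of those is just a fixed mapping from stimulus to behavior, which is all "program" means here; the understanding part, the thing Lucas-Penrose is actually about, is precisely what a lookup like that doesn't have.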

And that is what separates humans from animals, and also, presumably, from "weak" AIs ("strong" AIs being impossible without some kind of cheating)—we can question that programming. Armies didn't, for instance, use cows as war elephants, because a cow elephant will always submit to a bull. Nobody but gun control fanatics will claim women can't fight because they'll always submit to a man: the feeling might be there (primate dominance is very similar to that of elephants), but the woman's reason and willpower kick in and blam, Colonel Colt makes them equal.

1 comment:

penny farthing said...

Hooray for Ninja Turtles quotes!

I too have wondered about those aspects of Terminator, like how come they can't send guns (except if I recall, doesn't a terminator at some point have one concealed inside his leg tissue - because apparently muscle is strong enough to block the energy of the time travely thingy? For that matter, why can their metal skeletons go through, to say nothing of the T-1000?)

Also the AI does seem a bit weird, doesn't it? Skynet's programming is rather vague, and then the terminators can just be reprogrammed to help people, yet they seem to have some sort of will... ish? It's really not well thought out, is it? I still like those movies though, except for the third one, and the couple of Sarah Connor Chronicles I saw were pretty neat. As long as you don't think too much about it....