AI to Ai to Ai

"Meeting AI, and Love" (AIと会いと愛). Thoughts upon artificial persons. Also, post #400!
  • It's surprising to me how few manga and anime use robot characters for the "kuudere" type (the seemingly emotionless girl who gradually warms up), despite it being an obvious match. They far too often just go the straight eager-to-please girl route, which, while certainly a valid interpretation of the concept of AI (especially for service-'droids designed for the general public), is just less rewarding. Maybe that's just me, though; I have a bit of a kuudere fetish.

    But I think the kuudere idea makes sense for AI, from a science-fictional standpoint. See, emotions are cognitive shortcuts, basically pre-loaded sets of hormonal instructions so your brain doesn't have to manually send operating parameters to your endocrine system. And a "smart" AI, one that learns from its actions, would be quite likely to develop emotions for much the same purpose, to take computational load off its processor while interacting with humans. All it would have to do to avoid emotional behavior resulting in danger (and it would know this) is to always give higher priority to its "safety space" programming than to its emotion programming (that humans can't do that without effort could be an interesting plot element in its own right).
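That priority scheme—safety programming strictly outranking emotion programming—can be sketched as a simple arbitration loop. All the subsystem names, actions, and priority numbers below are invented for illustration:

```python
# Minimal sketch of strict priority arbitration: a safety proposal always
# overrides an emotional one, by construction, no matter how strongly the
# emotion subsystem "fires". Names and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Proposal:
    source: str    # which subsystem produced this action
    action: str
    priority: int  # lower number = higher priority

SAFETY_PRIORITY = 0   # safety programming always outranks...
EMOTION_PRIORITY = 1  # ...the emotional shortcuts

def arbitrate(proposals: list[Proposal]) -> Proposal:
    """Pick the highest-priority proposal; ties go to the first listed."""
    return min(proposals, key=lambda p: p.priority)

proposals = [
    Proposal("emotion/anger", "raise_voice", EMOTION_PRIORITY),
    Proposal("safety", "disengage_and_withdraw", SAFETY_PRIORITY),
]
chosen = arbitrate(proposals)
assert chosen.source == "safety"  # emotion never outvotes safety
```

The point of the sketch is that the override is structural, not a matter of willpower—which is exactly the capacity humans lack without effort.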
  • How come we only ever—even back to R.U.R.—use robot stories for class-allegories (or occasionally race-allegories, generally rather ham-handedly)? Come to think of it, I wonder if it's possible for a robot story not to be at least somewhat leftist—can you make a Libertarian story about sapient entities that are treated as property? Somehow I doubt it. Unless it were to be in favor of exploiting the AIs—the difference between Libertarians and Conservatives is that the latter deny the leftist oppression-narrative, pointing out that capitalism benefits both employee and employer...while the former simply decide to posture as Marxism's version of Anton LaVey.

    Ayn Rand always glossed over the question of the working class—and capitalism really does have something to say for itself; go read Thomas Sowell, or for that matter Frédéric Bastiat—by the simple expedient of pretending "entrepreneurs" can actually accomplish a damn thing without a small army of employees. Again, Dagny Taggart couldn't lay one damn inch of track, sorry.

    But race- and class-stories, aside from being overdone in every other medium, aren't the only or even the best themes you can explore with robots. You could make some doozies about bioethics, IVF, designer babies, and the concept of personhood generally. Of course, "you treat people like dolls" is the major statement that sort of story has to make, and there's a multi-million-dollar reproduction-boutique industry, with at least hundreds of thousands of clients, that prefers not to be reminded that their business model was borrowed from the Build-A-Bear Workshop.
  • I keep looking for something from D&D 3E that I can use as a fantasy equivalent to AIs. I might have to buckle down and use the @#$%ing Warforged from @#$%ing Eberron, which setting I consider unworthy to serve as toilet paper (sorry, but if your divine magic is powered by "faith", you should be beaten to death with the complete works of Søren "Clap Your Hands If You Believe" Kierkegaard).

    I might just use the stats of the Medium Animated Object; the magically-animated doll as a character has a long and glorious tradition in our literature, and seven of the greatest characters in anime (not to mention any other medium) are precisely that (well, six greatest characters, and Canaria). I'd just have to come up with a special version of the spell that creates a mind for the object (presumably it has to be cast on a mannequin)—probably it actually binds an "Outsider" to the object, or maybe...

    As I write this, I have hit on the answer. A combination of animate object and simulacrum, with a specially-made statue standing in for the snow that forms the simulacrum's body—basically it'd be the fantasy version of mind-uploading AI. I guess it'd come under the "Craft Wondrous Item" feat? And, presumably, at least as hard to make as one of the better golems—the resulting magical android wouldn't be as strong as one, but golems don't have minds.
  • What's with people in stories resenting robots? Even the ones that only act according to programming, I mean. Then again, there really are people in this world whose behavior around guns (sometimes also swords) is only explicable—that is, not completely without cause—if you assume that they're attributing moral agency to weapons.

    Now, any robot that either lacks free will, or can have its free will circumvented under certain circumstances, is not what anyone remotely rational—that is, not batshit insane—would be resenting. They'd resent whatever yahoo programmed the robot, or circumvented its free will. I don't care how devastated some fictional country was by mechanized armies, nobody would be stupid enough to have prejudices against robot soldiers that remotely mirror those people have against other people. People might find the sight of the mech-armies terrifying, they might prefer to have them banned, but they would not behave toward them in the manner that, say, Serbs behave toward Turks, or Poles toward Germans.

    This device's overuse is what Jeff Cooper was talking about when he coined the term "hoplophobia". Fundamentally there's no difference between this idea that people would resent robots (if those robots were in no way acting of their own will) and the idea that those few, possibly apocryphal, tribes that put axes on trial for murder are the norm of the human race. Even I give people more credit than that, and I'm a gargantuan misanthrope.
  • And if it comes to that, how horrible would a mechanized army really be? Android soldiers don't rape, they don't loot, they don't even have to raid the silo or stock-pens to feed themselves. Those are the things people resent in wars; anyone with emotional maturity exceeding that of a damn third-grader knows that people die in battles. Seriously, read history—it's not, primarily, the directly bereaved who resent soldiers, it's those who are victimized by them in other ways. The actual fighting is a contest with rules every human knows instinctively, including that "getting killed" is the main lose-condition; only people whose minds have been deliberately warped, by themselves or other people, don't understand that.

    Of course, it's doubtful any healthy civilization would ever use large numbers of android soldiers, or, for that matter, much in the way of drone-strikes; there's too much temptation to play god when one can smite with impunity. Automated weapon systems of any kind, aside from the issues of them being hackable and jammable, are probably at least a temptation to immorality (if not actually immoral), unless used in direct support of flesh-and-blood troops. If you don't have to commit actual people to military operations, you're vastly more likely to get trigger-happy in your foreign policy.
  • I forget where it was, but a while back I read a thing about how Terminators make no sense as battle-bots—though the infiltration-models make sense, it's later revealed that huge swaths of Skynet's forces are just fleshless Terminator skeletons, and that's crazy. However, the alternative the writer described—essentially a mechanical centipede that moves "too fast for the eye to follow"—suffers from a much more basic problem. Namely...can you miniaturize a fission plant enough to power that? (Also, just as with alien life, more limbs means more processor-load and more power-drain; four is a good compromise, especially for anything that'll have to manipulate tools.)

    See, a major issue for bots is power supply. The robots in my books very seldom move any faster than zledo (which is to say, no faster than jaguars with Marine training); though they can keep up with highway traffic for very short sprints, it drains their power rapidly. A robot that moves twice as fast as a human is going to have to use twice as much energy as a human, assuming it masses the same as a human—as with spaceships, thermodynamics is a harsh mistress, who knows no forgiveness and brooks no argument.
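The scaling above can be made concrete with a back-of-the-envelope sketch. The 80 kg mass and the constant resistive force are illustrative assumptions, not figures from the books; under a constant-force model, sustained power scales linearly with speed, while the kinetic energy of each sprint scales with speed squared:

```python
# Back-of-the-envelope energy budget for a human-massed robot.
# Assumes a constant resistive force (limb friction, ground losses), so
# sustained power scales linearly with speed, while the kinetic energy
# of each acceleration scales with speed squared. All figures are
# illustrative assumptions.

MASS_KG = 80.0             # roughly human-massed android (assumption)
RESISTIVE_FORCE_N = 100.0  # illustrative friction/ground-loss figure

def sustained_power_watts(speed_m_s: float) -> float:
    """Power to hold a constant speed against a constant resistive force."""
    return RESISTIVE_FORCE_N * speed_m_s

def sprint_kinetic_energy_joules(speed_m_s: float) -> float:
    """Energy just to reach the given speed, before any losses."""
    return 0.5 * MASS_KG * speed_m_s ** 2

human_run = 4.0  # m/s, a brisk human run
robot_run = 8.0  # m/s, "twice as fast as a human"

# Doubling speed doubles the sustained power draw...
assert sustained_power_watts(robot_run) == 2 * sustained_power_watts(human_run)
# ...but quadruples the energy cost of each acceleration.
assert sprint_kinetic_energy_joules(robot_run) == 4 * sprint_kinetic_energy_joules(human_run)
```

Which is why short highway-speed sprints are plausible while sustained highway-speed running is a battery-killer: the acceleration cost compounds on top of the elevated cruising draw.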
  • Mass is another issue for androids. If they're as light as humans—and they'd have to be built roughly to human scale, just to interact with human environments safely—they're probably going to be made of materials roughly as resilient as bone and tendon (which are, I hate to tell you, actually crazy resilient—we still can't make prosthetics that quite match bone's structural performance).

    That's good news for writers, since it means fight-scenes with them are "man vs. other-man-hopped-up-on-something-scary" rather than "man vs. avatar of a battle-god", but it's bad news for the realism of most robot fiction—most of our bots are just unrealistically strong for their weight. Data, for instance, would only be able to use the strength he's displayed by bypassing his operational parameters; flouncing around with that kind of strength 24-7 would just put unnecessary stress on his actuators, put people in danger, and shorten his battery-life.
