Do amorous and amoral robots threaten our values?

WARNING: Some of these videos might not be safe for work. (Depends on how anthropomorphic your boss is.)

Did you know that YouTube was alleged to have censored robot sex? I didn't, and I find myself a bit perturbed, not only because the video in question is pretty amusing, but because, as far as I am concerned, what is going on does not constitute sexual intercourse.

Take a look; the videos depict moving parts on what are clearly machines. No one would mistake them for humans.

I don't know whether this was prudishness on YouTube's part, and while I understand why they might be afraid of pornography, I see no way that this could legally be considered pornography.

So I looked again. YouTube seems to have reconsidered.

An article in the Washington Times raises the dicey issue of whether humans will marry robots, although the experts quoted admit that this won't be possible until the middle of this century. By then I'll be in my nineties if I'm alive, so I just can't get too worked up about culture war aspects of cybernetics -- especially whether robots might pose yet another threat to the "institution" of marriage.

Besides, what about the wording in the various marriage amendments which all recite that "marriage is between one man and one woman"? Assuming men and women are people (which I think they are), a robot is not a person, so it can't be called a man or a woman.

Presumably, a robot could be designed to look like either a man or a woman (or both, I suppose), and it could easily be programmed to be sexually responsive in either a heterosexual or homosexual manner. But as marriage between robots and humans would be legally impossible, I think cultural worries are premature.

What I'd really like to know is why so many people assume that once robots become intelligent and sentient beings, they'll necessarily want to take over. I saw the idea expressed again and again by commenters to a post Glenn Reynolds linked not all that long ago, and I culled these as typical examples:

  • Robots will be very "other" and may have far more power, thus they are far more likely to kill all humans than one human tribe is to kill most or all of another human tribe - and the latter has happened plenty of times.
  • why should sentient robots have any regard for humans or human rights? The concept of life and death doesn't apply to a robot; back it up, destroy the original, reload the memory into another copy and you're right back where you started from. Where would the tragedy be in a robot war? Similarly, we humans value our lives more than we value the lives of lesser creatures, e.g., cockroaches or cows. Who's to say that robots won't see us similarly?
    For all of these reasons, truly sentient robots that can self-replicate will probably spell the end of humanity.
  • One of the reasons that the notion of democracy bothers me is that successful genocide (and high breeding rate if you can pass on your views) is one of the win conditions in democracy.

    Instinctively, people know this. This combines with the realistic fear that one day robots will be better than us. It's also interesting how the cultural biases run: in Japan, people don't fear robots.

  • all law is legislated morality.
  • I am staring at the death knell of humanity in these comments
  • I don't fear superior people or superior technology, and I don't see the death knell of humanity. Besides, in an evolutionary sense, if superior beings were created by us and eventually surpassed us and did us in, they would stand on our shoulders, would they not?

Anyway, I tend to think that hypothetical hyperbole about the future is a bit silly, and I felt like leaving a comment along these lines:

I've noticed a recurrent theme in many of these comments, to the effect that robots will necessarily be superior to humans in all respects, and that because they will be unbeatable, humanity's destruction is assured. Now, I won't live to see it, but I'm skeptical about the idea that man will be able to design a superior being that he will be unable to destroy in the event of some ultimate showdown.

Should I care?

Actually, I do care (at least in the theoretical sense), in that I would like to see robot technology evolve and develop without interference from anthropomorphic busybodies. In that respect, I think Glenn is right to be worried about robophobia. If robots are our creation, they will always be an extension of us, and if each one of us is doomed anyway, isn't the creation of superior beings a bit like having superior children carry on?

But I suppose if we're really paranoid, we could always make them edible.

Or we could simply send disobedient robots back to the factory to be treated like Woody Allen.

posted by Eric on 12.09.09 at 02:03 PM


Fucking robots.

John Drake · December 11, 2009 1:58 AM
