Empathy for Non-Humans?  

Thursday, March 20, 2008



Watch the video above of BigDog balancing on rough terrain. Don't skimp. Go ahead and watch the entire video.

What did you feel when it was over?

I felt concern and a little bit of sadness. That's right, I was feeling empathy for a machine. When the "dog" stumbled on the ice and was kicked, I felt bad. I know it doesn't have the capacity to feel pain, but I had these unreasonable emotions. It just moved too much like a dog, though it has none of the cuteness of a real dog, for me to separate the rational part of my mind from my emotions.

I've felt this way before from something much less real. In games I've grown attached to characters and pets. Who didn't try to protect Dogmeat in Fallout or the Companion Cube in Portal? It's ingrained in us to project human emotions onto things that are non-human. Our perceptions of our world color our view and sometimes make us act irrationally.

But what it really got me thinking about is a comment I heard on the SGU podcast a few weeks ago. If we develop machines and program them to "feel" emotions, at what point do they develop sentience, and how would we determine it? Do we assume that because they're programmed they aren't really alive or aware? When is it cruel to shut them off or "hurt" them? And is artificial programming any different from the programming we've received through nature? If there's no duality, and thus no spirit, where do we draw the line between us and artificial life? Even if they're not human, and they never will be human, what are the ramifications of creating intelligent creatures that feel emotions?

And, of course, that also spawns thoughts about animals and emotions and our treatment of them. Many people have argued that animals can't act rationally, which I disagree with. Even though our perception may be flawed, and in my case I may be explaining animal behavior through the lens of my own human experience, I see no way to deny that animals feel emotions, feel hurt, and have intelligence. Will our creations, if we ever reach that level, mean the same to us as animals, creatures that are not really "ours" in any way? Should we have more responsibility or less? And is it time that we re-evaluate our attitude towards animals as we approach the reality that we might one day create some sort of artificial sentience?



14 comments to "Empathy for Non-Humans?"

  • Lifeguard
    Thursday, March 20, 2008 at 7:50:00 AM CDT  

    In Consciousness Explained, Daniel Dennett asks at one point: If I could program a robot to look, speak, behave, and react exactly like a human being, would you be willing to say that it is conscious? If not, are you SURE you could say that? How different is that from a robot made up completely of organic matter?

    I see it as the reverse of the philosophical "zombie problem": how can I be sure I'm not the only conscious being in a world of androids? If you turn it around, how can we be sure that we're not all blindly programmed androids?

    I think this is a FASCINATING question, and I'm so glad you posted this. Really.

  • Lifeguard
    Thursday, March 20, 2008 at 7:57:00 AM CDT  

    Oh, and if you haven't checked it out yet, watch the movie "Artificial Intelligence." It dovetails nicely with this post.

  • Ordinary Girl
    Thursday, March 20, 2008 at 9:30:00 AM CDT  

    Lifeguard: Thanks. I find what makes us human to be a fascinating subject, and the more I think about it, the more I'm drawn to the conclusion that we're not all that unique or special. Animals, and possibly one day machines, exhibit behaviors that we once thought to be purely human.

  • PhillyChief
    Thursday, March 20, 2008 at 10:29:00 AM CDT  

    That thing is fucking cool. I'm more fascinated with the movement than the philosophical issues of empathy and machines obtaining consciousness. That thing is awesome, mimicking animal quadruped movement beautifully. The recovery on the ice, getting kicked and then that jump at the end, that was all fantastic. Most of what I've seen, aside from Asimo, is mimicking insect movement instead, which is still confusing (I spent many a day scrubbing cockroach videos back and forth until I got down the sequence).

    As far as consciousness, yeah, I have no problem acknowledging that it's possible, whether hardware based, software based, or some hybrid of the two (I listen to their podcasts, too). I'd grant them full and equal rights the moment they exhibited both self-awareness and an understanding of rights. This makes me want to watch Blade Runner now.
    "Have you ever retired a human by mistake?"
    "No."
    "But in your field, that is a risk..."
    "Is this supposed to be an empathy test? Capillary dilation of the so-called blush response? Fluctuation of the pupil. Involuntary dilation of the iris... Demonstrate it. I want to see it work... indulge me."

  • The Exterminator
    Thursday, March 20, 2008 at 2:10:00 PM CDT  

    Fascinating little video. But any feelings I have are for the creature the gizmo represents, while all the time being fully aware that I'm watching a mechanical thing. I guess I haven't evolved to the point where I get emotional about robots.

    I do love my microwave, though.

  • PhillyChief
    Thursday, March 20, 2008 at 2:18:00 PM CDT  

    You can love your microwave, just don't LOVE your microwave. ;)

  • The Exterminator
    Thursday, March 20, 2008 at 2:25:00 PM CDT  

    Philly:

    I spill enough shit in there without adding to it by "spilling my seed."

  • Lynet
    Thursday, March 20, 2008 at 4:18:00 PM CDT  

    Would a robot have the same fear of death that we do? After all, we developed that fear over years of evolution. There isn't necessarily any reason why a constructed intelligence would have it.

    Same goes for feelings, actually. I, too, am pretty sure animals feel, but expecting a robot to feel in the same way we do might be silly. Perhaps everything has 'proto-feelings' of some sort, and perhaps only evolutionary processes would be likely to produce 'feelings' of the sort that we would recognise.

  • PhillyChief
    Thursday, March 20, 2008 at 4:22:00 PM CDT  

    But it could be intelligently designed with all that. ;)

  • Lifeguard
    Thursday, March 20, 2008 at 10:18:00 PM CDT  

    Well, I think that's the question: could you design or program that? Whether feelings evolved or not, can we say that our feelings are nothing more than mechanistic reactions to information we gather from the environment? Functionally speaking, isn't that all they are? And if we could program something to function like that, how would you know that it wasn't as fully conscious as we are (or think we are)?

    If consciousness is an awareness of yourself as an agent, an awareness that we learn about through those same senses, then I don't think it's inconceivable that we could create a fully conscious robot at some point in the distant future.

    Okay... WAY distant future.

  • Venjanz
    Thursday, March 20, 2008 at 11:22:00 PM CDT  

    I actually winced when that guy kicked it.

    So how much longer before they rise up and destroy us?

    http://www.theonion.com/content/video/in_the_know_are_we_giving_the

  • Lifeguard
    Friday, March 21, 2008 at 6:55:00 AM CDT  

    "So how much longer before they rise up and destroy us?"

    It's going to be at least two weeks...

  • the chaplain
    Friday, March 21, 2008 at 7:14:00 PM CDT  

    "So how much longer before they rise up and destroy us?"

    I'm having visions of Cylons and a spacecraft named Galactica in my future.

  • PhillyChief
    Saturday, March 22, 2008 at 9:13:00 AM CDT  

    You're right, starting next Friday night. ;)
