Thursday, March 20, 2008
Watch the video above of Big Dog balancing on rough terrain. Don't skimp. Go ahead and watch the entire video.
What did you feel when it was over?
I felt concern and a little bit of sadness. That's right, I was feeling empathy for a machine. When the "dog" stumbled on the ice and was kicked, I felt bad. I know it doesn't have the capacity to feel pain, but I had these unreasonable emotions anyway. It just moved too much like a dog, though it has none of the cuteness of a real dog, for me to separate the rational part of my mind from my emotions.
I've felt this way before about something much less real. In games I've grown attached to characters and pets. Who didn't try to protect Dogmeat in Fallout or the Companion Cube in Portal? It's ingrained in us to project human emotions onto things that are non-human. Our perceptions of the world color our view and sometimes make us act irrationally.
But what it really got me thinking about is a comment I heard on the SGU podcast a few weeks ago. If we develop machines and program them to "feel" emotions, when do they develop sentience, and how would we determine it? Do we assume that because they're programmed they aren't really alive or aware? When is it cruel to shut them off or "hurt" them? And is artificial programming any different from the programming we've received through nature? If there's no duality, and thus no spirit, where do we draw the line between us and artificial life? Even if they're not human, and they never will be human, what are the ramifications of creating intelligent creatures that feel emotions?
And, of course, that also spawns thoughts about animals, their emotions, and our treatment of them. Many people have argued that animals can't act rationally, which I disagree with. Even though our perception may be flawed, and I may be projecting my own human explanations onto animal behavior, I see no way to deny that animals feel emotions, feel hurt, and have intelligence. Will our creations, if we ever reach that level, mean the same to us as animals do, creatures that are not really "ours" in any way? Should we have more responsibility toward them, or less? And is it time that we re-evaluate our attitude toward animals as we approach the reality that we might one day create some sort of artificial sentience?