I'd never heard of Paro, a six-pound robot modeled on a baby harp seal, until I read about him (for me, Paro is a "he") in the New York Times on Monday. Paro was developed in Japan for use with the elderly, especially those with dementia. If you go to the company website you can see videos of nursing home residents holding and petting Paro. A man said to have been non-communicative is shown singing to Paro.
Paro has internal sensors, responds to his name, and can apparently adapt to the preferences of the person interacting with him. The robot was created as a source of therapeutic contact for Japan's rapidly expanding "old/old" population.
So: is Paro a humane creation or another sign of our loss of humanity? Is he an ethically acceptable invention or a monstrosity? The Times discussed the ethics of Paro with Sherry Turkle, Professor of Psychology at MIT.
As the technology improves, argues Sherry Turkle, it will only grow more tempting to substitute Paro and its ilk for a family member, friend, or actual pet in an ever-widening number of situations:

"Paro is the beginning," she said. "It's allowing us to say, 'A robot makes sense in this situation.' But does it really? And then what? What about a robot that reads to your kid? A robot you tell your troubles to? Who among us will eventually be deserving enough to deserve people?"

Last year I wrote about these questions in a post about CosmoBot, a 16-inch-tall robot used in treating children with severe autism:

"Experimenting with robot caretakers could seem like an ultimate form of dehumanization. In my view, the robots themselves are ethically admirable. The ethical uncertainty is how we humans use the robots. Ventilators are a kind of primitive robot carrying out a single repetitive function. When we use them well we help sick people recover and save lives. When we use them mindlessly (robotically) we flog patients and prolong the dying process."

Perhaps I have a bit of the robot in me: I'd say exactly the same thing again!
As a kid I loved Ray Bradbury's story "Marionettes, Inc.," in which a husband who wants to leave his wife but doesn't want to hurt her purchases a robot of himself. On the last night before departure he feels a tender anticipatory sadness and puts his head against her chest. He hears a robotic "tick, tick," not a human heart.
There's no doubt that technologies like Paro or the robots in Bradbury's story could undermine deeply held human values. If we give grandma a Paro and stop visiting her, we're committing a moral wrong, even if grandma takes just as much pleasure in Paro as in our visit. We owe grandma our best human effort, and we owe ourselves a commitment to learn all we can from her.
Sherry Turkle is right to imagine the possibility of a slippery slope of progressive detachment from those we should be closest to. That could happen, and probably has already happened, since Paro has been marketed since 2004 and more than 1,000 are in use in Japan. But Eileen Oldaker, the focus of the New York Times story, used Paro to supplement the loving visits she made to her mother. Paro was an add-on, an extension of her caretaking attention, not a replacement.
I'm comfortable with the argument I'm making here, but I'm aware of an apparent inconsistency with my views on gun control. The NRA argues that guns aren't bad; bad people misuse them. They're right. But for me the magnitude of harm bad people create with handguns and automatic weapons justifies restricting access to them. My impression is that thus far the Paros and CosmoBots of the world have done much more good than harm. If we see an epidemic of Paro-induced neglect of the elderly like the epidemic of gang shootings we've recently seen in Boston, I'll be on the side of Paro-control.