“Robotic companions are being promoted as an antidote to the burden of longer, lonelier human lives. At stake is the future of what it means to be human.” M. Jackson, The New York Times

ESL Voices Lesson Plan for this post with Answer Key

Excerpt: Would You Let a Robot Take Care of Your Mother? By Maggie Jackson, The New York Times

“After Constance Gemson moved her mother to an assisted living facility, the 92-year-old became more confused, lonely and inarticulate. Two full-time private aides, kind and attentive as they were, couldn’t possibly meet all their patient’s needs for connection.

So on a visit one day, Ms. Gemson brought her mom a new helper: a purring, nuzzling robot cat designed as a companion for older adults. “It’s not a substitute for care,” says Ms. Gemson, whose mother died last June at age 95. “But this was someone my mother could hug and embrace and be accepted by. This became a reliable friend.” When her mom was upset, her family or helpers brought her the cat to stroke and sing to, and she grew calmer. In her last days “what she could give, she gave to the cat,” says Ms. Gemson.

An aging population is fueling the rise of the robot caregiver, as the devices moving into the homes and hearts of the aging and sick offer new forms of friendship and aid…Winsome tabletop robots now remind elders to take their medications and a walk, while others still in the research-prototype stage can fetch a snack or offer consoling words to a dying patient… Yet we should be deeply concerned about the ethics of their use. At stake is the future of what it means to be human, and what it means to care.

Issues of freedom and dignity are most urgently raised by robots that are built to befriend, advise and monitor seniors. This is Artificial Intelligence with wide, blinking eyes and a level of sociability that is both the source of its power to help and its greatest moral hazard.

When do a robot assistant’s prompts to a senior to call a friend become coercion of the cognitively frail? Will Grandma’s robot pet inspire more family conversation or allow her kin to turn away from the demanding work of supporting someone who is ill or in pain? “Robots, if they are used the right way and work well, can help people preserve their dignity,” says Matthias Scheutz, a roboticist who directs Tufts University’s Human-Robot Interaction Lab. “What I find morally dubious is to push the social aspect of these machines when it’s just a facade, a puppet. It’s deception technology.”

For that is where the ethical dilemmas begin — with our remarkable willingness to banter with a soulless algorithm, to return a steel and plastic wink. It is a well-proven finding in the science of robotics: add a bit of movement, language, and “smart” responses to a bundle of software and wires, and humans see an intentionality and sentience that simply isn’t there. Such “agency” is designed to prime people to engage in an eerie-seeming reciprocity of care.

Social robots ideally inspire humans to empathize with them, writes Maartje de Graaf of the University of Utrecht in the Netherlands, who studies ethics in human-robot interactions. Even robots not designed to be social can elicit such reactions: some owners of the robot vacuum Roomba grieve when theirs gets “sick” (broken) or count them as family when listing members of their household.

Many in the field see the tensions and dilemmas in robot care, yet believe the benefits can outweigh the risks. The technology is “intended to help older adults carry out their daily lives,” says Richard Pak, a Clemson University scientist who studies the intersection of human psychology and technology design, including robots…

We know little about robot care’s long-term impact or possible indirect effects. And that is why it is crucial at this early juncture to heed both the field’s success stories and the public’s apprehensions.”