The Elderly Fear Their Future Robot Friends Will Corrupt Children

Senior citizens’ hesitance about using caretaking robots comes from a fear that their grandchildren will become emotionally dependent on the machines.

If all goes according to plan, grandparents will soon be lawn bowling with robots.

The elder care industry has recently struggled to keep pace with the swelling numbers of retiring Boomers, and the problem is expected to reach crisis levels in the coming years. With hopes of averting a disaster, techno-futurists are at work designing the best robotic caretakers possible. The machines have already been introduced on a wider scale in Japan, where the government has funneled billions of yen into the production of custodial robots capable of leading older folks around elder care facilities, “assisting with their toilet needs,” and locating the bold geriatrics who decide to walk off the premises.

But there appears to be at least one major blind spot in the sci-fi master plan. Surprisingly, senior citizens in the United States are not especially concerned about the effects the robots will have on them, according to a recent survey of 640 retirees over the age of 60 (average age: 68) published in the proceedings of the Association for Computing Machinery’s CHI conference. They are instead worried about how the machines might affect younger generations, like their children and grandchildren.

People frequently subscribe to the notion that they will not suffer from the effects of a certain form of media or technology but are certain that it will have a negative effect on others. This phenomenon is known as the “third-person effect.” “The greatest negative effects are predicted to occur among imagined audiences that are socially distant from the individual’s own reference group,” the Pennsylvania State University media researchers explain in their paper. “For the population of interest in this study, namely senior citizens, the obvious ‘other’ group is younger people.”

Though the older respondents believed they would be invincible in the face of the robots’ deleterious charms, they were convinced younger people would suffer at the hands of the bots. That belief might make the technology less appealing on the whole: according to research on other third-person effects, the feeling usually results in avoidance of the material in question. This could have significant implications for the impending roll-out of robotic elder care.

The observed third-person effect actually predicted the participants’ level of interest in using “companion robots,” machines meant for socialization rather than chores or other assistive tasks. Specifically, they feared that this type of robot would “foster emotional and physical dependence among younger users.”

Older adults often have similar protective opinions about porn, hip-hop music, video games, and that new-fangled i-gadget-thingamajig. “Today’s youngsters are seen by older adults as hapless victims of new technologies, inexorably addicted to their gadgets and unable to carry out normal social interactions,” the researchers write. So, robots were a natural next target.

But the media researchers believe specific design fixes can be made to “companion robots” to overcome seniors’ fears that the machines will take over their children’s lives. In their paper, they suggest arming the robots with a sort of grandparental control mechanism, which would let seniors limit the amount of time the robot spends with their children or grandchildren.

“Bottomline is that senior citizens ought to feel like that this robot will not be attractive to youngsters,” Shyam Sundar, co-director of Penn State’s Media Effects Research Laboratory and an author of the study, says in an email to Pacific Standard. “The robots could look like or have dialogue scripts and other functions that are attractive to seniors but not to young people. A middle-aged robot, for example!”
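To make the time-limit idea concrete, here is a minimal sketch of what such a grandparental control might look like in code. Everything in it, from the class name to the role labels and the half-hour cap, is hypothetical: the researchers propose the concept, not an implementation.

```python
# A minimal sketch of the "grandparental control" idea described above:
# a daily time budget that a senior owner could set for younger users.
# All names here (UsageLimiter, the role labels, the caps) are hypothetical;
# the study suggests the concept, not this implementation.

import time
from collections import defaultdict


class UsageLimiter:
    """Tracks how long each user role has interacted with the robot today."""

    def __init__(self, daily_limit_seconds: dict[str, float]):
        # e.g. {"grandchild": 30 * 60} caps grandchildren at 30 minutes per day
        self.daily_limit_seconds = daily_limit_seconds
        self.used_today = defaultdict(float)
        self._session_start: dict[str, float] = {}

    def start_session(self, user_role: str) -> bool:
        """Return True if this user is still within today's allowance."""
        limit = self.daily_limit_seconds.get(user_role)
        if limit is not None and self.used_today[user_role] >= limit:
            return False  # budget exhausted; the robot should decline politely
        self._session_start[user_role] = time.monotonic()
        return True

    def end_session(self, user_role: str) -> None:
        """Record the elapsed interaction time for this user role."""
        start = self._session_start.pop(user_role, None)
        if start is not None:
            self.used_today[user_role] += time.monotonic() - start

    def reset_day(self) -> None:
        """Clear the daily tallies, e.g. at midnight."""
        self.used_today.clear()


# Example: a senior sets a half-hour daily cap for grandchildren,
# while leaving their own use unlimited.
limiter = UsageLimiter({"grandchild": 30 * 60})
if limiter.start_session("grandchild"):
    print("Playtime allowed")
    limiter.end_session("grandchild")
```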

Though Carnegie Mellon University roboticist Jim Osborn had not heard of these concerns before, he says he can certainly imagine a kid annoying his grandmother with incessant requests to play with a robot. Besides just switching it off, which he maintains may not be such a bad idea, “the more elegant solution of tuning the robot’s interactions to each individual it encounters” might not be too far off from reality, he writes in an email.

He points to a therapeutic children’s robot called Romibo, which was developed at Carnegie Mellon and is now being launched in the private sector. “It has two modes: one for older adults and one for young people,” says Osborn, who directs the university’s Quality of Life Technology Center. “More precisely it has two palettes of motion commands and sound files that it plays (songs, jokes, questions, etc.).” Other pieces of the equation are in place too, he says. “[I]t is not difficult to give a robot the ability to recognize a human face, nor is it difficult to limit ranges of movement, speeds of movement, etc., for safety’s sake.” Here’s Romibo in action:

https://www.youtube.com/watch?v=knO0UQZ_6HQ
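Osborn’s description suggests a simple structure: figure out who the robot is facing, then draw behaviors from the palette assigned to that audience. The sketch below illustrates only that structure; the palettes, the identify_user() stub, and the speed caps are invented for illustration and are not Romibo’s actual software.

```python
# A rough illustration of the "two modes" idea: once the robot decides
# (by whatever means, such as face recognition) which kind of user it is
# facing, it draws from a different palette of sounds and motions.
# The palettes, labels, and stub below are hypothetical.

import random

# Two palettes of canned behaviors, one per audience.
PALETTES = {
    "older_adult": {
        "sounds": ["classic_song.wav", "trivia_question.wav"],
        "motions": ["slow_nod", "gentle_turn"],
        "max_speed": 0.3,  # conservative movement for safety
    },
    "young_person": {
        "sounds": ["joke.wav", "pop_song.wav"],
        "motions": ["wiggle", "spin"],
        "max_speed": 0.8,
    },
}


def identify_user() -> str:
    """Stand-in for a face-recognition step; returns an audience label."""
    return "older_adult"  # placeholder


def next_behavior(audience: str) -> tuple[str, str, float]:
    """Pick a sound, a motion, and a speed cap from the matching palette."""
    palette = PALETTES[audience]
    return (
        random.choice(palette["sounds"]),
        random.choice(palette["motions"]),
        palette["max_speed"],
    )


sound, motion, speed = next_behavior(identify_user())
print(f"Play {sound}, perform {motion} at speed cap {speed}")
```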

Amidst all this talk of tailoring robots to different kinds of human interaction, beyond mere task performance, it’s hard not to feel as though society is gradually devolving into an atomized, post-apocalyptic landscape where humans are closer to machines than to their own species. In some ways, the thought seems to reinforce a haunting encroachment we can already feel. Are we really going to shove our grandma into the lifeless hands of some robot, too?

This thinking is not without its critics. Sherry Turkle, an MIT professor who studies the relationship between technology and society, is discomfited by the thought that a human, especially one with declining mental faculties, could be convinced they’re having a genuine emotional exchange with a machine. In her studies of Paro, a Japanese baby seal robot designed “to have a calming effect on patients with dementia, Alzheimer’s and in health care facilities,” Turkle was “troubled when she saw a 76-year-old woman share stories about her life with the robot,” according to the New York Times.

“I felt like this isn’t amazing; this is sad. We have been reduced to spectators of a conversation that has no meaning,” she told the Times. “Giving old people robots to talk to is a dystopian view that is being classified as utopian.”

Osborn tends to agree with Turkle’s stance that these types of robots should not serve as a replacement for human contact. Instead, he says, they can effectively “fill in time gaps”: stretches of solitude in an older person’s day that, until now, may have gone unaddressed entirely. Besides that, there are other, more practical considerations. “Say for instance that I can’t reach my mother by phone, though I’m sure she’s home,” he says. “Did she fall? I could find out via the robot.”

Illusory relationships, Sundar argues, can actually make people quite happy, even if they aren’t, well, real. Take, for example, the affairs people carry on with celebrities or network journalists. In social science parlance, these one-sided interpersonal friendships are called parasocial relationships.

“In [the] 1950s, this was viewed as a psychiatric condition, but over the years, we have come to recognize as normal the human tendency to treat the weather reporter on the evening news or the soap opera character as social acquaintances,” he says in an email. “Robots are more sentient than these media characters and therefore likely to provide [a] greater sense of social presence and companionship.”

That may be true. But at least the weatherman is human.
