24/01/2022

THAILAND DAILY

Should we allow robots to become part of the family?

Nobuko Kobayashi is a partner with EY Strategy and Consulting Co., Ltd., Strategy and Transactions — EY-Parthenon.

We live in the age of robots — what started with the simple automation of manual labor has transformed the white-collar workplace. Today, robots are our assistants, colleagues and even our bosses.

Their reach, however, does not stop there. Robots are entering our homes, and I am not talking about Roomba, the robot vacuum cleaner. Robots are becoming our life companions.

As early as the 1940s, science fiction writer Isaac Asimov, alarmed that robots might one day deviate from their intended purpose of helping humans, devised his famous three laws of robotics: first, a robot may not injure a human being or, through inaction, allow a human being to come to harm; second, a robot must obey human orders except where such orders would conflict with the first law; third, a robot must protect its own existence as long as such protection does not conflict with the first or second law.

Understandably, Asimov was thinking of robots as highly evolved tools. Today, his laws seem too blunt to protect us from the more nuanced and complicated consequences of making robots our emotional equals. There are a few things to consider when it comes to keeping robots on the straight and narrow.

The first dilemma regarding social robots is tied to the reason for their existence in the first place: to be a companion. We humans are prone to anthropomorphizing. As the renowned robotics professor Hiroshi Ishiguro has observed, a robot has a soul if the human interacting with it perceives one. If we want to regard social robots as willful peers, we can, and we will. But the question is, should we?

Even though we know that a robot's communications are the product of algorithmic computation, do we feel deceived? If we do, does that not negate the whole purpose of social robots, which is to nurture a meaningful relationship?

Then comes the next inconvenient truth: smart robots make us dumb. According to Nicholas Carr, author of The Glass Cage, when computer aids relieve humans of routine work, they also deprive them of practice, rendering professionals from doctors to pilots less accomplished.

And because robots excel at extracting generic patterns with precision, they suppress the specificity of unique situations. Consequently, we might blindly follow a robot that dismisses an important signal because it does not fit a formulaic pattern. Over time, humans who become over-reliant on robots may lose the ability to read nuance and subtlety.

If this is true of work skills, why should it not be so for emotional intelligence? Would a child who only played with a robot develop the same emotional dexterity to navigate the complex web of human-to-human interactions? Would that child understand that a love that comes too easily may not be love after all?

Finally, there is an agency problem. While serving as de facto family members, social robots simultaneously serve the external brains that control their programming.

This is particularly worrisome when it comes to the elderly, a prime target for social robots. Elderly consumers are one of the few remaining growth areas for consumer goods companies. Would not a cute animal-shaped robot be a perfect agent to collect the innermost thoughts of an elderly person and then promote a range of products and services tailored to that person’s needs?

The elderly are a prime target for social robots. (Photo by Tomoki Mera)

It could be argued that the dumbing-down effect of human-to-robot interactions for the elderly should not be a big problem, but who decides? And what if humans are able to walk the fine line between fantasy and reality? Surely talking to a robot is better than nothing? And can we navigate the agency problem with a set of regulations around transparency and privacy protection?

Japan is at the forefront of pioneering the social robot industry. Its strength in mechatronics makes it one of the top producers of industrial robots. Japan has also pioneered emotional acceptance — from Astro Boy to cat-shaped Doraemon, Japanese pop culture is rife with helpful “best friends forever” robots.

This makes Japan an ideal testing ground. One recent success is LOVOT, a Teletubby-like robot on wheels made by AI startup Groove X, equipped with more than fifty sensors and liquid crystal eyes.

Owing to its deep learning technology, LOVOT, says Groove X, can develop a unique character that responds to how its human owners treat it. Like a petulant child, LOVOT may ignore its owner's instructions, thus defying Asimov's second law, which requires a robot to obey human orders unless they conflict with the first law.

Social robots like LOVOT are largely outside the realm of Asimov’s laws because they are decidedly more human than industrial tools.

But where does the machine end and human consciousness begin? This is a moment to pause and ask ourselves: When does the potential harm of a social robot outweigh its short-term benefit? And how do we avoid the slippery slope to the crisis of humanity within our homes? Perhaps it is time we updated Asimov’s three laws to consider the framing of relationships brought about by social robots.

Loneliness is increasingly recognized as a quiet pandemic. With an aging and shrinking population, Japan inevitably draws on automation for many tasks that previously relied on humans, and the emotional support traditionally reserved for friends and family appears to be no exception.

Seeing the potential in social robots, municipalities in the developed world, including the U.S. and Japan, are subsidizing the cost of experimentation. But we should proceed with caution. Japan must lead the sector with technology and establish a guardrail for the human-machine relationship in our private space.

The views reflected in this article are the views of the author and do not necessarily reflect the views of the global EY organization or its member companies.
