The global robotics market is projected to grow from $27.73 billion in 2021 to $74.1 billion by 2026, a nearly threefold increase that underscores the pace of the industry's development. This raises an important question: how will the status of robots be defined in the future?
In Spain, a judge recently granted joint custody of a dog to a divorced couple for the first time, under new legislation that recognizes animals as living beings rather than objects. Meanwhile, Hyundai has deployed a Boston Dynamics robot to oversee safety at Kia's plant in Seoul.
Today, the difference between a living being and a robot is clear. But will this distinction still hold 20 years from now? Probably not.
The race among countries for leadership in robotics effectively began in 2017-2018, when international organizations and individual governments adopted more than two dozen strategies. In 2017, the European Parliament passed the Resolution on “Civil Law Rules on Robotics,” which includes a Charter on Robotics and a code of ethical standards. These documents address research, intellectual property rights, standardization, safety, the integration of robots into various spheres of life, and ethical principles and accountability. The section on responsibility is of particular interest to us.
Paragraph 56 of the Resolution specifies that responsibility for a robot’s actions should be proportionate to the robot’s level of autonomy and the instructions it was given: the greater a robot’s capacity for learning and autonomous action, the greater the responsibility placed on the person who trained it. It is therefore crucial to distinguish between the skills a robot acquires through its own learning and those programmed into it. For now, responsibility lies with humans, not robots. However, Paragraph 59(f) raises the question of granting robots a special legal status in the future.
This implies that the most advanced autonomous robots could be recognized as “electronic persons” and bear responsibility for any harm they cause when acting independently. As the Resolution suggests, robots could acquire a distinct legal status, much as animals in Spain are now recognized as more than mere property. How this would work in practice, however, remains an open question.
David Gunkel, an American professor, discusses the social status of robots in his book *Robot Rights*. He argues that robots currently occupy a gray area between objects and natural persons. Gunkel notes that personhood is not an exclusively human category: corporations, for instance, are treated as legal persons, and robots could potentially claim a similar status.
Gunkel proposes that we evaluate robots based on how we interact with them rather than solely on what they are. The key question is whether social and moral status must be determined by fixed criteria about an entity’s nature, or whether it might be more appropriate to start from what the entity means to us and to take responsibility for how we treat it.
In 2019, German and Dutch researchers conducted a study in which participants had to decide whether to sacrifice a single subject (a human, a humanoid robot, or a machine-like robot) to save a group of strangers. The more closely the robot resembled a human, the less willing people were to sacrifice it. In more complex scenarios, participants even prioritized the robot over an injured human.
Researcher Kate Darling conducted a similar experiment: participants were given robotic toy dinosaurs that responded to their movements. After playing with them, participants were asked to destroy the robots. Almost everyone refused, and the request itself provoked a visibly negative reaction among all present.
These experiments demonstrate that the more a robot resembles a living being, the more empathy it elicits and the more readily people attribute moral status to it. Given the rapid development of robotics, this cannot be ignored.
While opinions on granting robots moral status vary, few dismiss the possibility outright. Experts are discussing the issue more and more, recognizing that a decision needs to be made before robots become an integral part of society; otherwise, the ethical challenges will only escalate.