A robotics firm in Bristol has questioned calls to create a new legal status of ‘electronic persons’ – fearing it could diminish the responsibility of the programmers.
GWS Robotics, which supplies robots for a range of applications, has responded to the European Parliament’s plans to establish laws to govern robots and artificial intelligence.
A draft report outlining a possible regulatory framework was approved by the European Parliament’s legal affairs committee last month.
It covers a broad range of artificial intelligence (AI) related issues, including the creation of a new legal status of “electronic persons” for the most sophisticated autonomous robots.
But GWS Robotics, which develops software specifically tailored for businesses focused on customer service, believes it could have serious ethical implications.
Creative director David Graves said: “Robots are essentially sophisticated digital devices, not living creatures, and cannot possess the rights equivalent to animals or humans.
“The granting of ‘electronic person’ status to robots carries serious ethical risks - diminishing the responsibilities of the humans who program and operate them.
“Machines will only be as dangerous as they are allowed to be by their designers in the first place. The responsibility for ensuring they do not endanger people should be in the hands of the designers, programmers and operators.”
The report, authored by Member of the European Parliament (MEP) Mady Delvaux, suggests that artificial intelligence should broadly be designed in accordance with the Three Laws of Robotics.
This is a set of rules devised by science fiction writer Isaac Asimov, stipulating robots must obey orders and not injure humans or allow them to come to harm.
It also recommends that robots should be fitted with “kill” switches so that they can be shut down in emergencies.
David, a Cambridge University graduate who has worked as a computer programmer for nearly two decades, said: “Of course robots shouldn’t harm humans, but to not allow humans to come to harm is difficult to codify and could justify actions that might be problematic from a legal standpoint.
“If a robot prevented you from doing something because it thought there was a risk of harm to you, whether or not it was correct that could still involve a significant reduction in your personal freedom.”
The team at the firm, which has invested in a four-foot-tall humanoid robot called Pepper, created by SoftBank Robotics, believes ‘kill switch’ is a sensationalist way of describing an ‘off switch’.
David, 44, from Upper Knowle, said: “Since robots are machines, just like vacuum cleaners, industrial machinery and cars, there must of course be a way to switch them off quickly - whether in an emergency or in the course of normal use.
“If we were talking about military autonomous robots designed to physically coerce, disable or kill, then having multiple layers of protection, including a way to remotely disable them, would become very relevant.
“But Pepper and other social robots are no more of a threat than any other machine with limited mobility, autonomy and intelligence and limited physical ability to cause harm.”
The report also calls for the creation of a European agency for robotics and artificial intelligence that can provide technical, ethical and regulatory expertise.
David said: “International cooperation on scientific research is well-established and very important to progress.
“Law and technology can be uncomfortable bedfellows, with the law used to prevent or impede technological progress by vested interests.
“But an agency like this could help to set out parameters and best practice for AI developers and programmers worldwide, so I think it could be very valuable.”
GWS Robotics is a company spun off from design and digital marketing company GWS Media in Queen Charlotte Street.
The full house of the European Parliament will vote on the draft proposals this month.
"The granting of electronic person status to robots carries serious ethical risks - diminishing the responsibilities of the humans who program and operate them."