Nearly 80 years after author Isaac Asimov created his laws of robotics, modern robotics means new rules must be introduced, says American lawyer Frank Pasquale, an expert in artificial intelligence.
The world has changed since science fiction author Isaac Asimov formulated his three laws of robotics in 1942, the first of which states that a robot may not harm a human being.
Computers and artificial intelligence (AI) are now part of daily life, and the rules need updating, Frank Pasquale told AFP on the sidelines of a symposium on the subject held this week at the Pontifical Academy at the Vatican.
According to him, four new "legally inspired" rules could usefully be applied to robots and artificial intelligence. The first: robots must "complement professionals, not replace them".
"Instead of having a robotic physician, we hope to be treated by a doctor who understands how AI works, who gets good advice and who ultimately decides what to do or does not do," says Pasquale.
Another rule he advocates is to stop the arms race in robotic weapons, whether intended for military or police use.
"It's depressing, and that's also the money out of the window, you build a robot that can tell if my robot predicts your robot will attack you and so on, it's endless …" says Frank Pasquale.
The third rule, the most controversial according to Mr. Pasquale, is not to create humanoid robots or AI systems that pass for human, such as Google's Duplex assistant, an application that can phone a restaurant to book a table on your behalf.
"The public immediately reacted that Google was trying to procure its machines to people, which is equally counterfeit, and I think we should avoid counterfeiting of humanity," says M Pasquale, who teaches law at Maryland University, East.
Robots should take on a human appearance only if that is "necessary for the task they have to perform, as is the case with robots used for nursing care or sex robots".
– "Affiliated with Man" –
The fourth and last rule: every robot or artificial intelligence must be attributed to, or owned by, a person or group of people, "because we know how to punish people, but not machines," says the expert.
"If we create unmanned aids or cars running alone, robots expressing on Twitter or playing on the stock market, all this must be related to the human being," he says.
Frank Pasquale points out that companies such as Facebook are sentenced to fines, but none of their executives is ever forced to pay out of pocket, let alone jailed, when serious wrongdoing occurs.
"We already have a business problem and if we allow robots to help us without being directly connected with a person or company, it will only worsen," warns Pasquale, author of the book "New Robotics Law" (New Robotic Laws) coming from the Harvard University Press .
What must be avoided, he says, is two-speed technology, as illustrated by Boeing with its 737 MAX aircraft after the deaths of hundreds of people in two crashes attributed to a faulty sensor.
"Boeing has decided to offer airline companies to buy another sensor, at an additional cost of about $ 80,000 per aircraft, which will certainly cause cascading problems," M said. Pasquale.
He believes a law of robotics should apply in this case to prohibit what might pass for a "good deal" for an aircraft manufacturer: "There should be a single standard for all companies. Having to check whether Ryanair, American Airlines or Lufthansa has the additional sensors would be absurd."
© 2019 AFP