Isaac Asimov's Three Laws of Robotics might not be enough to protect humanity, according to experts. The popular science fiction writer, who was also a biochemistry professor at Boston University, built his robot stories around three laws. However, robotics has changed significantly since then, and experts now say today's robots are far more varied than the machines Asimov wrote about in his stories.
Three Laws Of Robotics
Asimov introduced the Three Laws of Robotics in a 1942 short story called "Runaround." According to the first law, a robot may not harm a human being, or, through inaction, allow a human being to come to harm. The second law says that a robot must obey orders given by humans, except when an order conflicts with the first law. According to the third law, a robot must protect its own existence, as long as doing so does not conflict with the other two laws.
Today's experts say Isaac Asimov's servant-like robots would need a great deal of advanced programming to stop them from hurting their human masters. Technology has advanced enough since 1942 to warrant rethinking Asimov's rules. Robotics now spans far more diverse applications, from military drones to autonomous vacuum cleaners.
Isaac Asimov's laws of robotics are still regarded as a standard template for developing robots, the Daily Mail Online reports. In 2007, the government of South Korea drew on Asimov's laws to propose a Robot Ethics Charter. However, artificial intelligence in the 21st century is far more advanced, and far more complicated, than it was when Asimov published his first story.
Yahoo News UK noted that the science fiction writer was right to worry about unexpected robot behavior. However, his laws do not apply to modern-day robotics. Military drones, for example, are designed to harm or kill humans, which goes entirely against Asimov's laws preventing robots from hurting people.