{"id":10491,"date":"2014-03-31T19:31:40","date_gmt":"2014-04-01T02:31:40","guid":{"rendered":"http:\/\/lifeboat.com\/blog\/?p=10491"},"modified":"2017-06-04T12:08:32","modified_gmt":"2017-06-04T19:08:32","slug":"why-asimovs-three-laws-of-robotics-cant-protect-us","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2014\/03\/why-asimovs-three-laws-of-robotics-cant-protect-us","title":{"rendered":"Why Asimov\u2019s Three Laws Of Robotics Can\u2019t Protect Us"},"content":{"rendered":"<p class=\"first-text\" data-textannotation-id=\"08a4d342ef05496ec2d14a03ec686256\">George Dvorsky \u2014 io9<br \/> It\u2019s been over 70 years since <a href=\"http:\/\/io9.com\/50-years-ago-isaac-asimov-predicted-what-2014-would-lo-1493111283\">Isaac Asimov<\/a> devised his famous Three Laws of Robotics \u2014 a set of rules designed to ensure friendly robot behavior. Though intended as a literary device, these laws are heralded by some as a ready-made prescription for avoiding the robopocalypse. We spoke to the experts to find out if Asimov\u2019s safeguards have stood the test of time \u2014 and they haven\u2019t.<\/p>\n<p>First, a quick overview of the Three Laws. As stated by Asimov in his 1942 short story \u201cRunaround\u201d:<br \/> 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.<\/p>\n<p>2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.<\/p>\n<p>3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.<\/p>\n<p><a href=\"http:\/\/io9.com\/why-asimovs-three-laws-of-robotics-cant-protect-us-1553665410\" target=\"_blank\">Read more<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>George Dvorsky \u2014 io9 It\u2019s been over 70 years since Isaac Asimov devised his famous Three Laws of Robotics \u2014 a set of rules designed to ensure friendly robot behavior. 
Though intended as a literary device, these laws are heralded by some as a ready-made prescription for avoiding the robopocalypse. We spoke to the experts to [\u2026]<\/p>\n","protected":false},"author":76,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-10491","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/10491","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/76"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=10491"}],"version-history":[{"count":2,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/10491\/revisions"}],"predecessor-version":[{"id":65006,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/10491\/revisions\/65006"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=10491"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=10491"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=10491"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}