{"id":153451,"date":"2022-12-22T23:22:19","date_gmt":"2022-12-23T05:22:19","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2022\/12\/meta-optics-the-disruptive-technology-you-didnt-see-coming"},"modified":"2022-12-22T23:22:19","modified_gmt":"2022-12-23T05:22:19","slug":"meta-optics-the-disruptive-technology-you-didnt-see-coming","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2022\/12\/meta-optics-the-disruptive-technology-you-didnt-see-coming","title":{"rendered":"Meta-optics: The disruptive technology you didn\u2019t see coming"},"content":{"rendered":"<p>Robots and autonomous cars will have eyes that see much more than the human eye is capable of, a review of the growing field of meta-optics has found.<\/p>\n<p>Meta-optics is advancing science and technology far beyond the 3,000-year-old optical paradigm that we rely on for the visual <a href=\"https:\/\/phys.org\/tags\/human-machine+interface\/\" rel=\"tag\" class=\"\">human-machine interface<\/a>, such as the cameras in our mobile phones and the lenses in microscopes, drones, and telescopes. Optical components are the technology bottleneck that meta-optics aims to transform, bringing the stuff of science-fiction stories into everyday devices.<\/p>\n<p>The field, which blossomed after the early 2000s thanks to the conceptualization of a material with <a href=\"https:\/\/phys.org\/tags\/negative+refractive+index\/\" rel=\"tag\" class=\"\">negative refractive index<\/a> that could form a perfect lens, has grown rapidly in the last five years and now sees around 3,000 publications a year.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Robots and autonomous cars will have eyes that see much more than the human eye is capable of, a review of the growing field of meta-optics has found. 
Meta-optics is advancing science and technology far beyond the 3,000-year-old optical paradigm that we rely on for the visual human-machine interface, such as through cameras in our [\u2026]<\/p>\n","protected":false},"author":359,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1499,1488,1512,6],"tags":[],"class_list":["post-153451","post","type-post","status-publish","format-standard","hentry","category-cyborgs","category-drones","category-mobile-phones","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/153451","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/359"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=153451"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/153451\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=153451"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=153451"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=153451"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}