{"id":77869,"date":"2018-04-16T19:02:42","date_gmt":"2018-04-17T02:02:42","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2018\/04\/google-made-an-ar-microscope-that-can-help-detect-cancer"},"modified":"2018-04-16T19:02:42","modified_gmt":"2018-04-17T02:02:42","slug":"google-made-an-ar-microscope-that-can-help-detect-cancer","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2018\/04\/google-made-an-ar-microscope-that-can-help-detect-cancer","title":{"rendered":"Google made an AR microscope that can help detect cancer"},"content":{"rendered":"<p><iframe style=\"display: block; margin: 0 auto; width: 100%; aspect-ratio: 4\/3; object-fit: contain;\" src=\"https:\/\/www.youtube.com\/embed\/9Mz84cwVmS0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p>In a talk given today at the American Association for Cancer Research\u2019s annual meeting, <a href=\"https:\/\/www.engadget.com\/2018\/02\/19\/google-ai-can-scan-your-eyes-to-predict-heart-disease\/\">Google<\/a> researchers described a prototype of an augmented reality microscope that could be used to help physicians diagnose patients. When pathologists analyze biological tissue for signs of cancer \u2014 and, if so, how much and what kind \u2014 the process can be quite time-consuming. It\u2019s a practice that Google thinks could benefit from deep learning tools. But in many places, adopting <a href=\"https:\/\/www.engadget.com\/2018\/01\/23\/ibm-researchers-ai-predict-risk-psychosis\/\">AI<\/a> technology isn\u2019t feasible. The company, however, believes this microscope could allow groups with limited funds, such as small labs and clinics or those in developing countries, to benefit from these tools in a simple, easy-to-use manner. 
Google says the scope could \u201cpossibly help accelerate and democratize the adoption of deep learning tools for pathologists around the world.\u201d<\/p>\n<p>The microscope is an ordinary light microscope, the kind used by pathologists worldwide. Google tweaked it slightly to introduce AI technology and augmented reality. First, neural networks are trained to detect cancer cells in images of human tissue. Then, after a slide with human tissue is placed under the modified microscope, the same image a person sees through the scope\u2019s eyepieces is fed into a computer. <a href=\"https:\/\/www.engadget.com\/2017\/09\/17\/ai-alzheimers-early-detection\/\">AI<\/a> algorithms then detect cancer cells in the tissue, and the system outlines them in the image seen through the eyepieces (as shown in the video above). It\u2019s all done in real time and works quickly enough that it\u2019s still effective when a pathologist moves a slide to look at a new section of tissue.<\/p>\n<p><!-- Link: <a href=\"https:\/\/www.engadget.com\/2018\/04\/16\/google-ar-microscope-help-detect-cancer\/\">https:\/\/www.engadget.com\/2018\/04\/16\/google-ar-microscope-help-detect-cancer\/<\/a> --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In a talk given today at the American Association for Cancer Research\u2019s annual meeting, Google researchers described a prototype of an augmented reality microscope that could be used to help physicians diagnose patients. 
When pathologists are analyzing biological tissue to see if there are signs of cancer \u2014 and if so, how much and what [\u2026]<\/p>\n","protected":false},"author":396,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1498,11,41,6],"tags":[],"class_list":["post-77869","post","type-post","status-publish","format-standard","hentry","category-augmented-reality","category-biotech-medical","category-information-science","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/77869","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/396"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=77869"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/77869\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=77869"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=77869"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=77869"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}