{"id":73442,"date":"2017-10-24T11:22:28","date_gmt":"2017-10-24T18:22:28","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2017\/10\/googles-ai-is-binge-watching-human-behavior-on-youtube"},"modified":"2017-10-24T11:22:28","modified_gmt":"2017-10-24T18:22:28","slug":"googles-ai-is-binge-watching-human-behavior-on-youtube","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2017\/10\/googles-ai-is-binge-watching-human-behavior-on-youtube","title":{"rendered":"Google\u2019s AI is binge-watching human behavior on YouTube"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/googles-ai-is-binge-watching-human-behavior-on-youtube.jpg\"><\/a><\/p>\n<p>Robots are watching us. Literally.<\/p>\n<p>Google has curated a set of YouTube clips to help machines learn how humans exist in the world. The AVAs, or \u201catomic visual actions,\u201d are three-second clips of people doing everyday things like drinking water, taking a photo, playing an instrument, hugging, standing or cooking.<\/p>\n<p>Each clip labels the person the AI should focus on, along with a description of their pose and whether they\u2019re interacting with an object or another human.<\/p>\n<p><!-- Link: <a href=\"http:\/\/nypost.com\/2017\/10\/23\/googles-ai-is-binge-watching-human-behavior-on-youtube\/\">http:\/\/nypost.com\/2017\/10\/23\/googles-ai-is-binge-watching-hu...n-youtube\/<\/a> --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Robots are watching us. Literally. Google has curated a set of YouTube clips to help machines learn how humans exist in the world. The AVAs, or \u201catomic visual actions,\u201d are three-second clips of people doing everyday things like drinking water, taking a photo, playing an instrument, hugging, standing or cooking. Each clip labels the person [\u2026]<\/p>\n","protected":false},"author":354,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-73442","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/73442","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/354"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=73442"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/73442\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=73442"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=73442"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=73442"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}