{"id":164240,"date":"2023-05-19T18:22:52","date_gmt":"2023-05-19T23:22:52","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/05\/is-buzzy-startup-humanes-big-idea-a-wearable-camera"},"modified":"2023-05-19T18:22:52","modified_gmt":"2023-05-19T23:22:52","slug":"is-buzzy-startup-humanes-big-idea-a-wearable-camera","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/05\/is-buzzy-startup-humanes-big-idea-a-wearable-camera","title":{"rendered":"Is buzzy startup Humane\u2019s big idea a wearable camera?"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/is-buzzy-startup-humanes-big-idea-a-wearable-camera2.jpg\"><\/a><\/p>\n<p>The demo is clever, questionably real, and prompts a lot of questions about how this device will actually work.<\/p>\n<p>Buzz has been building around the secretive tech startup Humane for <a href=\"https:\/\/www.theverge.com\/2023\/3\/9\/23631911\/humane-apple-startup-wearable-camera-artificial-intelligence-series-c-funding-round\">over a year<\/a>, and now the company is finally offering a look at what it\u2019s been building. At TED last month, Humane co-founder Imran Chaudhri gave a demonstration of the AI-powered wearable the company is building as a replacement for smartphones. Bits of the video <a href=\"https:\/\/www.theverge.com\/2023\/4\/21\/23692368\/humane-ted-talk-imran-chaudhri-wearable-screenless-device-voice-commands-projected-screen\">leaked online<\/a> after the event, but <a href=\"https:\/\/www.ted.com\/talks\/imran_chaudhri_the_disappearing_computer_and_a_world_where_you_can_take_ai_everywhere\">the full video<\/a> is now available to watch.<\/p>\n<p>The device appears to be a small black puck that slips into your breast pocket, with a camera, projector, and speaker sticking out the top. 
Throughout the 13-minute presentation, Chaudhri walks through a handful of use cases for Humane\u2019s gadget:<\/p>\n<ul>\n<li>The device rings when Chaudhri receives a phone call. He holds his hand up, and the device projects the caller\u2019s name along with icons to answer or ignore the call. He then has a brief conversation. (Around 1:48 in the video)<\/li>\n<li>He presses and holds one finger on the device, then asks a question about where he can buy a gift. The device responds with the name of a shopping district. (Around 6:20)<\/li>\n<li>He taps two fingers on the device, says a sentence, and the device translates the sentence into another language, stating it back using an AI-generated clone of his voice. (Around 6:55)<\/li>\n<li>He presses and holds one finger on the device, says, \u201cCatch me up,\u201d and it reads out a summary of recent emails, calendar events, and messages. (At 9:45)<\/li>\n<li>He holds a chocolate bar in front of the device, then presses and holds one finger on the device while asking, \u201cCan I eat this?\u201d The device recommends he does not because of a food allergy he has. He presses down one finger again and tells the device he\u2019s ignoring its advice. (Around 10:55)<\/li>\n<\/ul>\n<p>Chaudhri, who previously worked on design at Apple for more than two decades, pitched the device as a salve for a world covered in screens. \u201cSome believe AR \/ VR glasses like these are the answer,\u201d he said, an image of VR headsets behind him. He argued those devices \u2014 like smartphones \u2014 put \u201ca further barrier between you and the world.\u201d<\/p>\n<p>Humane\u2019s device, whatever it\u2019s called, is designed to be more natural by eschewing the screen. The gadget operates on its own. \u201cYou don\u2019t need a smartphone or any other device to pair with it,\u201d he said.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The demo is clever, questionably real, and prompts a lot of questions about how this device will actually work. 
Buzz has been building around the secretive tech startup Humane for over a year, and now the company is finally offering a look at what it\u2019s been building. At TED last month, Humane co-founder Imran Chaudhri [\u2026]<\/p>\n","protected":false},"author":578,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1498,1506,1495,1512,6,1879,1977],"tags":[],"class_list":["post-164240","post","type-post","status-publish","format-standard","hentry","category-augmented-reality","category-food","category-health","category-mobile-phones","category-robotics-ai","category-virtual-reality","category-wearables"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/164240","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/578"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=164240"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/164240\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=164240"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=164240"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=164240"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}