{"id":157317,"date":"2023-02-08T15:24:09","date_gmt":"2023-02-08T21:24:09","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/02\/googles-a-i-powered-multisearch-which-combines-text-and-images-in-a-single-query-goes-global"},"modified":"2023-02-08T15:24:09","modified_gmt":"2023-02-08T21:24:09","slug":"googles-a-i-powered-multisearch-which-combines-text-and-images-in-a-single-query-goes-global","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/02\/googles-a-i-powered-multisearch-which-combines-text-and-images-in-a-single-query-goes-global","title":{"rendered":"Google\u2019s A.I.-powered \u2018multisearch,\u2019 which combines text and images in a single query, goes global"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/googles-a-i-powered-multisearch-which-combines-text-and-images-in-a-single-query-goes-global2.jpg\"><\/a><\/p>\n<p>Amid other A.I.-focused announcements, Google today shared that its newer \u201c<a href=\"https:\/\/blog.google\/products\/search\/multisearch\/\" target=\"_blank\" rel=\"noopener\">multisearch<\/a>\u201d feature would now be available to global users on mobile devices, anywhere that Google Lens is already available. The search feature, which allows users to search using both text and images at the same time, <a href=\"https:\/\/techcrunch.com\/2022\/04\/07\/googles-multisearch-search-using-text-images\/\">was first introduced last April<\/a> as a way to modernize Google search to take better advantage of the smartphone\u2019s capabilities. A variation on this, \u201c<a href=\"https:\/\/blog.google\/products\/search\/searching-for-holiday-meals\/\" target=\"_blank\" rel=\"noopener\">multisearch near me<\/a>,\u201d which targets searches to local businesses, will also become globally available over the next few months, as will multisearch for the web and a new Lens feature for Android users.<\/p>\n<p>As Google previously <a href=\"https:\/\/techcrunch.com\/2021\/09\/29\/google-introduces-a-new-way-to-search-that-combines-images-and-text-into-one-query\/\">explained<\/a>, multisearch is powered by A.I. technology called Multitask Unified Model, or <a href=\"https:\/\/www.blog.google\/products\/search\/introducing-MUM\/\" target=\"_blank\" rel=\"noopener\">MUM<\/a>, which can understand information across a variety of formats, including text, photos, and videos, and then draw insights and connections between topics, concepts, and ideas. Google put MUM to work within its Google Lens visual search features, where it would allow users to add text to a visual search query.<\/p>\n<p>\u201cWe redefined what we mean to search by introducing Lens. We\u2019ve since brought Lens directly to the search bar and we continue to bring new capabilities like shopping and step-by-step homework help,\u201d Prabhakar Raghavan, Google\u2019s SVP in charge Search, Assistant, Geo, Ads, Commerce and Payments products, said at <a href=\"https:\/\/techcrunch.com\/tag\/google-live-from-paris\/\">a press event in Paris<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Amid other A.I.-focused announcements, Google today shared that its newer \u201cmultisearch\u201d feature would now be available to global users on mobile devices, anywhere that Google Lens is already available. 