{"id":167425,"date":"2023-07-13T10:25:42","date_gmt":"2023-07-13T15:25:42","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2023\/07\/the-wizardly-owl-brain-uses-bayesian-inference-to-find-prey"},"modified":"2023-07-13T10:25:42","modified_gmt":"2023-07-13T15:25:42","slug":"the-wizardly-owl-brain-uses-bayesian-inference-to-find-prey","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2023\/07\/the-wizardly-owl-brain-uses-bayesian-inference-to-find-prey","title":{"rendered":"The wizardly owl brain uses \u201cBayesian inference\u201d to find prey"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/the-wizardly-owl-brain-uses-bayesian-inference-to-find-prey2.jpg\"><\/a><\/p>\n<p>This was a surprise. Animals have brain maps for vision and touch, but these are built from visual images and touch receptors that map onto the brain through direct point\u2011to\u2011point projections. With ears, it\u2019s entirely different. The brain compares information received from each ear about the timing and intensity of a sound and then translates the differences into a unified perception of a single sound issuing from a specific region of space. The resulting auditory map allows owls to \u201csee\u201d the world in two dimensions with their ears.<\/p>\n<p>This proved to be a big leap toward understanding how the brain of any animal, including humans, learns to grasp its environment through sound. Think of it. Standing in a forest, you hear the crack of a falling branch or the rustle of a deer\u2019s step in the dry leaves. Your brain calculates the time and intensity of sound to determine where it\u2019s coming from. Owls do this task with incredible speed and accuracy. Each cochlea in the owl provides the brain with the precise timing of the sound reaching that ear within 20 microseconds. 
This determines how accurately the brain can calculate the <a href=\"https:\/\/en.wikipedia.org\/wiki\/Interaural_time_difference\">interaural time difference<\/a>, which in turn determines the accuracy of sound localization in azimuth. \u201cThe precision in microseconds provided by the owl cochlea is better than in any other animal that has been tested,\u201d says K\u00f6ppl. \u201cWe have big heads, so the interaural time differences are larger, making the task for cochlea and brain easier. In a nutshell, it is the combination of a small head and very precise localization that makes the owl unique.\u201d<\/p>\n<p>And here\u2019s a jaw-dropping finding. Jos\u00e9 Luis Pe\u00f1a, a neuroscientist at the Albert Einstein College of Medicine, and his collaborators have discovered that the sound localization system in a barn owl\u2019s brain performs sophisticated mathematical computations to execute this pinpointing of prey. The space\u2011specific neurons in the owl\u2019s specialized auditory brain do advanced math when they transmit their information, not just adding and multiplying incoming signals but averaging them and using a statistical method called \u201c<a href=\"https:\/\/bigthink.com\/smart-skills\/bayesian-search-find-stuff-lost\/\" target=\"_blank\" rel=\"noreferrer noopener\">Bayesian<\/a> inference,\u201d which updates estimates as more information becomes available.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This was a surprise. Animals have brain maps for vision and touch, but these are built from visual images and touch receptors that map onto the brain through direct point\u2011to\u2011point projections. With ears, it\u2019s entirely different. 
The brain compares information received from each ear about the timing and intensity of a sound and then translates [\u2026]<\/p>\n","protected":false},"author":661,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[11,2229,47],"tags":[],"class_list":["post-167425","post","type-post","status-publish","format-standard","hentry","category-biotech-medical","category-mathematics","category-neuroscience"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/167425","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/661"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=167425"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/167425\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=167425"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=167425"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=167425"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}