{"id":185612,"date":"2024-03-21T10:25:24","date_gmt":"2024-03-21T15:25:24","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2024\/03\/machine-learning-tools-can-predict-emotion-in-voices-in-just-over-a-second"},"modified":"2024-03-21T10:25:24","modified_gmt":"2024-03-21T15:25:24","slug":"machine-learning-tools-can-predict-emotion-in-voices-in-just-over-a-second","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2024\/03\/machine-learning-tools-can-predict-emotion-in-voices-in-just-over-a-second","title":{"rendered":"Machine learning tools can predict emotion in voices in just over a second"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/machine-learning-tools-can-predict-emotion-in-voices-in-just-over-a-second2.jpg\"><\/a><\/p>\n<p>Words are important for expressing ourselves. What we don\u2019t say, however, may be even more instrumental in conveying emotions. Humans can often tell how the people around them feel through non-verbal cues embedded in the voice.<\/p>\n<p>Now, researchers in Germany have sought to find out whether technical tools, too, can accurately predict emotional undertones in fragments of voice recordings. To do so, they compared the accuracy of three machine learning models at recognizing diverse emotions in audio excerpts. Their results were published in Frontiers in Psychology.<\/p>\n<p>\u201cHere we show that <a href=\"https:\/\/techxplore.com\/tags\/machine+learning\/\" rel=\"tag\" class=\"\">machine learning<\/a> can be used to recognize emotions from audio clips as short as 1.5 seconds,\u201d said the article\u2019s first author Hannes Diemerling, a researcher at the Center for Lifespan Psychology at the Max Planck Institute for Human Development. \u201cOur models achieved an accuracy similar to humans when categorizing meaningless sentences with emotional coloring spoken by actors.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Words are important for expressing ourselves. 
What we don\u2019t say, however, may be even more instrumental in conveying emotions. Humans can often tell how the people around them feel through non-verbal cues embedded in the voice. Now, researchers in Germany have sought to find out whether technical tools, too, can accurately predict emotional undertones in fragments [\u2026]<\/p>\n","protected":false},"author":661,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-185612","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/185612","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/661"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=185612"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/185612\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=185612"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=185612"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=185612"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}