{"id":113162,"date":"2020-09-18T18:28:15","date_gmt":"2020-09-19T01:28:15","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2020\/09\/new-data-processing-module-makes-deep-neural-networks-smarter"},"modified":"2020-09-18T18:28:15","modified_gmt":"2020-09-19T01:28:15","slug":"new-data-processing-module-makes-deep-neural-networks-smarter","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2020\/09\/new-data-processing-module-makes-deep-neural-networks-smarter","title":{"rendered":"New data processing module makes deep neural networks smarter"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/new-data-processing-module-makes-deep-neural-networks-smarter2.jpg\"><\/a><\/p>\n<p>Artificial intelligence researchers at North Carolina State University have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module that they call attentive normalization (AN). The hybrid module improves the accuracy of the system significantly, while using negligible extra computational power.<\/p>\n<p>\u201cFeature normalization is a <a href=\"https:\/\/techxplore.com\/tags\/crucial+element\/\" rel=\"tag\" class=\"\">crucial element<\/a> of training deep neural networks, and feature attention is equally important for helping networks highlight which features learned from raw data are most important for accomplishing a given task,\u201d says Tianfu Wu, corresponding author of a paper on the work and an assistant professor of electrical and computer engineering at NC State. \u201cBut they have mostly been treated separately. We found that combining them made them more efficient and effective.\u201d<\/p>\n<p>To test their AN module, the researchers plugged it into four of the most widely used neural <a href=\"https:\/\/techxplore.com\/tags\/network\/\" rel=\"tag\" class=\"\">network<\/a> architectures: ResNets, DenseNets, MobileNetsV2 and AOGNets. They then tested the networks against two industry standard benchmarks: the ImageNet-1000 classification <a href=\"https:\/\/techxplore.com\/tags\/benchmark\/\" rel=\"tag\" class=\"\">benchmark<\/a> and the MS-COCO 2017 object detection and instance segmentation benchmark.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Artificial intelligence researchers at North Carolina State University have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module that they call attentive normalization (AN). The hybrid module improves the accuracy of the system significantly, while using negligible extra computational power. \u201cFeature normalization is a crucial [\u2026]<\/p>\n","protected":false},"author":427,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-113162","post","type-post","status-publish","format-standard","hentry","category-robotics-ai"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/113162","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/427"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=113162"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/113162\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=113162"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=113162"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=113162"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}