{"id":143795,"date":"2022-08-09T03:22:43","date_gmt":"2022-08-09T08:22:43","guid":{"rendered":"https:\/\/lifeboat.com\/blog\/2022\/08\/how-image-features-influence-reaction-times"},"modified":"2022-08-09T03:22:43","modified_gmt":"2022-08-09T08:22:43","slug":"how-image-features-influence-reaction-times","status":"publish","type":"post","link":"https:\/\/lifeboat.com\/blog\/2022\/08\/how-image-features-influence-reaction-times","title":{"rendered":"How image features influence reaction times"},"content":{"rendered":"<p><a class=\"aligncenter blog-photo\" href=\"https:\/\/lifeboat.com\/blog.images\/how-image-features-influence-reaction-times.jpg\"><\/a><\/p>\n<p>It\u2019s an everyday scenario: you\u2019re driving down the highway when out of the corner of your eye you spot a car merging into your lane without signaling. How fast can your eyes react to that visual stimulus? Would it make a difference if the offending car were blue instead of green? And if the color green shortened that split-second delay between the initial appearance of the stimulus and the moment the eye begins moving toward it (a delay scientists call saccadic latency), could drivers benefit from an augmented reality overlay that made every merging vehicle green?<\/p>\n<p>Qi Sun, a joint professor in Tandon\u2019s Department of Computer Science and Engineering and the Center for Urban Science and Progress (CUSP), is collaborating with neuroscientists to find out.<\/p>\n<p>He and his Ph.D. student Budmonde Duinkharjav\u2014along with colleagues from Princeton, the University of North Carolina, and NVIDIA Research\u2014recently authored the paper \u201cImage Features Influence Reaction Time: A Learned Probabilistic Perceptual Model for Saccade Latency,\u201d presenting a model that predicts temporal gaze behavior, particularly saccadic latency, as a function of the statistics of a displayed image. 
Inspired by neuroscience, the model could ultimately have far-reaching implications for <a href=\"https:\/\/techxplore.com\/tags\/highway+safety\/\" rel=\"tag\" class=\"\">highway safety<\/a>, telemedicine, e-sports, and any other arena in which AR and VR are leveraged.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>It\u2019s an everyday scenario: you\u2019re driving down the highway when out of the corner of your eye you spot a car merging into your lane without signaling. How fast can your eyes react to that visual stimulus? Would it make a difference if the offending car were blue instead of green? And if the color [\u2026]<\/p>\n","protected":false},"author":427,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1498,11,47,1879],"tags":[],"class_list":["post-143795","post","type-post","status-publish","format-standard","hentry","category-augmented-reality","category-biotech-medical","category-neuroscience","category-virtual-reality"],"_links":{"self":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/143795","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/users\/427"}],"replies":[{"embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/comments?post=143795"}],"version-history":[{"count":0,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/posts\/143795\/revisions"}],"wp:attachment":[{"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/media?parent=143795"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/categories?post=143795"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lifeboat.com\/blog\/wp-json\/wp\/v2\/tags?post=143795"}
],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}