Emotion recognition based on paralinguistic information

Researchers at the University of Texas at Arlington have recently explored the use of machine learning for emotion recognition based solely on paralinguistic information. Paralinguistics are the aspects of spoken communication that do not involve words, such as pitch, volume, and intonation.

Recent advances in machine learning have led to the development of tools that can recognize emotions by analyzing images, voice recordings, electroencephalograms or electrocardiograms. These tools could have several interesting applications, for instance, enabling more efficient human-computer interactions in which a computer recognizes and responds to a human user’s emotions.

“In general, one may argue that speech carries two distinct types of information: explicit or linguistic information, which concerns articulated patterns by the speaker; and implicit or paralinguistic information, which concerns the variation in pronunciation of the linguistic patterns,” the researchers wrote in their paper, published in the Advances in Experimental Medicine and Biology book series. “Using either or both types of information, one may attempt to classify an audio segment that consists of speech, based on the emotion(s) it carries. However, emotion recognition from speech appears to be a significantly difficult task even for a human, no matter if he/she is an expert in this field (e.g. a psychologist).”
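As a rough illustration only (not the researchers’ actual pipeline, which the excerpt does not describe), paralinguistic features such as energy and pitch can be computed directly from a raw waveform before any classifier is applied. The sketch below uses a synthetic tone in place of real speech and a crude autocorrelation pitch estimate; all function names are hypothetical:

```python
import numpy as np

def paralinguistic_features(signal, sample_rate):
    """Compute three simple paralinguistic features of an audio segment:
    RMS energy (loudness), zero-crossing rate (noisiness), and a crude
    autocorrelation-based pitch estimate in Hz."""
    rms = np.sqrt(np.mean(signal ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(signal)))) / 2
    # Autocorrelation pitch: pick the strongest lag within a plausible
    # human-voice range of roughly 50-400 Hz.
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo, hi = sample_rate // 400, sample_rate // 50
    lag = lo + np.argmax(corr[lo:hi])
    pitch_hz = sample_rate / lag
    return np.array([rms, zcr, pitch_hz])

# Synthetic "voiced segment": one second of a 200 Hz tone at 8 kHz.
sr = 8000
t = np.linspace(0, 1, sr, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 200 * t)
feats = paralinguistic_features(tone, sr)
print(feats)  # pitch estimate should land near 200 Hz
```

In a real system these features would be computed per short frame and fed, along with many others (e.g. MFCCs), into a trained classifier that maps feature vectors to emotion labels.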

Factory robot malfunctions and impales worker with 10 foot-long steel spikes

A CHINESE factory worker has survived being skewered with ten metal spikes when a machine malfunctioned.

The 49-year-old, named as Mr Zhou, was working on the night shift at a porcelain factory in Hunan province when he was struck by a falling mechanical arm.

The accident resulted in him being impaled with foot-long, half-inch-thick metal rods, the People’s Daily reported.

‘Her’, OS Sentience, and the Desire to Love

After watching Spike Jonze’s epic sci-fi film Her, I felt as if my mind was, metaphorically of course, absolutely blown away. The film far exceeded my expectations of how it would make me feel, let alone make me think! I found myself wanting to tell everyone I knew to stop what they were doing and take the time to really watch it, listen to it, and absorb it. I spoke of other great films that captured both my heart and mind, like Robot and Frank, but no film has ever really achieved what Spike Jonze’s Her achieves.


A review of Spike Jonze’s 2013 sci-fi film.

China Launches 1st Mission to Land on the Far Side of the Moon

The first-ever surface mission to the far side of the moon is underway.

China’s robotic Chang’e 4 spacecraft streaked away from Earth today (Dec. 7), launching atop a Long March 3B rocket from the Xichang Satellite Launch Center at about 1:23 p.m. EST (1823 GMT; 2:23 a.m. on Dec. 8 local China time).

If all goes according to plan, Chang’e 4 will make history’s first landing on the lunar far side sometime in early January. The mission, which consists of a stationary lander and a rover, will perform a variety of science work and plant a flag for humanity in a region that remains largely unexplored to date.

How the brain’s face code might unlock the mysteries of perception

The view of the world through any primate’s eyes is funnelled from the retina into the visual cortex, the various layers of which do the initial processing of incoming information. At first, it’s little more than pixels of dark or bright colours, but within 100 milliseconds the information zaps through a network of brain areas for further processing to generate a consciously recognized, 3D landscape with numerous objects moving around in it.


Doris Tsao mastered facial recognition in the brain. Now she’s looking to determine the neural code for everything we see.

Amazon And Microsoft Claim AI Can Read Human Emotions. Experts Say the Science Is Shaky

Facial recognition technology is being tested by businesses and governments for everything from policing to employee timesheets. Even more granular results are on their way, promise the companies behind the technology: Automatic emotion recognition could soon help robots understand humans better, or detect road rage in car drivers.

But experts are warning that the facial-recognition algorithms that attempt to interpret facial expressions could be based on uncertain science. The claims appear in the annual report of the AI Now Institute, a nonprofit that studies the impact of AI on society. The report also includes recommendations for the regulation of AI and greater transparency in the industry.

“The problem is now AI is being applied in a lot of social contexts. Anthropology, psychology, and philosophy are all incredibly relevant, but this is not the training of people who come from a technical [computer science] background,” says Kate Crawford, co-founder of AI Now, distinguished research professor at NYU and principal researcher at Microsoft Research. “Essentially the narrowing of AI has produced a kind of guileless acceptance of particular strands of psychological literature that have been shown to be suspect.”