
Thinking With X ~ David Orban


We are convinced, through the collective narrative leveraging our scientific understanding, that we think with our brains. While that is true, there is much more: some of us think with our stomach, a sculptor will think with her hands, a ballet dancer with her entire body. Our proprioception extends feedback loops outside of the body, extending what we are, how we think and decide, to the tips of an airplane we are piloting. As we come to think more and more with AI systems that support us in interpreting and acting on the world, the responsibility of user interaction designers is huge. They are shaping the systems that are going to shape what we are.

Scientists have created a “shrink ray” that can reduce objects to one-thousandth of their original size.

The mind-blowing gizmo could one day be used to create nano-robots smaller than anything we can physically produce today.

This brings us a step closer to making 1989 sci-fi comedy Honey, I Shrunk the Kids a reality.

For more information on Jay Tuck, please visit our website www.tedxhamburg.de

US defense expert Jay Tuck was news director of the daily news program ARD-Tagesthemen and combat correspondent for German Television in two Gulf Wars. He has produced over 500 segments for the network. His investigative reports on security policy, espionage activities and weapons technology appear in leading newspapers, television networks and magazines throughout Europe, including Cicero, Focus, PC-Welt, Playboy, Stern, Welt am Sonntag and ZEITmagazin. He is the author of a widely acclaimed book on electronic intelligence activities, “High-Tech Espionage” (St. Martin’s Press), published in fourteen countries. He is Executive Producer for a weekly technology magazine on international television in the Arab world. For his latest book, “Evolution without us – Will AI kill us?”, he researched at US drone bases, the Pentagon, intelligence agencies and AI research institutions. His lively talks are accompanied by exclusive video and photographs.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community.

You have to admit it. Some of the uses of artificial intelligence are simply fascinating. One of the more exciting aspects of artificial intelligence is seeing all the potential ways the technology can be applied to our daily lives, even if at times it seems a little creepy. We have seen artificial intelligence technology shape everything from the medical world to art. However, did you ever think that AI would go on to shape the world of stock images?


Now, if you are familiar with people using AI to create portraits of people who do not exist, then surely this idea came to your mind at some point. It is yet another industry influenced by the world of AI.

A new technology using artificial intelligence detects depressive language in social media posts more accurately than current systems and uses less data to do it.

The technology, which was presented during the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, is the first of its kind to show that, to more accurately detect depressive language, small, high-quality data sets can be applied to deep learning, a commonly used AI approach that is typically data intensive.

Previous psycholinguistic research has shown that the words we use in interaction with others on a daily basis are a good indicator of our mental and emotional state.
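As a rough illustration of what psycholinguistic analysis looks like in practice, the sketch below counts the rate of first-person-singular pronouns and absolutist terms in a text, two word classes that prior research has linked to depressive language. The word lists and function are hypothetical examples for illustration only, not the lexicon or model used in the study described above.

```python
# Illustrative sketch of psycholinguistic marker counting.
# The word lists below are small hypothetical examples, not the
# actual lexicon used by the researchers.

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
ABSOLUTIST = {"always", "never", "completely", "nothing", "entirely"}

def marker_rates(text: str) -> dict:
    """Return the per-token rate of first-person and absolutist words."""
    tokens = [t.strip(".,!?;:").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    n = len(tokens) or 1  # avoid division by zero on empty input
    return {
        "first_person": sum(t in FIRST_PERSON for t in tokens) / n,
        "absolutist": sum(t in ABSOLUTIST for t in tokens) / n,
    }

print(marker_rates("I always feel like nothing I do matters"))
```

In real systems these hand-crafted rates would be only one signal; the deep learning approach described above learns such indicators directly from labeled posts.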

This is the final part in a series of in-depth articles examining China’s efforts to build a stronger domestic semiconductor industry amid rising trade tensions.


Some in China see custom AI chips, which can offer superior performance to conventional integrated circuits even when manufactured using older processes, as helping the country loosen its dependence on the US in core technology.

Combining new classes of nanomembrane electrodes with flexible electronics and a deep learning algorithm could help disabled people wirelessly control an electric wheelchair, interact with a computer or operate a small robotic vehicle without donning a bulky hair-electrode cap or contending with wires.

By providing a fully portable, wireless brain-machine interface (BMI), the wearable system could offer an improvement over conventional electroencephalography (EEG) for measuring signals from visually evoked potentials. The system’s ability to measure EEG signals for BMI has been evaluated with six human subjects, but has not been studied with disabled individuals.

The project, conducted by researchers from the Georgia Institute of Technology, University of Kent and Wichita State University, was reported on September 11 in the journal Nature Machine Intelligence.