
Imagine that your team is meeting to decide whether to continue an expensive marketing campaign. After a few minutes, it becomes clear that nobody has the metrics on hand to make the decision. You chime in with a solution and ask Amazon’s virtual assistant Alexa to back you up with information: “Alexa, how many users did we convert to customers last month with Campaign A?” and Alexa responds with the answer. You just amplified your team’s intelligence with AI. But this is just the tip of the iceberg.
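What might the plumbing behind a query like that look like? Here is a minimal sketch of a skill-style intent handler in Python, assuming a made-up intent name, slot names, and metrics store; it is an illustration, not a real Alexa skill or real campaign data.

```python
# Hypothetical sketch of a voice-assistant intent handler that answers
# "How many users did we convert to customers last month with Campaign A?"
# The intent name, slot names, and metrics store are invented for illustration.

# Stand-in for a real analytics backend (e.g. a data warehouse query).
FAKE_METRICS = {("campaign a", "last month"): 1287}

def handle_intent(request: dict) -> dict:
    """Turn an Alexa-style intent request into a spoken-text response."""
    intent = request["request"]["intent"]
    campaign = intent["slots"]["campaign"]["value"].lower()
    period = intent["slots"]["period"]["value"].lower()

    conversions = FAKE_METRICS.get((campaign, period))
    if conversions is None:
        text = f"Sorry, I don't have conversion data for {campaign} {period}."
    else:
        text = f"{campaign.title()} converted {conversions} users to customers {period}."

    # Minimal Alexa-style response envelope.
    return {
        "version": "1.0",
        "response": {"outputSpeech": {"type": "PlainText", "text": text}},
    }

if __name__ == "__main__":
    sample_request = {
        "request": {
            "intent": {
                "name": "CampaignConversionsIntent",
                "slots": {
                    "campaign": {"value": "Campaign A"},
                    "period": {"value": "last month"},
                },
            }
        }
    }
    print(handle_intent(sample_request)["response"]["outputSpeech"]["text"])
```

In a production skill the hard part is not the handler but the data plumbing behind it: the lookup would hit a live analytics store rather than a hard-coded dictionary.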

Intelligence amplification is the use of technology to augment human intelligence. A paradigm shift is on the horizon: new devices will offer less intrusive, more intuitive ways to amplify our intelligence.

Hearables, or wireless in-ear computational earpieces, are an example of intelligence amplification devices that have been rapidly adopted in recent years. Apple’s AirPods, for instance, are smart earbuds that connect to Apple devices and integrate with Siri via voice commands. Apple has also filed a patent for earbuds equipped with biometric sensors that could record data such as a user’s temperature, heart rate, and movement. Similarly, Google’s Pixel Buds give users direct access to the Google Assistant and its powerful knowledge graph. Google Assistant seamlessly connects users to information stored in Google services such as email and calendar, provides highly personalized recommendations, helps automate personal communication, and offloads monotonous tasks like setting timers, managing lists, and controlling IoT devices.

And it could work in wearables and light aircraft.

Researchers at Stanford University are developing an efficient new solar panel material that is fifteen times thinner than paper, a press statement reveals.

Made using transition metal dichalcogenides (TMDs), the materials have the potential to absorb more sunlight than other solar materials while providing an incredibly lightweight alternative to silicon-based solar panels.

Searching for silicon alternatives

The researchers are part of a concerted effort within the scientific community to find alternative solar panel materials to silicon. Silicon is by far the most common material used for solar panels, but it’s heavy and rigid, meaning it isn’t particularly well suited to lightweight applications required for aircraft, spacecraft, electric vehicles, or even wearables.


In the novel-turned-movie Ready Player One by Ernest Cline, the protagonist escapes to an online realm aptly called OASIS. Instrumental to the OASIS experience is his haptic (relating to the sense of touch) bodysuit, which enables him to move through and interact with the virtual world with his body. He can even activate tactile sensations to feel every gut punch or a kiss from a badass online girl.

While no such technology is commercially available yet, Meta, the company formerly known as Facebook, is in the early stages of creating haptic gloves to bring the virtual world to our fingertips. The gloves have been in the works for the past seven years, the company recently said, and there are still a few more years to go.

These gloves would allow the wearer not only to interact with and control the virtual world, but also to experience it much as one experiences the physical world. The wearer would use the gloves in tandem with an AR or VR headset. A video posted on Meta’s blog shows two users having a remote thumb-wrestling match. In their VR headsets, they see a pair of disembodied hands reflecting the motions their own hands are making. In their gloves, they feel every squeeze and twitch of their partner’s hand, at least in theory.
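Conceptually, the loop behind that thumb-wrestling demo is simple even if the hardware is not: each glove streams its wearer’s hand pose to the other side, and incoming pose data is mapped to local actuator pressures. The toy Python sketch below illustrates the idea; the pose format, pressure mapping, and actuator interface are all assumptions, not Meta’s design.

```python
# Toy sketch of a haptic feedback loop for a remote thumb-wrestling match.
# The pose format, pressure mapping, and actuator interface are invented for
# illustration; this is not Meta's actual design.
from dataclasses import dataclass

@dataclass
class HandPose:
    """Simplified hand pose: per-finger flexion from 0.0 (open) to 1.0 (fully bent)."""
    flexion: tuple  # (thumb, index, middle, ring, pinky)

def pose_to_pressures(remote: HandPose, local: HandPose) -> list:
    """Approximate per-finger contact force as the overlap of the two flexions."""
    return [min(r, l) for r, l in zip(remote.flexion, local.flexion)]

def drive_actuators(pressures: list) -> None:
    # Placeholder for hardware output: a real glove would inflate pads or
    # drive tactors here; we just print the commanded pressures.
    print("actuator pressures:", [round(p, 2) for p in pressures])

if __name__ == "__main__":
    local_pose = HandPose(flexion=(0.6, 0.2, 0.2, 0.1, 0.1))   # my thumb pressing
    remote_pose = HandPose(flexion=(0.9, 0.3, 0.2, 0.1, 0.1))  # partner pressing back harder
    drive_actuators(pose_to_pressures(remote_pose, local_pose))
```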

The team has set an internal deadline of 2025.

In a move that could pit it against the electric vehicle market leader, Tesla, Apple has begun working aggressively on its fully autonomous electric car, Bloomberg reported. Developing a car has been on Apple’s agenda since 2014, but recent moves within the company signal a push towards making an Apple car a reality.

Given Apple’s history of taking everyday products and transforming them into must-have versions through excellent design, it is hardly a surprise. With Steve Jobs at the helm, Apple made the iPod even when music players were ubiquitous. Then the company revealed the iPhone when Nokia was still selling resistive touch screens as its premium products. And recently, the Apple Watch has become the “it” wearable even though there are other smartwatch options on the market. At a time when electric vehicles are surging in popularity, it seems only natural that the electric car is Apple’s next target.

OrCam’s reading device, ElectReon’s ‘smart road’ tech, a sensor for farming, and security drones all make the list.


1. OrCam Read, a smart reading support device developed by OrCam Technologies, the maker of artificial intelligence-based wearable devices that help the blind and visually impaired read text via audio feedback. The company launched OrCam Read in 2020 as a handheld digital reader meant to help people with language-processing challenges, including dyslexia. The device (priced at $1,990) captures and reads out full pages of text and digital screens, and follows voice commands.
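Under the hood, a reading device of this kind pairs text capture (OCR) with speech synthesis. As a rough illustration only, and not OrCam’s implementation, the sketch below chains two common open-source Python libraries: pytesseract for OCR and pyttsx3 for offline text-to-speech.

```python
# Rough illustration of a capture-and-read-aloud pipeline: OCR followed by
# text-to-speech. A generic sketch, not OrCam's implementation.
# Assumes Tesseract is installed plus the pytesseract, Pillow, and pyttsx3 packages.
from PIL import Image
import pytesseract
import pyttsx3

def read_page_aloud(image_path: str) -> str:
    """Extract text from a photographed page and speak it."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.setProperty("rate", 160)  # words per minute, a comfortable reading pace
    engine.say(text)
    engine.runAndWait()
    return text

if __name__ == "__main__":
    # "page.jpg" is a placeholder path to a photo of a printed page.
    print(read_page_aloud("page.jpg"))
```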

Wireless sensing devices, tools that allow users to sense movements and remotely monitor activities or changes in specific environments, have many applications. For instance, they could be used for surveillance purposes as well as to track the sleep or physical activities of medical patients and athletes. Some videogame developers have also used wireless sensing systems to create more engaging sports or dance-related games.

Researchers at Florida State University, Trinity University and Rutgers University have recently developed Winect, a new wireless sensing system that can track the poses of humans in 3D as they perform a wide range of free-form physical activities. The system was introduced in a paper pre-published on arXiv and is set to be presented at the ACM Conference on Interactive, Mobile, Wearable and Ubiquitous Technologies (UbiComp) 2021, one of the most renowned computer science events worldwide.

“Our research group has been conducting cutting-edge research in wireless sensing,” Jie Yang, one of the researchers who carried out the study, told TechXplore. “In the past, we have proposed several systems that use Wi-Fi signals to sense various human activities and objects, ranging from large-scale human activities to small-scale finger movements, sleep monitoring and daily objects. For example, we proposed two systems dubbed E-eyes and WiFinger, which are among the first works to utilize Wi-Fi sensing to distinguish various types of daily activity and finger gestures.”
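To give a flavor of how Wi-Fi sensing generally works, the sketch below turns windows of channel state information (CSI) amplitudes into simple statistical features and feeds them to an off-the-shelf classifier. It is a generic, synthetic-data illustration, not the pipeline behind Winect, E-eyes, or WiFinger.

```python
# Generic sketch of Wi-Fi-sensing-style activity classification:
# windowed CSI amplitudes -> simple features -> classifier.
# Synthetic data; not the Winect, E-eyes, or WiFinger pipelines.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_window(activity: str, n_packets: int = 200, n_subcarriers: int = 30) -> np.ndarray:
    """Synthesize a window of CSI amplitudes; more motion -> more temporal variation."""
    motion = {"still": 0.05, "walking": 0.5, "exercising": 1.0}[activity]
    base = rng.normal(1.0, 0.01, size=(1, n_subcarriers))
    return base + motion * rng.normal(0.0, 0.1, size=(n_packets, n_subcarriers))

def features(window: np.ndarray) -> np.ndarray:
    """Per-window features: mean, variance, and mean frame-to-frame change."""
    return np.array([window.mean(), window.var(), np.abs(np.diff(window, axis=0)).mean()])

activities = ["still", "walking", "exercising"]
X = np.array([features(make_window(a)) for a in activities for _ in range(50)])
y = np.array([a for a in activities for _ in range(50)])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([features(make_window("walking"))]))  # expected: ['walking']
```

Real systems replace the synthetic windows with CSI captured from commodity Wi-Fi hardware and use far richer features and models, but the overall shape of the pipeline is similar.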

Penn State researchers developed a prototype of a wearable, noninvasive glucose sensor, shown here on the arm. Credit: Jia Zhu, Penn State.

Penn State researchers develop first-of-its-kind wearable, noninvasive glucose monitoring device prototype.

Noninvasive glucose monitoring devices are not currently commercially available in the United States, so people with diabetes must collect blood samples or use sensors embedded under the skin to measure their blood sugar levels. Now, with a new wearable device created by Penn State researchers, less intrusive glucose monitoring could become the norm.

The University of Bristol is part of an international consortium of 13 universities that has partnered with Facebook AI to advance egocentric perception. As a result of this initiative, we have built the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.


Progress in the fields of artificial intelligence (AI) and augmented reality (AR) requires learning from the same data humans process to perceive the world. Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities—from the mundane act of opening a door to the exciting interaction of a game of football with friends.

Egocentric 4D Live Perception (Ego4D) is a massive-scale dataset that compiles 3,025 hours of footage from the wearable cameras of 855 participants in nine countries: the UK, India, Japan, Singapore, Saudi Arabia, Colombia, Rwanda, Italy, and the US. The data captures a wide range of activities from the ‘egocentric’ perspective—that is, from the viewpoint of the person carrying out the activity. The University of Bristol is the only UK representative in this diverse and international effort, collecting 270 hours from 82 participants who captured footage of their chosen activities of daily living—such as practicing a musical instrument, gardening, grooming their pet, or assembling furniture.

“In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike—they could even remind you where you left your keys,” said Principal Investigator at the University of Bristol and Professor of Computer Vision, Dima Damen.

A Japanese startup at CES is claiming to have solved one of the biggest problems in medical technology: noninvasive continuous glucose monitoring. Quantum Operation Inc., exhibiting at the virtual show, says that its prototype wearable can accurately measure blood sugar from the wrist. Looking like a knock-off Apple Watch, the prototype crams in a small spectrometer that scans the blood to measure glucose. Quantum’s pitch adds that the watch is also capable of reading other vital signs, including heart rate and ECG.

The company says that its secret sauce is its patented spectroscopy materials, which are built into the watch and its band. To use it, the wearer simply slides the watch on and activates monitoring from the menu; after around 20 seconds, the data is displayed. Quantum says that it expects to sell its hardware to insurers and healthcare providers, as well as to build a big data platform to collect and examine the vast trove of information generated by patients wearing the device.
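Optical glucose estimation of this kind generally depends on a calibration model that maps a measured spectrum to a glucose value. The sketch below shows the general idea using partial least squares regression on synthetic spectra; it is an assumption-laden illustration, not Quantum Operation’s patented method.

```python
# Illustrative sketch of spectroscopic glucose estimation: fit a calibration
# model mapping measured spectra to reference glucose values, then predict
# from new spectra. Synthetic data; not Quantum Operation's actual method.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 64

# Synthetic "spectra": a glucose-dependent absorption bump plus noise.
glucose = rng.uniform(70, 180, size=n_samples)  # mg/dL reference values
wavelength_profile = np.exp(-((np.arange(n_wavelengths) - 40) ** 2) / 50.0)
spectra = (glucose[:, None] / 100.0) * wavelength_profile \
    + rng.normal(0, 0.05, (n_samples, n_wavelengths))

# Calibrate on reference measurements (e.g. paired fingerstick readings).
model = PLSRegression(n_components=3).fit(spectra[:100], glucose[:100])

# Predict glucose for held-out spectra.
predicted = model.predict(spectra[100:]).ravel()
print("first five predictions (mg/dL):", np.round(predicted[:5], 1))
print("first five references  (mg/dL):", np.round(glucose[100:105], 1))
```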

Quantum Operation supplied a sampling of its data compared with readings from a commercial monitor, the FreeStyle Libre. At this point, there does seem to be a noticeable amount of variation between the wearable and the Libre. That, for now, may be a deal breaker for those who rely upon accurate blood glucose readings to determine their insulin dosage.
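Gaps like this are usually quantified with the mean absolute relative difference (MARD) between the device under test and a reference monitor, where lower means closer agreement. The snippet below shows the calculation on made-up paired readings, not Quantum Operation’s data.

```python
# Illustration of the mean absolute relative difference (MARD), a standard
# metric for comparing a glucose sensor against a reference monitor.
# The paired readings below are made up, not Quantum Operation's data.
import numpy as np

# Hypothetical paired readings in mg/dL: wearable prototype vs. reference monitor.
wearable = np.array([110, 145, 98, 160, 130, 175, 105])
reference = np.array([102, 151, 92, 170, 121, 182, 111])

mard = np.mean(np.abs(wearable - reference) / reference) * 100
print(f"MARD: {mard:.1f}%")  # smaller values mean closer agreement with the reference
```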

Researchers from the Georgia Institute of Technology’s Center for Human-Centric Interfaces and Engineering have created soft scalp electronics (SSE), a wearable wireless electroencephalography (EEG) device for reading human brain signals. By processing the EEG data with a neural network, the system allows users wearing the device to control a video game simply by imagining activity.
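As a generic illustration of how imagined movement can become a game command, and not the Georgia Tech team’s SSE pipeline, the sketch below extracts crude band-power features from synthetic EEG channels, trains a simple classifier in place of the neural network described above, and maps its output to a control action.

```python
# Generic sketch of motor-imagery EEG control: band-power features from a few
# channels -> classifier -> game command. Synthetic signals, and a linear
# classifier standing in for the neural network used by the Georgia Tech team.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
fs, n_channels, n_samples = 250, 4, 500  # 2-second windows at 250 Hz

def synth_trial(imagined: str) -> np.ndarray:
    """Synthesize an EEG window; 'left' vs 'right' shifts power across channels."""
    trial = rng.normal(0, 1.0, size=(n_channels, n_samples))
    boost = [0, 1] if imagined == "left" else [2, 3]
    trial[boost] *= 2.0  # stronger activity on the boosted channels
    return trial

def band_power(trial: np.ndarray) -> np.ndarray:
    """Per-channel signal power as a crude stand-in for mu/beta band power."""
    return (trial ** 2).mean(axis=1)

labels = ["left", "right"]
X = np.array([band_power(synth_trial(l)) for l in labels for _ in range(60)])
y = np.array([l for l in labels for _ in range(60)])
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Map the classifier's decision to a (hypothetical) game control action.
decision = clf.predict([band_power(synth_trial("right"))])[0]
command = {"left": "steer_left", "right": "steer_right"}[decision]
print("game command:", command)  # expected: steer_right
```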