
Tohoku University scientists created lab-grown neural networks using microfluidic devices, mimicking natural brain activity and enabling advanced studies of learning and memory.

The phrase “Neurons that fire together, wire together” encapsulates the principle of neural plasticity in the human brain. However, neurons grown in a laboratory dish do not typically adhere to these rules. Instead, cultured neurons often form random, unstructured networks where all cells fire simultaneously, failing to mimic the organized and meaningful connections seen in a real brain. As a result, these in-vitro models provide only limited insights into how learning occurs in living systems.

What if, however, we could create in-vitro neurons that more closely replicate natural brain behavior?

Princeton engineers have developed a scalable 3D printing technique to produce soft plastics with customizable stretchiness and flexibility, while also being recyclable and cost-effective—qualities rarely combined in commercially available materials.

In a study published in Advanced Functional Materials, a team led by Emily Davidson detailed how they used thermoplastic elastomers—a class of widely available polymers—to create 3D-printed structures with adjustable stiffness. By designing the 3D printer’s print path, the engineers could program the plastic’s physical properties, allowing devices to stretch and flex in one direction while remaining rigid in another.

Davidson, an assistant professor of chemical and biological engineering, highlighted the potential applications of this technique in fields such as soft robotics, medical devices, prosthetics, lightweight helmets, and custom high-performance shoe soles.

2024: A year when AI, quantum computing, and cybersecurity converged to redefine our digital landscape. For those navigating these complex technological frontiers, clarity became the most critical currency.

Inside Cyber: key moments that resonated with our community:

• Cybersecurity Trends for 2025: a deep dive into the evolving threat landscape and strategic priorities.

• AI, 5G, and Quantum: Innovation and Cybersecurity Risks: exploring the intersection of emerging technologies and security challenges. https://lnkd.in/ex3ktwuF

• PCI DSS v4.0 Compliance Strategies: practical guidance for adapting to critical security standards. https://lnkd.in/eK_mviZd



Scientists have discovered that future robots might be able to gauge how a person is feeling just by touching their skin. According to a new study published in the journal IEEE Access, researchers used skin conductance as a way to determine an individual's emotional state. Skin conductance is a measure of how well skin conducts electricity; it changes in response to sweat secretion and nerve activity, signifying different human emotional states.

Traditional emotion-detection technologies, such as facial recognition and speech analysis, are often prone to error, especially in suboptimal audio-visual conditions. However, scientists believe that skin conductance offers a potential workaround, providing a non-invasive way to capture emotion in real time.

For the study, the emotional responses of 33 participants were measured by showing them emotionally evocative videos and measuring their skin conductance. The findings revealed distinct patterns for different emotions: fear responses were the longest-lasting, suggesting an evolutionary alert mechanism; family bonding emotions, a blend of happiness and sadness, showed slower responses; and humour triggered quick but fleeting reactions.
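The reported patterns are essentially differences in response timing: how fast conductance rises and how long it stays elevated. As a purely illustrative sketch (not the study's actual method), the idea can be expressed as extracting timing features from a conductance trace and mapping them to the qualitative categories described; the function names, sample rate, and thresholds below are hypothetical, chosen only to mirror the reported patterns.

```python
# Illustrative sketch only: the thresholds and feature choices here are
# hypothetical assumptions, not values from the IEEE Access study.

def response_features(trace, sample_rate_hz=10):
    """Extract simple timing features from a skin-conductance trace (a list
    of conductance samples starting at stimulus onset)."""
    baseline = trace[0]
    peak = max(trace)
    # Time from stimulus onset to peak conductance (rise time).
    rise_s = trace.index(peak) / sample_rate_hz
    # How long the trace stays above the halfway point between baseline
    # and peak (a rough measure of response duration).
    half = baseline + 0.5 * (peak - baseline)
    above = [i for i, v in enumerate(trace) if v >= half]
    duration_s = (above[-1] - above[0]) / sample_rate_hz if above else 0.0
    return rise_s, duration_s

def label_response(rise_s, duration_s):
    """Map timing features to the qualitative categories described:
    fear is long-lasting, bonding is slow to build, humour is quick but brief."""
    if duration_s > 8.0:
        return "fear-like (long-lasting)"
    if rise_s > 3.0:
        return "bonding-like (slow onset)"
    return "humour-like (quick, fleeting)"
```

A real analysis pipeline would additionally filter the signal and separate tonic from phasic components, but the sketch captures the core insight: the emotions were distinguishable by the shape of the response over time, not just its amplitude.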

Uber and Lyft drivers in Phoenix and Los Angeles are facing increasing challenges as driverless taxis, notably Waymo One, enter the market. These autonomous vehicles are making an already competitive ride-hailing industry even tougher for human drivers.

According to Jacob Zinkula’s report, driverless taxis are significantly impacting the ride-hailing landscape in key markets like Phoenix and Los Angeles. Jason D., a 50-year-old Uber driver based in Phoenix, attributes his decreasing earnings to the influx of Waymo One robotaxis. He notes that heightened competition and operational costs, along with reduced fares and tips, are exacerbating income challenges for both full-time and part-time drivers.

Waymo One, operated by Alphabet, now provides more than 100,000 paid rides weekly across Los Angeles, San Francisco, and Phoenix. With planned expansions to Atlanta and Austin, these vehicles are set to be integrated into the Uber app. Despite potential regulatory hurdles and safety considerations, experts in the ride-hailing field anticipate a gradual decline in Uber and Lyft drivers' earnings as autonomous vehicles become more commonplace.