
Learn physics and maths on Brilliant! First 30 days are free and 20% off the annual premium subscription when you use our link ➜ https://brilliant.org/sabine.

Can you really send a particle into the past? New Scientist published an article about this last week, and though I’m quite fond of the concept of retrocausality, I’m afraid to say that reality is much less interesting than fiction. Let’s have a look.

Paper: https://arxiv.org/abs/2403.

🤓 Check out my new quiz app ➜ http://quizwithit.com/

In this thought-provoking lecture, Prof. Jay Friedenberg from Manhattan College delves into the intricate interplay between cognitive science, artificial intelligence, and ethics. With nearly 30 years of teaching experience, Prof. Friedenberg discusses how visual perception research informs AI design, the implications of brain-machine interfaces, the role of creativity in both humans and AI, and the necessity for ethical considerations as technology evolves. He emphasizes the importance of human agency in shaping our technological future and explores the concept of universal values that could guide the development of AGI for the betterment of society.

00:00 Introduction to Jay Friedenberg
01:02 Connecting Cognitive Science and AI
02:36 Human Augmentation and Technology
03:50 Brain-Machine Interfaces
05:43 Balancing Optimism and Caution in AI
07:52 Free Will vs Determinism
12:34 Creativity in Humans and Machines
16:45 Ethics and Value Alignment in AI
20:09 Conclusion and Future Work

SingularityNET was founded by Dr. Ben Goertzel with the mission of creating a decentralized, democratic, inclusive, and beneficial Artificial General Intelligence (AGI): one that is not dependent on any central entity, is open to anyone, and is not restricted to the narrow goals of a single corporation or even a single country.

The SingularityNET team includes seasoned engineers, scientists, researchers, entrepreneurs, and marketers. Our core platform and AI teams are further complemented by specialized teams devoted to application areas such as finance, robotics, biomedical AI, media, arts, and entertainment.

On July 5, 2024, at around 1 a.m., Earth reached its farthest point from the Sun, known as aphelion. This annual event raises an intriguing question: why are we experiencing summer heat when our planet is at its greatest distance from the Sun?

Understanding Aphelion

During aphelion, Earth is about 94.5 million miles (152 million kilometers) away from the Sun. This contrasts with perihelion, which occurs in early January, when Earth is closest to the Sun at approximately 91.4 million miles (147 million kilometers).
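Since sunlight falls off with the square of distance, a quick back-of-the-envelope check (not from the article, using the figures above) shows how little this roughly 3% change in distance affects the energy Earth receives:

```python
# Rough inverse-square comparison of sunlight at perihelion vs. aphelion.
# Distances are the approximate values quoted above.
perihelion_km = 147e6   # closest approach, early January
aphelion_km = 152e6     # farthest point, early July

# Solar flux scales as 1/r^2, so the ratio of fluxes is (r_aphelion / r_perihelion)^2.
flux_ratio = (aphelion_km / perihelion_km) ** 2
print(f"Sunlight at perihelion is ~{(flux_ratio - 1) * 100:.1f}% stronger than at aphelion")
# Prints roughly 6.9% -- a small effect compared with Earth's axial tilt,
# which is what actually produces the seasons.
```

In other words, the distance variation changes incoming sunlight by only about 7%, which is why Northern Hemisphere summer can coincide with Earth's greatest distance from the Sun.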

New research identifies the molecule KIBRA as a critical “glue” in stabilizing long-term memories by maintaining synaptic strength, offering insights into memory persistence despite ongoing cellular changes.

Whether it’s a first visit to the zoo or learning to ride a bicycle, we keep memories from childhood well into our adult years. But what explains how these memories can last nearly an entire lifetime?

A new study in the journal Science Advances, conducted by an international team of researchers, has uncovered a biological explanation for long-term memories. It centers on a molecule, KIBRA, which serves as a “glue” for other molecules, thereby solidifying memory formation.

A research team from Japan, including scientists from Hitachi, Ltd. (TSE: 6501, hereinafter Hitachi), Kyushu University, RIKEN, and HREM Research Inc. (HREM), has achieved a major breakthrough in the observation of magnetic fields at the atomic scale.

In collaboration with the National Institute of Advanced Industrial Science and Technology (AIST) and the National Institute for Materials Science (NIMS), the team used Hitachi’s atomic-resolution holography electron microscope—with a newly developed image acquisition technology and defocus correction algorithms—to visualize the magnetic fields of individual atomic layers within a crystalline solid.

Many advances in catalysis, transportation, and other fields have been made possible by the development and adoption of high-performance materials with tailored characteristics. Atom arrangement and electron behavior are among the most critical factors that dictate a crystalline material’s properties.

A combined team of roboticists from Stanford University and the Toyota Research Institute has found that adding audio data to visual data when training robots helps to improve their learning skills. The team has posted their research on the arXiv preprint server.

The researchers noted that virtually all training done with AI-based robots involves exposing them to a large amount of visual information, while ignoring associated audio. They wondered if adding microphones to robots and allowing them to collect data regarding how something is supposed to sound as it is being done might help them learn a task better.

For example, if a robot is supposed to learn how to open a box of cereal and fill a bowl with it, it may help to hear the sounds of the box being opened and of the dry cereal cascading into the bowl. To find out, the team designed and carried out four robot-learning experiments.
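As a rough illustration of the general idea (not the team's actual architecture, and with all layer sizes and names invented for the sketch), a multimodal policy can encode camera frames and microphone spectrograms separately and concatenate the two embeddings before predicting an action:

```python
import torch
import torch.nn as nn

# Illustrative sketch only: fuse visual and audio observations into a single
# action-prediction network. Shapes and layer sizes are arbitrary choices.

class AudioVisualPolicy(nn.Module):
    def __init__(self, action_dim: int = 7):
        super().__init__()
        # Encoder for RGB camera frames, expected shape (batch, 3, 64, 64)
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.LazyLinear(128), nn.ReLU(),
        )
        # Encoder for audio spectrograms, expected shape (batch, 1, 64, 64)
        self.audio = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),
        )
        # Fused features -> predicted action (e.g., an end-effector command)
        self.head = nn.Sequential(
            nn.Linear(128 + 64, 64), nn.ReLU(),
            nn.Linear(64, action_dim),
        )

    def forward(self, frame: torch.Tensor, spectrogram: torch.Tensor) -> torch.Tensor:
        # Concatenate the visual and audio embeddings along the feature axis.
        fused = torch.cat([self.vision(frame), self.audio(spectrogram)], dim=-1)
        return self.head(fused)

if __name__ == "__main__":
    policy = AudioVisualPolicy()
    frames = torch.randn(4, 3, 64, 64)     # dummy camera batch
    spectros = torch.randn(4, 1, 64, 64)   # dummy microphone batch
    print(policy(frames, spectros).shape)  # torch.Size([4, 7])
```

Whether the audio stream actually improves learning is exactly what the four experiments described above were designed to test.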