
Geoff Hinton — Will Digital Intelligence Replace Biological Intelligence? | Vector’s Remarkable 2024

In this profound keynote, Vector co-founder Geoffrey Hinton explores the philosophical implications of artificial intelligence and its potential to surpass human intelligence. Drawing from decades of expertise, Hinton shares his growing concerns about AI’s existential risks while examining fundamental questions about consciousness, understanding, and the nature of intelligence itself.

Geoffrey Hinton is one of the founding fathers of deep learning and artificial neural networks. He was a Vice President and Engineering Fellow at Google until 2023 and is Professor Emeritus at the University of Toronto. In 2024 Hinton was awarded the Nobel Prize in Physics.

Key Topics Covered:
• The distinction between digital and analog computation in AI
• Understanding consciousness and subjective experience in AI systems
• Evolution of language models and their capabilities
• Existential risks and challenges of AI development

Timeline:
00:00 — Introduction
03:35 — Digital vs. Analog Computation
14:55 — Large Language Models and Understanding
27:15 — Super Intelligence and Control
34:15 — Consciousness and Subjective Experience
41:35 — Q&A Session

AI and Quantum Computing: Glimpsing the Near Future

Catch a glimpse of the near future as AI and quantum computing transform how we live. Eric Schmidt, who led Google as CEO for a decade, joins Brian Greene to explore the horizons of innovation, where digital and quantum frontiers collide to spark a new era of discovery.

This program is part of the Big Ideas series, supported by the John Templeton Foundation.

Participants:
Eric Schmidt.

Moderator:
Brian Greene.

WSF Landing Page: https://www.worldsciencefestival.com/

Twisted Edison: Filaments Curling at the Nanoscale Produce Light Waves that Twirl as they Travel

Bright, twisted light can be produced with technology similar to an Edison light bulb, researchers at the University of Michigan have shown. The finding adds nuance to fundamental physics while offering a new avenue for robotic vision systems and other applications for light that traces out a helix in space.

“It’s hard to generate enough brightness when producing twisted light with traditional ways like electron or photon luminescence,” said Jun Lu, an adjunct research investigator in chemical engineering at U-M and first author of the study on the cover of this week’s Science.

“We gradually noticed that we actually have a very old way to generate these photons—not relying on photon and electron excitations, but like the bulb Edison developed.”
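As general background only, and not a claim about the specific mechanism in the U-M paper, light that "twirls as it travels" is conventionally described by a helical phase front: the field of an orbital-angular-momentum beam carries a factor exp(iℓφ), where the integer ℓ counts how many times the phase winds around the beam axis per wavelength.

```latex
E(r, \phi, z, t) \;\propto\; A(r)\, e^{i \ell \phi}\, e^{i(k z - \omega t)}
```

Here A(r) is the radial beam profile, and the sign and magnitude of ℓ set the handedness and pitch of the helix traced out in space.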

Nvidia transitions to advanced CoWoS-L chip packaging, signaling a major shift for TSMC

Speaking on the sidelines of an event hosted by chip supplier Siliconware Precision Industries in Taichung, Taiwan, Huang explained the transition in Nvidia’s chip packaging requirements. “As we move into Blackwell, we will use largely CoWoS-L. Of course, we’re still manufacturing Hopper, and Hopper will use CoWoS-S. We will also transition the CoWoS-S capacity to CoWoS-L,” he stated.

Huang emphasized that this shift does not indicate a reduction in capacity but rather an increase in capacity for CoWoS-L technology. “So it’s not about reducing capacity. It’s actually increasing capacity into CoWoS-L,” he said.

CoWoS-L (Chip-on-Wafer-on-Substrate with Local Silicon Interconnect) represents a significant advancement over CoWoS-S in terms of performance and efficiency for high-end computing applications like AI and HPC.

Ultra-small neuromorphic chip learns and corrects errors autonomously

Existing computer systems keep data processing and data storage in separate devices, which makes them inefficient for complex workloads such as AI. A KAIST research team has developed a memristor-based integrated system that processes information in a way similar to our brain. It is now ready for application in a range of devices: smart security cameras that can recognize suspicious activity immediately without relying on remote cloud servers, and medical devices that help analyze health data in real time.

The joint research team of Professor Shinhyun Choi and Professor Young-Gyu Yoon of the School of Electrical Engineering has developed a next-generation, neuromorphic-semiconductor-based ultra-small computing chip that can learn and correct errors on its own. The research is published in the journal Nature Electronics.

What is special about this computing chip is that it can learn and correct the errors caused by non-ideal device characteristics, a problem that has been difficult to solve in existing neuromorphic devices. For example, when processing a video stream, the chip learns to automatically separate a moving object from the background, and it becomes better at this task over time.
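To make the task concrete, here is a minimal software sketch of the kind of foreground/background separation described above, using a simple running-average method. This is only an illustration of the task in conventional code; it is not the chip's memristor-based algorithm, and the function name and parameters (alpha, threshold) are illustrative choices, not from the article.

```python
import numpy as np

def separate_moving_objects(frames, alpha=0.05, threshold=30):
    """Toy running-average background subtraction (illustrative only)."""
    background = frames[0].astype(np.float32)
    masks = []
    for frame in frames[1:]:
        frame = frame.astype(np.float32)
        # Pixels that differ strongly from the slowly updated background
        # are treated as foreground (the moving object).
        diff = np.abs(frame - background)
        masks.append(diff > threshold)
        # Slowly adapt the background so gradual scene changes are absorbed.
        background = (1 - alpha) * background + alpha * frame
    return masks

# Example with synthetic 8-bit grayscale frames (e.g. a short 64x64 clip).
frames = np.random.randint(0, 255, size=(10, 64, 64), dtype=np.uint8)
masks = separate_moving_objects(frames)
print(len(masks), masks[0].shape)
```

A cloud-free security camera would run this kind of loop on-device; the appeal of the memristor chip is doing the equivalent computation in memory rather than shuttling frames to a separate processor.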

100x Faster: Light-Powered Memory That’s Revolutionizing Computing

A new era in computing is emerging as researchers overcome the limitations of Moore’s Law through photonics.

This cutting-edge approach boosts processing speeds and slashes energy use, potentially revolutionizing AI and machine learning.

Machine learning is a subset of artificial intelligence (AI) concerned with algorithms and statistical models that let computers learn from data and make predictions or decisions without being explicitly programmed. It is used to identify patterns in data, classify data into categories, or make predictions about future events, and it falls into three main types of learning: supervised, unsupervised, and reinforcement learning.
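As a quick illustration of the first of those categories, the sketch below fits a line to labeled data by gradient descent. It is a generic supervised-learning example, unrelated to the photonic-memory hardware itself; the variable names and learning rate are arbitrary choices for the demonstration.

```python
import numpy as np

# Supervised learning in miniature: fit y ~ w*x + b from labeled examples.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x + 0.5 + rng.normal(scale=0.1, size=200)   # labeled training data

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * x + b
    grad_w = 2 * np.mean((pred - y) * x)   # d(MSE)/dw
    grad_b = 2 * np.mean(pred - y)         # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}  (true values: 3.0, 0.5)")
```

Unsupervised learning would drop the labels y and look for structure in x alone, while reinforcement learning would instead learn from rewards received for actions taken in an environment.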