
Restoring And Extending The Capabilities Of The Human Brain — Dr. Behnaam Aazhang, Ph.D. — Director, Rice Neuroengineering Initiative, Rice University


Dr. Behnaam Aazhang, Ph.D. (https://aaz.rice.edu/) is the J.S. Abercrombie Professor of Electrical and Computer Engineering and Director of the Rice Neuroengineering Initiative (NEI — https://neuroengineering.rice.edu/) at Rice University. His broad research interests include signal and data processing, information theory, dynamical systems, and their applications to neuroengineering, with focus areas in (i) understanding neuronal circuit connectivity and the impact of learning on connectivity, (ii) developing minimally invasive and non-invasive real-time closed-loop stimulation of neuronal systems to mitigate disorders such as epilepsy, Parkinson's disease, depression, obesity, and mild traumatic brain injury, (iii) developing a patient-specific multisite wireless monitoring and pacing system with temporal and spatial precision to restore the healthy function of a diseased heart, and (iv) developing algorithms to detect, predict, and prevent security breaches in cloud computing and storage systems.

Dr. Aazhang received his B.S. (with highest honors), M.S., and Ph.D. degrees in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign in 1981, 1983, and 1986, respectively. From 1981 to 1985, he was a Research Assistant in the Coordinated Science Laboratory at the University of Illinois. In August 1985, he joined the faculty of Rice University. From 2006 to 2014, he held an Academy of Finland Distinguished Visiting Professorship (FiDiPro) at the University of Oulu, Oulu, Finland.

As Nvidia’s recent surge in market capitalization clearly demonstrates, the AI industry is in desperate need of new hardware to train large language models (LLMs) and other AI-based algorithms. While server and HPC GPUs are of little use for gaming, they form the foundation of the data centers and supercomputers that perform the highly parallelized computations these systems require.

When it comes to AI training, Nvidia’s GPUs have been the most desirable to date. In recent weeks, the company briefly achieved an unprecedented $1 trillion market capitalization for this very reason. However, MosaicML now emphasizes that Nvidia is just one option in a multifaceted hardware market, suggesting that companies investing in AI should not blindly spend a fortune on Team Green’s highly sought-after chips.

The AI startup tested AMD MI250 and Nvidia A100 cards, both of which are one generation behind each company’s current flagship HPC GPUs. The tests used MosaicML’s own software tools alongside the Meta-backed open-source framework PyTorch and AMD’s ROCm software stack.
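For context, here is a minimal, hedged sketch of the kind of device-agnostic PyTorch training step that makes such cross-vendor comparisons possible: on ROCm builds of PyTorch, AMD GPUs such as the MI250 are exposed through the same torch.cuda interface, so identical code can run on either vendor's hardware. The tiny linear model and random batch below are placeholders for illustration, not MosaicML's actual benchmark code.

```python
import torch

# Device-agnostic setup: on ROCm builds of PyTorch, AMD GPUs report as "cuda",
# so the same code path covers MI250 and A100 alike (falls back to CPU otherwise).
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(1024, 1024).to(device)      # stand-in for a real model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = torch.nn.MSELoss()

x = torch.randn(32, 1024, device=device)            # dummy batch
target = torch.randn(32, 1024, device=device)

# One training step, identical regardless of the underlying GPU vendor.
optimizer.zero_grad()
loss = loss_fn(model(x), target)
loss.backward()
optimizer.step()
print(f"ran one training step on {device}, loss={loss.item():.4f}")
```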

New research by the University of Liverpool could signal a step change in the quest to design the new materials that are needed to meet the challenge of net zero and a sustainable future.

In work published in the journal Nature, Liverpool researchers have shown that a mathematical algorithm can predict, with a guarantee, the structure of any material based solely on knowledge of the atoms that make it up.

Developed by an interdisciplinary team of researchers from the University of Liverpool’s Departments of Chemistry and Computer Science, the algorithm systematically evaluates entire sets of possible structures at once, rather than considering them one at a time, to accelerate identification of the correct solution.
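To make the "entire sets at once" idea concrete, below is a deliberately toy branch-and-bound sketch in Python. It is not the published Liverpool algorithm; the energy function and candidate "structures" are invented for illustration. The point is only that a valid lower bound on a whole family of candidates lets the search discard that family without evaluating each member individually.

```python
# Toy illustration (not the published algorithm): bounding the energy of an
# entire set of candidate structures lets a search discard many of them at
# once, instead of evaluating every candidate one by one.

def energy(structure):
    """Hypothetical energy of a candidate structure (lower is better)."""
    return sum((x - 0.5) ** 2 for x in structure)

def lower_bound(partial):
    """A bound valid for every structure extending `partial`:
    the remaining coordinates can only add non-negative energy here."""
    return energy(partial)

def branch_and_bound(depth, choices=(0.0, 0.5, 1.0)):
    best, best_e = None, float("inf")
    stack = [()]                      # start from the empty partial structure
    while stack:
        partial = stack.pop()
        if lower_bound(partial) >= best_e:
            continue                  # prune the whole set of extensions at once
        if len(partial) == depth:
            best, best_e = partial, energy(partial)
            continue
        stack.extend(partial + (c,) for c in choices)
    return best, best_e

print(branch_and_bound(depth=4))      # finds the lowest-energy toy "structure"
```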

Recent progress in AI has been startling. Barely a week has gone by without a new algorithm, application, or implication making headlines. But OpenAI, the source of much of the hype, only recently completed its flagship algorithm, GPT-4, and according to OpenAI CEO Sam Altman, its successor, GPT-5, hasn’t begun training yet.

It’s possible the tempo will slow in the coming months, but don’t bet on it. A new AI model as capable as GPT-4, or more so, may drop sooner rather than later.

This week, in an interview with Will Knight, Google DeepMind CEO Demis Hassabis said their next big model, Gemini, is currently in development, “a process that will take a number of months.” Hassabis said Gemini will be a mashup drawing on AI’s greatest hits, most notably DeepMind’s AlphaGo, which employed reinforcement learning to topple a champion at Go in 2016, years before experts expected the feat.

AI applications are summarizing articles, writing stories and engaging in long conversations — and large language models are doing the heavy lifting.

A large language model, or LLM, is a deep learning algorithm that can recognize, summarize, translate, predict and generate text and other forms of content based on knowledge gained from massive datasets.

Large language models are among the most successful applications of transformer models. They aren’t used only to teach AIs human languages; they are also being applied to understanding proteins, writing software code, and much more.
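As a concrete, hedged example of the text-generation capability described above, the short Python sketch below uses the Hugging Face transformers library (an assumed toolkit, not one named in the article) to prompt a small pretrained transformer language model:

```python
# Minimal sketch: prompt a small pretrained transformer LM for text generation.
# "gpt2" is used here only as a compact stand-in for a full-scale LLM.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Large language models are"
result = generator(prompt, max_new_tokens=30, num_return_sequences=1)
print(result[0]["generated_text"])
```

The same pipeline interface exposes other LLM tasks mentioned above, such as summarization and translation, by swapping the task name and model.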


New products like ChatGPT have captivated the public, but what will the actual money-making applications be? Will they amount to sporadic business success stories lost in a sea of noise, or are we at the start of a true paradigm shift? What will it take to develop AI systems that actually work?

To chart AI’s future, we can draw valuable lessons from the preceding step-change advance in technology: the Big Data era.

Scientists at Brookhaven National Laboratory have used two-dimensional condensed matter physics to understand the quark interactions in neutron stars, simplifying the study of these densest cosmic entities. This work helps to describe low-energy excitations in dense nuclear matter and could unveil new phenomena in extreme densities, propelling advancements in the study of neutron stars and comparisons with heavy-ion collisions.

Understanding the behavior of nuclear matter—including the quarks and gluons that make up the protons and neutrons of atomic nuclei—is extremely complicated. This is particularly true in our world, which is three-dimensional. Mathematical techniques from condensed matter physics that consider interactions in just one spatial dimension (plus time) greatly simplify the challenge. Using this two-dimensional approach, scientists solved the complex equations that describe how low-energy excitations ripple through a system of dense nuclear matter. This work indicates that the center of neutron stars, where such dense nuclear matter exists in nature, may be described by an unexpected form.
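As a hedged illustration of why one spatial dimension plus time can capture the low-energy physics of dense matter (the standard textbook argument, not necessarily the authors' exact construction): excitations near a Fermi surface disperse only along the radial direction, so each angular patch behaves like a (1+1)-dimensional fermion system.

```latex
% Illustrative dimensional-reduction argument (assumed, not taken from the paper):
% near a Fermi surface with Fermi momentum k_F, the dispersion linearizes as
\varepsilon(\mathbf{k}) \simeq v_F \,\bigl(|\mathbf{k}| - k_F\bigr),
% which depends on a single momentum component, so each patch of the Fermi
% surface is governed by an effectively (1+1)-dimensional Hamiltonian
H_{\mathrm{patch}} \approx v_F \int dx\, \psi^{\dagger}(x)\,(-i\partial_x)\,\psi(x).
```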

Welcome to an enlightening journey through the 7 Stages of AI, a comprehensive exploration into the world of artificial intelligence. If you’ve ever wondered about the stages of AI, or are interested in how the 7 stages of artificial intelligence shape our technological world, this video is your ultimate guide.

Artificial Intelligence (AI) is revolutionizing our daily lives and industries across the globe. Understanding the 7 stages of AI, from rudimentary algorithms to advanced machine learning and beyond, is vital to fully grasp this complex field. This video delves deep into each stage, providing clear explanations and real-world examples that make the concepts accessible for everyone, regardless of their background.

Throughout this video, we demystify the fascinating progression of AI, starting from the basic rule-based systems, advancing through machine learning, deep learning, and the cutting-edge concept of self-aware AI. Not only do we discuss the technical aspects of these stages, but we also explore their societal implications, making this content valuable for technologists, policy makers, and curious minds alike.

Leveraging our in-depth knowledge, we illuminate the intricate complexities of artificial intelligence’s 7 stages. By the end of the video, you’ll have gained a robust understanding of the stages of AI, the applications and potential of each stage, and the future trajectory of this game-changing technology.