Computing has already accelerated scientific discovery. Now scientists say a combination of advanced AI with next-generation cloud computing is turbocharging the pace of discovery to speeds unimaginable just a few years ago.

Microsoft and the Pacific Northwest National Laboratory (PNNL) in Richland, Washington, are collaborating to demonstrate how this acceleration can benefit chemistry and materials science – two scientific fields pivotal to finding energy solutions that the world needs.

Scientists at PNNL are testing a new battery material that was found in a matter of weeks, not years, as part of the collaboration with Microsoft to use advanced AI and high-performance computing (HPC), a type of cloud-based computing that combines large numbers of computers to solve complex scientific and mathematical tasks.
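
According to reporting on the collaboration, machine-learning models were used to score a very large pool of candidate materials so that expensive physics simulations and lab work could be reserved for a short list. The sketch below is a minimal illustration of that funnel pattern only; the synthetic data, features, model choice, and cutoff are hypothetical and do not represent PNNL's or Microsoft's actual pipeline.

```python
# Minimal sketch of an AI-accelerated materials-screening funnel.
# Hypothetical example only -- not the actual Microsoft/PNNL workflow.
# Idea: a cheap ML surrogate scores a huge candidate pool so that costly
# physics simulations (and lab testing) run on only a few survivors.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend feature vectors for candidate materials (e.g., composition descriptors).
n_candidates = 100_000
features = rng.normal(size=(n_candidates, 8))

# A small labeled set where an expensive simulation already provided a target
# property (e.g., predicted ionic conductivity) -- purely synthetic here.
n_labeled = 2_000
X_train = rng.normal(size=(n_labeled, 8))
y_train = X_train[:, 0] * 2.0 - X_train[:, 3] + rng.normal(scale=0.1, size=n_labeled)

# Train the surrogate and score every candidate in one cheap pass.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)
scores = surrogate.predict(features)

# Keep only the top few candidates for costly follow-up (HPC simulation, then lab).
top_k = 20
shortlist = np.argsort(scores)[::-1][:top_k]
print(f"Screened {n_candidates:,} candidates; {top_k} passed to simulation:", shortlist[:5], "...")
```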

Without a sufficiently comprehensive set of big data, AI algorithms are more likely to generate an inaccurate or incomplete model. Insufficient data leads to a model that cannot predict outcomes with the level of accuracy needed in the real world.
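
As a rough illustration of that point, the toy sketch below (synthetic data and an arbitrary model, not tied to any system mentioned here) fits the same model on increasingly large training sets and reports held-out error, which typically shrinks as more data becomes available.

```python
# Toy illustration: the same model trained on more data usually generalizes better.
# Synthetic data and an arbitrary model -- for illustration only.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(5_000, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.5, size=5_000)  # noisy linear target

X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

for n in (50, 200, 1_000, 4_000):
    model = Ridge().fit(X_pool[:n], y_pool[:n])
    err = mean_absolute_error(y_test, model.predict(X_test))
    print(f"trained on {n:>5} samples -> test MAE {err:.3f}")
```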

Anyone with experience in the art market also knows that markets can fluctuate without any indication as to why, and AI will not have the answer. Tech entrepreneur Boris Pevzner, founder of the AI-powered data platform Live Art, asserts that while AI can serve as an indicator, it cannot reliably predict real-world auction prices.

Although AI is becoming increasingly prevalent in the art business, it does not have to be seen as a threat. Rather than a replacement for human expertise, it should be seen as a tool to be used alongside humans to improve the quality of their work.

Lethal drones with facial recognition, armed robots, autonomous fighter jets: we’re at the dawn of a new age of AI-powered warfare, says technologist Alexandr Wang. He explores why data will be the secret weapon in this uncharted landscape and emphasizes the need to consider national security when developing new tech — or potentially face all-out AI warfare.

Watch more: https://go.ted.com/alexandrwang.

We may not have reached artificial general intelligence (AGI) yet, but as one of the leading experts in the theoretical field claims, it may get here sooner rather than later.

During his closing remarks at this year’s Beneficial AGI Summit in Panama, computer scientist and haberdashery enthusiast Ben Goertzel said that although people most likely won’t build human-level or superhuman AI until 2029 or 2030, there’s a chance it could happen as soon as 2027.

After that, the SingularityNET founder said, AGI could then evolve rapidly into artificial superintelligence (ASI), which he defines as an AI with all the combined knowledge of human civilization.

At the end of the 20th century, analog systems in computer science were widely replaced by digital systems because of their higher computing power. Nevertheless, one question remains intriguing: is the brain analog or digital? Initially, the latter view was favored, treating the brain as a Turing machine that works like a digital computer. More recently, however, digital and analog processes have been combined to implement human-like behavior in robots, endowing them with artificial intelligence (AI). Therefore, we think it is timely to compare mathematical models with the biology of computation in the brain. To this end, digital and analog processes clearly identified in cellular and molecular interactions in the central nervous system are highlighted. Beyond that, we try to pinpoint what distinguishes in silico computation from the salient features of biological computation. First, genuinely analog information processing has been observed at electrical synapses and through gap junctions, the latter in both neurons and astrocytes. In apparent contrast, neuronal action potentials (APs), or spikes, are clearly digital events, like the yes/no or 1/0 of a Turing machine. However, spikes are rarely uniform; they can vary in amplitude and width, which has significant, differential effects on transmitter release at the presynaptic terminal, even though the quantal (vesicular) release itself is digital. Conversely, at the dendritic site of the postsynaptic neuron, numerous computational events are analog. Moreover, synaptic transmission of information is not only neuronal but is heavily influenced by astrocytes, which tightly ensheath the majority of synapses in the brain (the tripartite synapse). At this point, long-term potentiation (LTP) and long-term depression (LTD), which modify synaptic plasticity and are believed to induce short- and long-term memory processes including consolidation (equivalent to RAM and ROM in electronic devices), have to be discussed. Present knowledge of how the brain stores and retrieves memories includes a variety of options (e.g., neuronal network oscillations, engram cells, the astrocytic syncytium). Epigenetic features also play crucial roles in memory formation and its consolidation, which necessarily leads to molecular events such as gene transcription and translation. In conclusion, brain computation is not only digital or analog, or a combination of both, but encompasses parallel features of higher orders of complexity.

Keywords: analog-digital computation; artificial and biological intelligence; bifurcations; cellular computation; engrams; learning and memory; molecular computation; network oscillations.

Copyright © 2023 Gebicke-Haerter.
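
To make the abstract's analog-versus-digital distinction concrete, the toy sketch below (not from the cited paper; all parameters are illustrative) simulates a leaky integrate-and-fire neuron: the membrane potential integrates its input continuously, an analog process, while the emitted spikes are all-or-none, digital events.

```python
# Toy leaky integrate-and-fire neuron (illustrative only; not from the cited paper).
# The membrane potential evolves continuously -- an analog quantity -- while the
# output spike train is a sequence of all-or-none (digital) events.

import numpy as np

dt = 1e-4          # time step (s)
tau = 0.02         # membrane time constant (s)
v_rest = -0.065    # resting potential (V)
v_thresh = -0.050  # spike threshold (V)
v_reset = -0.065   # reset potential after a spike (V)
r_m = 1e7          # membrane resistance (ohm)

t = np.arange(0.0, 0.5, dt)
i_in = 2e-9 * (1.0 + 0.5 * np.sin(2 * np.pi * 5 * t))  # slowly varying input current (A)

v = v_rest
spike_times = []
for k, current in enumerate(i_in):
    # Analog part: continuous leaky integration of the input current.
    v += (dt / tau) * (-(v - v_rest) + r_m * current)
    # Digital part: an all-or-none spike whenever the threshold is crossed.
    if v >= v_thresh:
        spike_times.append(t[k])
        v = v_reset

print(f"{len(spike_times)} spikes in {t[-1] + dt:.2f} s;",
      "first spike times:", np.round(spike_times[:5], 3))
```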