
Jefferson Lab tests a next-generation data acquisition scheme

Nuclear physics experiments worldwide are becoming ever more data-intensive as researchers probe more deeply into the heart of matter. To get a better handle on the data, nuclear physicists are now turning to artificial intelligence and machine learning methods to help sift through the torrent in real time.

A recent test of two systems that employ such methods at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility found that they can, indeed, enable real-time processing of raw data. Such systems could result in a streamlined data analysis process that is faster and more efficient, while also keeping more of the original data for future analysis than conventional systems. An article describing this work was recently published in The European Physical Journal Plus.

The advanced Computer Vision and Artificial Intelligence technologies in X·TERROIR allow enologists to make optimal decisions about the wine destination of grapes.

X·TERROIR technology makes cost-effective phenotypic profiling of every vine in the vineyard possible, an exponential increase over what current technology allows. The more information enologists have to work their magic, the more quality and value they can extract from the vineyard.

Transcript:

To the naked eye, this vineyard looks homogeneous. One might assume that a vineyard like this will produce grapes that are fairly uniform in aromatic profile.
The reality is very different.

The grapes from this vine will produce a different wine than the grapes from vines only 50 meters away. Plant genomics, varying soil types, cultural interventions, micro-climate, and even disease result in a substantial variety of aromatic expressions in a single vineyard.

So the grapes this vine produces will embody the unique complexity of the aromatic expressions of the soil, climate, and cultural interventions of its own micro-terroir.

The end of classical Computer Science is coming, and most of us are dinosaurs waiting for the meteor to hit.

I came of age in the 1980s, programming personal computers like the Commodore VIC-20 and Apple ][e at home. Going on to study Computer Science in college and ultimately getting a PhD at Berkeley, the bulk of my professional training was rooted in what I will call “classical” CS: programming, algorithms, data structures, systems, programming languages. In Classical Computer Science, the ultimate goal is to reduce an idea to a program written by a human — source code in a language like Java or C++ or Python. Every idea in Classical CS — no matter how complex or sophisticated — from a database join algorithm to the mind-bogglingly obtuse Paxos consensus protocol — can be expressed as a human-readable, human-comprehensible program.

When I was in college in the early ’90s, we were still in the depths of the AI Winter, and AI as a field was likewise dominated by classical algorithms. My first research job at Cornell was working with Dan Huttenlocher, a leader in the field of computer vision (and now Dean of the MIT School of Computing). In Dan’s PhD-level computer vision course in 1995 or so, we never once discussed anything resembling deep learning or neural networks—it was all classical algorithms like Canny edge detection, optical flow, and Hausdorff distances. Deep learning was in its infancy, not yet considered mainstream AI, let alone mainstream CS.
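To make concrete what such a classical, hand-written algorithm looks like, here is a minimal sketch of one of the measures named above: the Hausdorff distance between two point sets, which classical vision systems used to score how well a shape template matches detected features. The point sets and names below are illustrative, not from any specific system.

```python
import math

def directed_hausdorff(A, B):
    # For each point in A, find the distance to its nearest neighbor
    # in B, then take the worst (largest) of those nearest distances.
    return max(min(math.dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    # The symmetric Hausdorff distance is the larger of the two
    # directed distances, so it penalizes unmatched points on
    # either side.
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Hypothetical example: a small template shape and a slightly
# perturbed candidate shape.
template = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
candidate = [(0.0, 0.0), (1.0, 0.0), (1.0, 2.0)]
print(hausdorff(template, candidate))  # 1.0
```

The whole procedure fits in a dozen lines a human can read and verify end to end — exactly the kind of explicit, comprehensible program the essay contrasts with learned models.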

The Florida Institute for Human & Machine Cognition (IHMC) is well known in bipedal robotics circles for teaching very complex humanoid robots to walk. Since 2015, IHMC has been home to a Boston Dynamics Atlas (the DRC version) as well as a NASA Valkyrie, and significant progress has been made on advancing these platforms toward reliable mobility and manipulation. But fundamentally, we’re talking about some very old hardware here. And there just aren’t a lot of good replacement options (available to researchers, anyway) when it comes to humanoids with human-comparable strength, speed, and flexibility.

Several years ago, IHMC decided that it was high time to build their own robot from scratch, and in 2019, we saw some very cool plastic concepts of Nadia—a humanoid designed from the ground up to perform useful tasks at human speed in human environments. After 16 (!) experimental plastic versions, Nadia is now a real robot, and it already looks pretty impressive.