
Researchers at Heidelberg University and the University of Bern have recently devised a technique to achieve fast and energy-efficient computing using spiking neuromorphic substrates. This strategy, introduced in a paper published in Nature Machine Intelligence, is a rigorous adaptation of a time-to-first-spike (TTFS) coding scheme, together with a corresponding learning rule implemented on certain networks of artificial neurons. TTFS is a time-coding approach in which the activity of neurons is inversely proportional to their firing delay.
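TTFS coding can be made concrete with a toy example. The sketch below is not the paper's scheme (which derives spike times from the hardware's neuron dynamics); it is a minimal, hypothetical illustration of the core idea that stronger inputs fire earlier, so information is carried by the timing of a single spike per neuron rather than by firing rates.

```python
import numpy as np

def ttfs_encode(intensities, t_max=20.0):
    """Toy time-to-first-spike encoding: map each input intensity in [0, 1]
    to one spike time, with stronger inputs spiking earlier.
    Illustrative only; not the scheme from the Nature Machine Intelligence paper."""
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    # Activity is inversely related to firing delay: high intensity -> short delay.
    return t_max * (1.0 - intensities)

# Example: three input "pixels" of increasing brightness; the brightest spikes first.
print(ttfs_encode([0.1, 0.5, 0.9]))
```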

“A few years ago, I started my Master’s thesis in the Electronic Vision(s) group in Heidelberg,” Julian Goeltz, one of the leading researchers working on the study, told TechXplore. “The neuromorphic BrainScaleS system developed there promised to be an intriguing substrate for brain-like computation, given how its neuron and synapse circuits mimic the dynamics of neurons and synapses in the brain.”

When Goeltz started studying in Heidelberg, deep-learning models for spiking networks were still relatively unexplored, and existing approaches did not use spike-based communication between neurons very effectively. In 2017, Hesham Mostafa, a researcher at the University of California, San Diego, introduced the idea that the timing of individual neuronal spikes could be used for information processing. However, the neuronal dynamics he outlined in his paper were still quite different from biological ones and thus were not applicable to brain-inspired neuromorphic hardware.


Neural Magic, which provides software to facilitate deep learning deployment in edge locations, today announced a $30 million series A funding round.

The market for edge AI is exploding as more companies deploy the technology in a variety of applications across industries, including areas like asset maintenance and monitoring, factory automation, and telehealth. The market is expected to be worth $1.83 billion by 2026, according to a report by Markets and Markets.

The transaction-based communications system ensures robot teams achieve their goal even if some robots are hacked.

Imagine a team of autonomous drones equipped with advanced sensing equipment, searching for smoke as they fly high above the Sierra Nevada mountains. Once they spot a wildfire, these leader robots relay directions to a swarm of firefighting drones that speed to the site of the blaze.

But what would happen if one or more leader robots were hacked by a malicious agent and began sending incorrect directions? As follower robots were led farther from the fire, how would they know they had been duped?

The use of blockchain technology as a communication tool for a team of robots could provide security and safeguard against deception, according to a study by researchers at MIT and Polytechnic University of Madrid, which was published today in IEEE Transactions on Robotics. The research may also have applications in cities where multirobot systems of self-driving cars are delivering goods and moving people across town.
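The study's protocol is more involved, but the gist of using a blockchain as a tamper-evident log can be sketched in a few lines. The example below is a hypothetical, simplified stand-in (the function names and message format are invented for illustration): leader directions are stored as hash-chained records, so a follower that re-validates the chain can detect when a compromised leader has rewritten earlier instructions.

```python
import hashlib, json

def block_hash(body):
    """Hash a block body (direction message plus previous hash)."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_direction(chain, leader_id, direction):
    """A leader appends a direction as a new block linked to the chain tip."""
    prev = chain[-1]["hash"] if chain else "genesis"
    block = {"leader": leader_id, "direction": direction, "prev": prev}
    block["hash"] = block_hash({k: block[k] for k in ("leader", "direction", "prev")})
    chain.append(block)

def chain_is_valid(chain):
    """Follower check: every block must link to, and hash over, its predecessor."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "genesis"
        body = {k: block[k] for k in ("leader", "direction", "prev")}
        if block["prev"] != expected_prev or block["hash"] != block_hash(body):
            return False
    return True

chain = []
append_direction(chain, "leader-1", "heading 042, 3 km")
append_direction(chain, "leader-2", "heading 045, 2 km")
chain[0]["direction"] = "heading 180, 10 km"   # a hacked leader rewrites history
print(chain_is_valid(chain))                   # False: followers detect the tampering
```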

The potential for AI to deliver transformative value is almost unlimited. And yet, accessing that value is by no means a given. So how do we crack the code?

As someone who has been in the business of deploying enterprise-grade AI solutions since the earliest days of AI, from the inside as a CIO at Verizon and from the outside as an advisor to the AI company ASAPP, I know that our job as CIOs is to get transformational value out of transformational technology. And yet, as recently as 2020, McKinsey reported that less than 25 percent of companies are "seeing significant bottom-line impact" from AI.

I believe that there are at least three ways we need to shift our thinking if our organizations are going to mine the full transformational potential of AI:

Advanced Nuclear Power Advocacy For Humanity — Eric G. Meyer, Founder & Director, Generation Atomic


Eric G. Meyer is the Founder and Director of Generation Atomic (https://generationatomic.org/), a nuclear advocacy non-profit he founded after hearing about the promise of advanced nuclear reactors and deciding to devote his life to saving and expanding the use of atomic energy.

Eric worked as an organizer on several political, union, and issue campaigns while in graduate school for applied public policy, taking time off to attend the climate talks in Paris and sing opera about atomic energy.

The availability of data, or lack thereof, can paralyze a company's effort to bring software-centric products and services to market. To solve this issue, two-year-old data startup Rendered.ai is generating synthetic data for the satellite, medical, robotics and automotive industries.

At its broadest, synthetic data is manufactured rather than gathered from the real world. “When we use the term synthetic data what we really mean is engineered simulated datasets, and in particular, we focus on a physics-based simulation,” Rendered.ai CEO Nathan Kundtz explained in a recent interview with TechCrunch.
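Rendered.ai's actual tooling is proprietary, but the idea of engineered, simulated datasets can be illustrated with a hedged toy example: instead of collecting sensor data, a simple physical model renders images and hands back perfect labels for free. Everything below (the blob-shaped "target" and the noise model) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_synthetic_scene(size=64, add_target=True):
    """Toy physics-inspired simulation: a noisy sensor image with an optional
    bright 'object', returned together with its ground-truth label."""
    image = rng.normal(loc=0.1, scale=0.02, size=(size, size))  # background noise
    if add_target:
        y, x = rng.integers(8, size - 8, size=2)
        yy, xx = np.mgrid[0:size, 0:size]
        image += 0.8 * np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / 18.0)  # point-spread blob
    return image, int(add_target)

# Generate a small labeled dataset with perfect annotations "for free".
dataset = [render_synthetic_scene(add_target=bool(i % 2)) for i in range(100)]
```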

Kundtz received his PhD in physics from Duke University and cut his teeth in the space industry, heading the satellite antenna developer Kymeta Corporation. After leaving that company, he started working with other small space companies and noticed what he called a “chicken and egg” problem.

It’s not often that messing around in the lab produces a fundamental breakthrough, à la Michael Faraday with his magnets and prisms. Even more uncommon is the discovery of the same thing by two research teams at the same time: Newton and Leibniz come to mind. But every so often, even the rarest of events does happen. The summer of 2021 has been a banner season for condensed-matter physics: three separate teams of researchers have created a crystal made entirely of electrons, and one of them actually did it by accident.

The researchers were working with single-atom-thick semiconductors cooled to ultra-low temperatures. One team, led by Hongkun Park along with Eugene Demler, both of Harvard, discovered that when very specific numbers of electrons were present in the layers of these slivers of semiconductor, the electrons stopped in their tracks and stood “mysteriously still.” Eventually, colleagues recalled an old idea having to do with Wigner crystals, one of those things that existed on paper and in theory but had never been verified in real life. Eugene Wigner had calculated that, because of their mutual electrostatic repulsion, electrons confined to a monolayer would settle into a triangular lattice.
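Wigner's argument, sketched here in its standard textbook form rather than as it appears in the new papers, is a competition between energy scales: in a two-dimensional electron gas the kinetic energy per electron falls off faster with decreasing density than the Coulomb repulsion does, so at low enough density the repulsion dominates and the electrons freeze into a lattice.

```latex
% Energy scales for a 2D electron gas at areal density n (order-of-magnitude estimate):
E_{\mathrm{kin}} \sim \frac{\hbar^{2} n}{m}, \qquad
E_{\mathrm{Coul}} \sim \frac{e^{2} \sqrt{n}}{4\pi\varepsilon}, \qquad
\frac{E_{\mathrm{Coul}}}{E_{\mathrm{kin}}} \propto \frac{1}{\sqrt{n}}
% As n decreases the ratio grows, so interactions eventually dominate
% and the electrons crystallize into the triangular Wigner lattice.
```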

Park and Demler’s group was not alone in its discovery. “A group of theoretical physicists led by Eugene Demler of Harvard University, who is moving to ETH [ETH Zurich, in Switzerland] this year, had calculated theoretically how that effect should show up in the observed excitation frequencies of the excitons – and that’s exactly what we observed in the lab,” said Ataç Imamoğlu, himself of ETH. Imamoğlu’s group used the same technique to document the formation of a Wigner crystal.

For more than 20 years, D-Wave has been synonymous with quantum annealing. Its early bet on this technology allowed it to become the world’s first company to sell quantum computers, but that also somewhat limited the real-world problems its hardware could solve, given that quantum annealing works especially well for optimization problems like protein folding or route planning. But as the company announced at its Qubits conference today, a superconducting gate-model quantum computer — of the kind IBM and others currently offer — is now also on its roadmap.
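For a concrete sense of why annealers map naturally onto optimization, many such problems can be phrased as a QUBO, minimizing a quadratic energy over binary variables, which is exactly the landscape an annealer explores. The toy formulation below (a made-up "pick two of four delivery stops" problem, solved by brute force rather than on any D-Wave hardware or SDK) is only meant to show the shape of the encoding.

```python
from itertools import product

# Toy QUBO: pick exactly 2 of 4 delivery stops while minimizing cost.
# E(x) = sum_i c_i * x_i + P * (sum_i x_i - 2)^2, with x_i in {0, 1}.
# An annealer would minimize this energy landscape by annealing; here we
# simply enumerate all 2^4 bit strings.
costs = [3.0, 1.0, 4.0, 1.5]
P = 10.0  # penalty weight enforcing the "choose exactly 2" constraint

def energy(x):
    return sum(c * xi for c, xi in zip(costs, x)) + P * (sum(x) - 2) ** 2

best = min(product((0, 1), repeat=len(costs)), key=energy)
print(best, energy(best))  # picks the two cheapest stops: (0, 1, 0, 1)
```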

D-Wave believes the combination of annealing, gate-model quantum computing, and classical machines is what its business users will need to get the most value from this technology. “Like we did when we initially chose to pursue annealing, we’re looking ahead,” the company notes in today’s announcement. “We’re anticipating what our customers need to drive practical business value, and we know error-corrected gate-model quantum systems with practical application value will be required for another important part of the quantum application market: simulating quantum systems. This is an application that’s particularly useful in fields like materials science and pharmaceutical research.”