
At the time of writing, scientists and engineers still haven’t figured out how to replicate, using photonics, every component that currently exists inside semiconductor processors. Computation is nonlinear: it requires different signals to interact with one another and change the behavior of other components. You need to build logic gates, much as semiconductor transistors are used to create logic gates, but photons don’t naturally behave in a way that lends itself to this approach.

This is where photonic logic comes into the picture. By using nonlinear optics, it is possible, at least in theory, to build logic gates similar to those used in conventional processors. But there are many practical and technological hurdles to overcome before photonic computers play a significant role.
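To make the role of nonlinearity concrete, here is a toy numerical sketch (the device numbers are illustrative assumptions, not a real photonic design). It models an all-optical AND gate: in a purely linear medium the two beams just add, but a Kerr-like cross term appears only when both beams are present, and that extra intensity is what pushes the detector past its threshold.

```python
# Toy model of an all-optical AND gate built from a nonlinear (Kerr-like)
# combiner plus an intensity threshold. All numbers are illustrative
# assumptions, not measured device parameters.

def optical_and(a_on: bool, b_on: bool,
                input_power: float = 1.0,
                kerr_gain: float = 0.8,
                threshold: float = 2.3) -> bool:
    """Return the logical output of a thresholded nonlinear beam combiner."""
    p_a = input_power if a_on else 0.0
    p_b = input_power if b_on else 0.0
    linear = p_a + p_b                 # what a linear medium would give (at most 2.0)
    nonlinear = kerr_gain * p_a * p_b  # cross term: nonzero only when both beams are on
    # The threshold sits above anything a linear combination can reach,
    # so the gate only fires when the nonlinear interaction kicks in.
    return (linear + nonlinear) > threshold

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"{int(a)} AND {int(b)} -> {int(optical_and(a, b))}")
```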

Leonard Susskind (Stanford University)
https://simons.berkeley.edu/events/quantum-colloquium-black-…ing-thesis.
Quantum Colloquium.

A few years ago, three computer scientists, Adam Bouland, Bill Fefferman, and Umesh Vazirani, wrote a paper that promises to radically change the way we think about the interiors of black holes. Inspired by their paper, I will explain how black holes threaten the QECTT (the quantum Extended Church-Turing Thesis), how the properties of horizons rescue the thesis, and eventually make predictions for the complexity of extracting information from behind the black hole horizon. I’ll try my best to explain enough about black holes to keep the lecture self-contained.

Panel featuring Scott Aaronson (UT Austin), Geoffrey Penington (UC Berkeley), and Edward Witten (IAS), moderated by Umesh Vazirani (UC Berkeley). 1:27:30.

In a move that could add even more fuel to the booming Central Texas high-tech sector, chipmaker NXP Semiconductors is considering a $2.6 billion expansion in Austin that would create up to 800 jobs.

The potential expansion is the latest big project for which the Austin area is in the running. Tech firm Applied Materials said in March that it’s considering Hutto for a $2.4 billion research and development center, while chipmaker Infineon Technologies said in February that it’s considering Austin for a $700 million expansion.

NXP Semiconductors, which is based in the Netherlands and has two fabrication plants in Austin, is seeking tax breaks from the Austin Independent School District under the state’s Chapter 313 incentive program for the proposed expansion. An initial presentation to the district’s board Tuesday night didn’t specify the amount, but previous incentive agreements from Texas school districts for similar Chapter 313 deals have been worth tens of millions of dollars.

Summary

The human brain has the remarkable ability to learn patterns from small amounts of data and then recognize novel instances of those patterns despite distortion and noise. Although advances in machine learning algorithms have been weakly informed by the brain since the 1940s, they do not yet rival human performance.

A new study in the Journal of Neuroscience has some answers. By scanning the brains of 24 people actively suppressing a particular memory, the team found a neural circuit that detects, inhibits, and eventually erodes intrusive memories.

A trio of brain structures makes up this alarm system. At the heart is the dACC (for “dorsal anterior cingulate cortex”), a scarf-like structure that wraps around deeper brain regions near the forehead. It acts like an intelligence agency: it monitors neural circuits for intrusive memories, and upon discovery, alerts the “executive” region of the brain. The executive then sends out an abort signal to the brain’s memory center, the hippocampus. Like an emergency stop button, this stops the hippocampus from retrieving the memory.

The entire process happens below our consciousness, suppressing unwanted memories so that they never surface to awareness.

Algorithm set for deployment in Japan could identify giant temblors faster and more reliably.


Two minutes after the world’s biggest tectonic plate shuddered off the coast of Japan, the country’s meteorological agency issued its final warning to about 50 million residents: A magnitude 8.1 earthquake had generated a tsunami that was headed for shore. But it wasn’t until hours after the waves arrived that experts gauged the true size of the 11 March 2011 Tohoku quake. Ultimately, it rang in at a magnitude 9, releasing more than 22 times the energy experts had predicted and leaving at least 18,000 dead, some in areas that never received the alert. Now, scientists have found a way to get more accurate size estimates faster, by using computer algorithms to pick out the faint gravity signals that shoot out from the fault at the speed of light.
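As a rough, illustrative check on that figure (using the standard scaling in which each unit of moment magnitude corresponds to roughly a factor of 10^1.5, about 32 times, more radiated energy):

```python
# Back-of-the-envelope check: energy ratio implied by revising the estimate
# from magnitude 8.1 up to magnitude 9, using the standard moment-magnitude
# energy scaling (a factor of 10**1.5 per magnitude unit).

def energy_ratio(m_final: float, m_initial: float) -> float:
    return 10 ** (1.5 * (m_final - m_initial))

print(round(energy_ratio(9.0, 8.1), 1))  # ~22.4, consistent with "more than 22 times"
```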

“This is a completely new [way to recognize] large-magnitude earthquakes,” says Richard Allen, a seismologist at the University of California, Berkeley, who was not involved in the study. “If we were to implement this algorithm, we’d have that much more confidence that this is a really big earthquake, and we could push that alert out over a much larger area sooner.”

Scientists typically detect earthquakes by monitoring ground vibrations, or seismic waves, with devices called seismometers. The amount of advance warning they can provide depends on the distance between the earthquake and the seismometers and on the speed of the seismic waves, which travel at less than 6 kilometers per second. Networks in Japan, Mexico, and California provide seconds or even minutes of advance warning, and the approach works well for relatively small temblors. But beyond magnitude 7, the earthquake waves can saturate seismometers. This makes the most destructive earthquakes, like Japan’s Tohoku quake, the most challenging to identify, Allen says.
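For a sense of how tight that warning window is, here is a simplified sketch (the wave speeds are typical crustal values, and it ignores the extra seconds needed for detection and alert distribution): the lead time at a given site is roughly the gap between the arrival of the fast P waves used for detection and the slower, more damaging S waves.

```python
# Simplified estimate of earthquake-warning lead time at a given distance:
# the gap between fast P-wave arrival (used for detection) and slower
# S-wave arrival (which does most of the shaking damage). Speeds are
# typical crustal values and are illustrative assumptions.

P_WAVE_KM_S = 6.0   # approximate P-wave speed
S_WAVE_KM_S = 3.5   # approximate S-wave speed

def lead_time_s(distance_km: float) -> float:
    """Seconds between P-wave and S-wave arrival at a site distance_km away."""
    return distance_km / S_WAVE_KM_S - distance_km / P_WAVE_KM_S

for d in (50, 100, 300):
    print(f"{d:3d} km from the fault: ~{lead_time_s(d):.0f} s of warning")
```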