
SICE researchers part of grant to grow quantum information science

Researchers from the School of Informatics, Computing, and Engineering are part of a group that has received a multi-million-dollar grant from IU's Emerging Areas of Research program.

Amr Sabry, a professor of informatics and computing and the chair of the Department of Computer Science, and Alexander Gumennik, assistant professor of Intelligent Systems Engineering, are part of the “Center for Quantum Information Science and Engineering” initiative led by Gerardo Ortiz, a professor of physics in IU’s College of Arts and Sciences. The initiative will focus on harnessing the power of quantum entanglement, a quantum phenomenon in which the state of two or more particles must be described in reference to one another even if the objects are spatially separated.
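As a quick illustration of what entanglement means formally (a standard textbook example, not something from the IU announcement), the two-qubit Bell state below cannot be factored into independent single-qubit states; a measurement on either particle fixes the outcome for the other, however far apart they are:

```latex
% A maximally entangled two-qubit Bell state: it cannot be written as a
% tensor product of two independent single-qubit states.
\[
  |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
  \neq \bigl(\alpha|0\rangle + \beta|1\rangle\bigr) \otimes
       \bigl(\gamma|0\rangle + \delta|1\rangle\bigr)
  \quad \text{for any amplitudes } \alpha, \beta, \gamma, \delta.
\]
```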

“Bringing together a unique group of physicists, computer scientists, and engineers to solve common problems in quantum sensing and computation positions IU at the vanguard of this struggle,” Gumennik said. “I believe that this unique implementation approach, enabling integration of individual quantum devices into a monolithic quantum computing circuit, is capable of taking the quantum information science and engineering to a qualitatively new level.”

The Limits of Neuroplasticity in the Brain

One of the brain’s mysteries is how exactly it reorganizes new information as you learn new tasks. The standard approach to date has been to test how neurons learn new behavior one neuron at a time. Carnegie Mellon University and the University of Pittsburgh decided to try a different approach: they looked at a population of neurons to see how they worked together while learning a new behavior. By studying intracortical population activity in the primary motor cortex of rhesus macaques during short-term learning in a brain–computer interface (BCI) task, they were able to observe how the population reorganizes during learning. Their new research indicates that when the brain learns a new activity, it is less flexible than previously thought. The researchers were able to draw strong hypotheses about neural reorganization during learning by using a BCI, because with a BCI the mapping between neural activity and behavior is completely known.
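To make concrete why a BCI pins down the activity-to-behavior mapping, here is a minimal, hypothetical sketch of a linear BCI decoder in Python. The weight matrix and neuron count are illustrative assumptions, not the decoder used in the Carnegie Mellon and Pittsburgh experiments; the point is simply that the experimenters choose the mapping, so it is known exactly.

```python
import numpy as np

# Hypothetical linear BCI decoder: cursor velocity is a fixed, experimenter-chosen
# linear readout of the recorded firing rates, so the mapping from neural activity
# to behavior is known exactly (unlike natural arm movement).
rng = np.random.default_rng(0)

n_neurons = 90                                # recorded motor-cortex units (illustrative)
W = rng.normal(size=(2, n_neurons)) * 0.05    # decoder weights chosen by the experimenter

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of firing rates (spikes/s) to a 2-D cursor velocity."""
    return W @ firing_rates

# One time step: the recorded population activity fully determines cursor movement.
firing_rates = rng.poisson(lam=20.0, size=n_neurons).astype(float)
velocity = decode_velocity(firing_rates)
print("cursor velocity (x, y):", velocity)
```

Because the decoder is fixed by the experimenters, any improvement in cursor control must come from changes in the population activity itself, which is what lets the researchers track neural reorganization directly.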

“In this experimental paradigm, we’re able to track all of the neurons that can lead to behavioral improvements and look at how they all change simultaneously,” says Steve Chase, an associate professor of biomedical engineering at Carnegie Mellon and the Center for the Neural Basis of Cognition. “When we do that, what we see is a really constrained set of changes that happen, and it leads to this suboptimal improvement of performance. And so, that implies that there are limits that constrain how flexible your brain is, at least on these short time scales.”

It is often challenging to quickly learn new tasks that require a high level of proficiency, and as the results of this research indicate, neural plasticity is even more constrained than previously thought.

Proposed ‘Nanomechanical’ Computer is Both Old-School and Cutting-Edge

A group of engineers has proposed a novel approach to computing: computers made of billionth-of-a-meter-sized mechanical elements. Their idea combines the modern field of nanoscience with the mechanical engineering principles used to design the earliest computers.

Weapons of the future: Here’s the new war tech Lockheed Martin is pitching to the Pentagon

Several defense contractors are developing such futuristic weapons technology for the U.S. military, hoping to get a piece of what is surely going to be a lucrative and lengthy contract.

Speaking to reporters at Lockheed Martin’s media day on Monday, CEO Marillyn Hewson touted investments in hypersonics, laser weapons, electronic warfare and artificial intelligence.

“Lockheed Martin has taken a leadership role in these four technology areas, and many others, to build an enterprise that can successfully support our customers’ rapidly evolving technology needs well into the future,” Hewson said.

Google backs its Bristlecone chip to crack quantum computing

Like every other major tech company, Google has designs on being the first to achieve quantum supremacy — the point where a quantum computer could run particular algorithms faster than a classical computer. Today it announced that it believes its latest processor, Bristlecone, is the chip that will help it get there. According to the Google Quantum AI Lab, it could provide “a compelling proof-of-principle for building larger scale quantum computers.”

One of the biggest obstacles to quantum supremacy is error rates and, by extension, scalability. Qubits (the quantum version of traditional bits) are very unstable and can be adversely affected by noise, and most of these systems can only hold a state for less than 100 microseconds. Google believes that quantum supremacy can be “comfortably demonstrated” with 49 qubits and a two-qubit error rate below 0.5 percent. Previous quantum systems by Google have given two-qubit errors of 0.6 percent, which sounds like a minuscule difference but remains significant in the world of quantum computing.
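To see why a shift from 0.6 to 0.5 percent per two-qubit gate matters, a rough back-of-the-envelope estimate helps. The sketch below is our own illustration with an assumed gate count, not Google's published analysis; it simply compounds per-gate fidelity over a deep circuit:

```python
# Rough illustration: estimated whole-circuit fidelity if every two-qubit gate
# succeeds independently with probability (1 - error_rate).
# The gate count below is an assumption for illustration, not Google's figure.
num_two_qubit_gates = 1000   # e.g. a 49-qubit circuit of depth ~40

for error_rate in (0.005, 0.006):
    circuit_fidelity = (1 - error_rate) ** num_two_qubit_gates
    print(f"per-gate error {error_rate:.1%} -> circuit fidelity ~{circuit_fidelity:.3f}")

# per-gate error 0.5% -> circuit fidelity ~0.007
# per-gate error 0.6% -> circuit fidelity ~0.002
```

Even a tenth of a percentage point per gate compounds into roughly a threefold difference in how often a deep circuit finishes without a single error, which is why the gap is treated as significant.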

However, each Bristlecone chip features 72 qubits, which may help mitigate some of this error, but as Google says, quantum computing isn’t just about qubits. “Operating a device such as Bristlecone at low system error requires harmony between a full stack of technology ranging from software and control electronics to the processor itself,” the team writes in a blog post. “Getting this right requires careful systems engineering over several iterations.”

Modified, 3D-printable alloy shows promise for flexible electronics, soft robots

Researchers in Oregon State University’s College of Engineering have taken a key step toward the rapid manufacture of flexible computer screens and other stretchable electronic devices, including soft robots.

The advance by a team within the college’s Collaborative Robotics and Intelligent Systems Institute paves the way toward the 3D printing of tall, complicated structures with a highly conductive gallium alloy.

Researchers put nickel nanoparticles into the liquid metal alloy, galinstan, to thicken it into a paste with a consistency suitable for 3D printing.

China is recruiting a new wave of astronauts from its civilians

China is intensifying its push into space, and broadening its astronaut recruiting.

The Chinese government, which plans to increase the number of manned missions in its military-backed space program to around two a year, will soon begin recruiting civilian astronauts, Yang Liwei, deputy director of the China Manned Space Engineering Office, told reporters on the sidelines of a ceremonial parliament session this weekend. That’s a departure from China’s practice of drawing its astronauts from among air force pilots.

Yang—who was China’s first man in space in 2003—said the trainees could include private-sector maintenance engineers, payload specialists, pilots, scientists, and people from universities and other research institutions, according to the Associated Press. More women are also being encouraged to apply. The loosening of restrictions comes amid NASA’s announcement that it has recruited America’s most competitive class of astronauts ever, as well as other initiatives like Canada’s Hunger Games-style search for new astronauts on the internet.

Using a laser to wirelessly charge a smartphone safely across a room

Although mobile devices such as tablets and smartphones let us communicate, work and access information wirelessly, their batteries must still be charged by plugging them into an outlet. But engineers at the University of Washington have for the first time developed a method to safely charge a smartphone wirelessly using a laser.

As the team reports in a paper published online in December in the Proceedings of the Association for Computing Machinery on Interactive, Mobile, Wearable & Ubiquitous Technologies, a narrow, invisible beam from a laser emitter can deliver charge to a smartphone sitting across a room — and can potentially charge a smartphone as quickly as a standard USB cable. To accomplish this, the team mounted a thin power cell to the back of a smartphone, which charges the smartphone using power from the laser. In addition, the team custom-designed safety features — including a metal, flat-plate heatsink on the smartphone to dissipate excess heat from the laser, as well as a reflector-based mechanism to shut off the laser if a person tries to move into the charging beam’s path.
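The shutoff itself is implemented in hardware, but the control logic can be sketched in a few lines. The Python below is only a hypothetical illustration of the reflector-based interlock idea (the photodiode and laser interfaces are invented names, not the UW team's code): low-power guard beams surround the charging beam, and if any of them is broken the emitter is disabled immediately.

```python
import time

def guard_beams_intact(photodiode_readings, threshold=0.8):
    """Return True while every guard-beam photodiode still sees its reflector.

    `photodiode_readings` is a hypothetical callable returning one normalized
    intensity per guard beam; real hardware would be polled far faster.
    """
    return all(reading >= threshold for reading in photodiode_readings())

def run_charging_session(photodiode_readings, laser, poll_interval_s=0.001):
    """Keep the charging laser on only while all guard beams are unbroken."""
    laser.enable()
    try:
        while guard_beams_intact(photodiode_readings):
            time.sleep(poll_interval_s)   # rapid polling approximates the fast shutoff
    finally:
        laser.disable()                   # any interruption (or error) kills the beam
```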

“Safety was our focus in designing this system,” said co-author Shyam Gollakota, an associate professor in the UW’s Paul G. Allen School of Computer Science & Engineering. “We have designed, constructed and tested this laser-based charging system with a rapid-response safety mechanism, which ensures that the laser emitter will terminate the charging beam before a person comes into the path of the laser.”

How China’s Massive AI Plan Actually Works

When the Chinese government released its Next Generation Artificial Intelligence Plan in July 2017, it crisply articulated the country’s ambition: to become the “world’s primary AI innovation center” by 2030. That headline goal turned heads within the global tech elite. Longtime Google CEO Eric Schmidt cited the plan as proof that China threatened to overtake the United States in AI. High-ranking American military leaders and AI entrepreneurs held it up as evidence that the United States was falling behind in the “space race” of this century. In December 2017, China’s Ministry of Industry and Information Technology followed up with a “three-year action plan,” a translation of which was recently released by New America’s DigiChina initiative.

But how do these plans actually work? There’s a tendency to place this AI mobilization within China’s longstanding tradition of centrally planned engineering achievements that have wowed the world. The rapid build-out of the country’s bullet train network stands as a monument to the power of combining central planning and deep pockets: in the span of a decade, the Chinese central government spent around $360 billion building 13,670 miles of high-speed rail (HSR) track, more mileage than the rest of the world combined.

But putting the AI plan in this tradition can be misleading. While it follows this model in form (ambitious goal set by the central government), it differs in function (what will actually drive the transformation). The HSR network was dreamed up and drawn up by central government officials, and largely executed by state-owned enterprises. In AI, the real energy is and will be with private technology companies, and to a lesser extent academia.