Archive for the ‘computing’ category: Page 547

May 31, 2020

‘One-way’ electronic devices enter the mainstream

Posted by in categories: computing, internet, military, mobile phones, quantum physics, virtual reality

Waves, whether they are light waves, sound waves, or any other kind, travel in the same manner in forward and reverse directions—this is known as the principle of reciprocity. If we could route waves in one direction only—breaking reciprocity—we could transform a number of applications important in our daily lives. Breaking reciprocity would allow us to build novel “one-way” components such as circulators and isolators that enable two-way communication, which could double the data capacity of today’s wireless networks. These components are essential to quantum computers, where one wants to read a qubit without disturbing it. They are also critical to radar systems, whether in self-driving cars or those used by the military.
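To make the idea of broken reciprocity concrete, here is a minimal sketch (a generic textbook illustration, not a model of the chip described below) that writes out the scattering matrix of an ideal three-port circulator and checks that it is not symmetric; a reciprocal device would have to satisfy S equal to its transpose.

```python
import numpy as np

# Textbook S-matrix of an ideal, lossless 3-port circulator that routes
# signals in one direction only: port 0 -> port 1 -> port 2 -> port 0.
# Convention: S[j, i] is the transmission from port i to port j.
S = np.array([
    [0, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
], dtype=float)

print("Port 0 -> 1 transmission:", S[1, 0])          # 1.0, forward signal passes
print("Port 1 -> 0 transmission:", S[0, 1])          # 0.0, reverse path is blocked
print("Reciprocal (S == S^T)?", np.allclose(S, S.T)) # False: reciprocity is broken
```

In a full-duplex radio, the transmitter and receiver sit on two ports of such a device and share a single antenna on the third port, which is where the potential doubling of data capacity comes from.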

A team led by Harish Krishnaswamy, professor of electrical engineering, is the first to build a high-performance non-reciprocal device on a compact chip, with performance 25 times better than previous work. Power handling is one of the most important metrics for these circulators, and Krishnaswamy's new chip can handle several watts of power, enough for cellphone transmitters that put out a watt or so of power. The new chip was the leading performer in the DARPA SPAR (Signal Processing at RF) program to miniaturize these devices and improve their performance metrics. Krishnaswamy's group was the only one to integrate these non-reciprocal devices on a compact chip and also demonstrate performance metrics that were orders of magnitude superior to prior work. The study was presented in a paper at the IEEE International Solid-State Circuits Conference in February 2020 and published May 4, 2020, in Nature Electronics.

“For these circulators to be used in practical applications, they need to be able to handle watts of power without breaking a sweat,” says Krishnaswamy, whose research focuses on developing integrated electronic technologies for new high-frequency wireless applications. “Our earlier work performed at a rate 25 times lower than this new one—our 2017 device was an exciting scientific curiosity but it was not ready for prime time. Now we’ve figured out how to build these one-way devices in a compact chip, thus enabling them to become small, low cost, and widespread. This will transform all kinds of electronic applications, from VR headsets to 5G cellular networks to quantum computers.”

May 30, 2020

DARPA Seeks Secure Microchip Supply Chain

Posted by in categories: computing, security

“Once a chip is designed, adding security after the fact or making changes to address newly discovered threats is nearly impossible,” explains a DARPA spokesperson.

May 30, 2020

Bill Faloon — If Nothing Else Kills Us, Aging Will (Longevity #005)

Posted by in categories: biotech/medical, computing, food, life extension, neuroscience, quantum physics

https://facebook.com/LongevityFB https://instagram.com/longevityyy

https://linkedin.com/company/longevityy

- Please also subscribe and hit the notification bell and click “all” on these YouTube channels:

Continue reading “Bill Faloon — If Nothing Else Kills Us, Aging Will (Longevity #005)” »

May 30, 2020

World record internet speed achieved that is 1 million times faster than current broadband

Posted by in categories: computing, entertainment, internet

Researchers in Australia have achieved a world record internet speed of 44.2 terabits per second, allowing users to download 1,000 HD movies in a single second.

A team from Monash, Swinburne and RMIT universities used a “micro-comb” optical chip containing hundreds of infrared lasers to transfer data across existing communications infrastructure in Melbourne.
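As a rough sanity check of the "1,000 HD movies in a single second" figure, the back-of-the-envelope calculation below assumes an HD movie of about 5 GB; that file size is an assumption for illustration, not a number from the researchers.

```python
# 44.2 terabits per second, converted to an approximate movie count.
link_rate_bits_per_s = 44.2e12
movie_size_bytes = 5e9            # assumed ~5 GB per HD movie (illustrative)

bytes_per_second = link_rate_bits_per_s / 8            # ~5.5 terabytes per second
movies_per_second = bytes_per_second / movie_size_bytes
print(f"~{movies_per_second:.0f} HD movies per second")  # ~1,100
```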

May 29, 2020

Quantum-Resistant Cryptography: Our Best Defense Against An Impending Quantum Apocalypse

Posted by in categories: computing, encryption, information science, quantum physics, security

As far back as 2015, the National Institute of Standards and Technology (NIST) began asking encryption experts to submit their candidate algorithms for testing against quantum computing’s expected capabilities — so this is an issue that has already been front of mind for security professionals and organizations. But even with an organization like NIST leading the way, working through all those algorithms to judge their suitability to the task will take time. Thankfully, others within the scientific community have also risen to the challenge and joined in the research.

It will take years for a consensus to coalesce around the most suitable algorithms, roughly comparable to the time it took ECC encryption to gain mainstream acceptance. The good news is that such a timeframe should still leave the opportunity to arrive at, and widely deploy, quantum-resistant cryptography before quantum computers capable of sustaining the number of qubits necessary to seriously threaten RSA and ECC encryption become available to potential attackers.
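For context, the sketch below summarizes the standard picture of how the two best-known quantum algorithms affect today's cryptography. It is an illustrative summary of widely cited results, not an assessment of any particular NIST candidate.

```python
# Shor's algorithm breaks RSA and elliptic-curve schemes outright, while
# Grover's algorithm roughly halves the effective strength of symmetric keys.
quantum_impact = {
    "RSA-2048":  "broken by Shor's algorithm (efficient integer factoring)",
    "ECC P-256": "broken by Shor's algorithm (efficient discrete logarithms)",
    "AES-128":   "weakened by Grover's algorithm (~64-bit effective security)",
    "AES-256":   "weakened by Grover's algorithm (~128-bit effective security)",
}

for scheme, status in quantum_impact.items():
    print(f"{scheme:10} {status}")
```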

The ongoing development of quantum-resistant encryption will be fascinating to watch, and security professionals will be sure to keep a close eye on which algorithms and encryption strategies ultimately prove most effective. The world of encryption is changing more quickly than ever, and it has never been more important for the organizations dependent on that encryption to ensure that their partners are staying ahead of the curve.

May 29, 2020

DARPA Selects Teams to Increase Security of Semiconductor Supply Chain

Posted by in categories: computing, economics, internet, security

As Internet of Things (IoT) devices rapidly increase in popularity and deployment, economic attackers and nation-states alike are shifting their attention to the vulnerabilities of digital integrated circuit (IC) chips. Threats to IC chips are well known, and despite various measures designed to mitigate them, hardware developers have largely been slow to implement security solutions due to limited expertise, high cost and complexity, and a lack of security-oriented design tools integrated with supporting semiconductor intellectual property (IP). Further, when insecure circuits are used in critical systems, the lack of embedded countermeasures exposes them to exploitation. To address the growing economic and national-security threat this poses, DARPA developed the Automatic Implementation of Secure Silicon (AISS) program. AISS aims to automate the process of incorporating scalable defense mechanisms into chip designs, allowing designers to explore economics-versus-security trade-offs based on the expected application and intent, while maximizing designer productivity.

Today, DARPA is announcing the research teams selected to take on AISS’ technical challenges. Two teams of academic, commercial, and defense industry researchers and engineers will explore the development of a novel design tool and IP ecosystem – which includes tool vendors, chip developers, and IP licensors – allowing, eventually, defenses to be incorporated efficiently into chip designs. The expected AISS technologies could enable hardware developers to not only integrate the appropriate level of state-of-the-art security based on the target application, but also balance security with economic considerations like power consumption, die area, and performance.
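The kind of economics-versus-security exploration described above can be pictured with a toy model. The sketch below is purely hypothetical: the design options, costs, and scoring weights are invented for illustration and do not come from AISS tooling or data.

```python
from dataclasses import dataclass

@dataclass
class DesignOption:
    name: str
    security_level: int   # higher = more embedded countermeasures
    power_mw: float       # estimated power cost of the option
    area_mm2: float       # estimated die area
    max_clock_mhz: float  # estimated achievable clock

# Invented example options for one hypothetical chip design.
options = [
    DesignOption("baseline (no countermeasures)",             0, 120.0, 4.0, 800.0),
    DesignOption("key management + secure boot",              2, 135.0, 4.6, 780.0),
    DesignOption("full suite incl. side-channel hardening",   4, 170.0, 5.8, 700.0),
]

def score(opt: DesignOption, security_weight: float = 1.0) -> float:
    """Toy objective: reward security, penalize power, area, and lost speed."""
    return (security_weight * opt.security_level
            - 0.01 * opt.power_mw
            - 0.5 * opt.area_mm2
            + 0.001 * opt.max_clock_mhz)

# An application that values security highly picks the heavier option.
best = max(options, key=lambda o: score(o, security_weight=2.0))
print("Selected for this application:", best.name)
```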

“The ultimate goal of the AISS program is to accelerate the timeline from architecture to security-hardened RTL from one year, to one week – and to do so at a substantially reduced cost,” said the DARPA program manager leading AISS, Mr. Serge Leef.

May 27, 2020

Mobile Nuclear Microreactor Development: A Military-Civilian Symbiosis

Posted by in categories: computing, military, nuclear energy

EXECUTIVE SUMMARY: The US Department of Defense has been working with American companies for the past year on a project to develop a prototype for a portable nuclear microreactor, a device intended for use by the US military in security scenarios around the world. The US Department of Energy is also involved in the project, with the aim of providing electricity to remote sites that are difficult to link to the grid. The project thus represents a symbiosis between military and civilian technological development.

A symbiotic relationship between military and civilian aspects of technological development gained momentum in the US after the end of WWII. This was particularly visible among applications in the communication, computing, and aerospace fields, but was also present in the field of nuclear technology. Some technology projects were presented as dual-use in order to justify the cost of their development.

One example of nuclear energy symbiosis was the development of nuclear power-generating reactors. By 1956, more than a decade after the destruction of the Japanese cities of Hiroshima and Nagasaki by nuclear bombs, only the UK’s Calder Hall nuclear power plant, which had four reactors each producing 60 MW electricity (MWe), was in operation. However, as of December 2019, 443 nuclear power generators were operating worldwide, with a total output of 395 gigawatts electric (GWe)—an average output of nearly 900 MWe per reactor.
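A quick arithmetic check of the fleet-average figure quoted above:

```python
reactors = 443
total_output_gwe = 395
average_mwe = total_output_gwe * 1000 / reactors
print(f"Average output: ~{average_mwe:.0f} MWe per reactor")  # ~892, i.e. nearly 900 MWe
```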

May 27, 2020

Novel insight reveals topological tangle in unexpected corner of the universe

Posted by in categories: biological, computing, cosmology, mathematics, nanotechnology, particle physics

Just as a literature buff might explore a novel for recurring themes, physicists and mathematicians search for repeating structures present throughout nature.

For example, a certain geometrical structure of knots, which scientists call a Hopfion, manifests itself in unexpected corners of the universe, from biology to cosmology. Like the Fibonacci spiral and the golden ratio, the Hopfion pattern unites different scientific fields, and a deeper understanding of its structure and influence will help scientists develop transformative technologies.

Continue reading “Novel insight reveals topological tangle in unexpected corner of the universe” »

May 26, 2020

New 5G switches mean battery life improvements, higher bandwidth and speeds

Posted by in categories: computing, internet, mobile phones

The 5G revolution has begun, and the first lines of phones that can access the next generation of wireless speeds have already hit the shelves. Researchers at The University of Texas at Austin and the University of Lille in France have built a new component that will more efficiently allow access to the highest 5G frequencies, in a way that extends devices' battery life and speeds up tasks like streaming high-definition media.

Smartphones are loaded with switches that perform a number of duties. One major task is jumping between networks and spectrum frequencies: 4G, Wi-Fi, LTE, Bluetooth, etc. The current radio-frequency (RF) switches that perform this task are always running, consuming precious processing power and battery life.

“The switch we have developed is more than 50 times more energy efficient compared to what is used today,” said Deji Akinwande, a professor in the Cockrell School of Engineering’s Department of Electrical and Computer Engineering who led the research. “It can transmit an HDTV stream at a 100 gigahertz frequency, and that is unheard of in broadband switch technology.”

May 26, 2020

Are We In A Simulation? | Why You Are Not Real

Posted by Eric Klien in categories: computing, neuroscience

Simulation theory suggests that we might be living in a giant computer simulation. Exponential technological growth, and how far we've already come in so little time, are strong indicators that we can't possibly imagine what the future of humanity will look like in 100 years, let alone 1,000 years!
Will we be able to create simulations so indistinguishable from reality that the characters will not be aware that they are being simulated? Today on Cognitive Culture, you'll learn why you're not real!

Continue reading “Are We In A Simulation? | Why You Are Not Real” »