
In case people are curious about how accurate the news is, the following article says “Nvidia, AMD, and TSMC will still bear the bulk of the risk for establishing manufacturing within the United States.” In reality, neither Nvidia nor AMD makes chips; of the three, only TSMC is a chip manufacturer.


The U.S. Secretary of Commerce reminds investors that the federal government supports a sweeping shift in how and where chips are made.

Trying to make computers more like human brains isn’t a new phenomenon. However, a team of researchers from Johns Hopkins University argues that there could be many benefits to taking this concept a bit more literally by using actual neurons, though there are hurdles to clear before we get there.

In a recent paper, the team laid out a roadmap of what’s needed before we can create biocomputers powered by human brain cells (not taken from human brains, though). Further, according to one of the researchers, there are some clear benefits the proposed “organoid intelligence” would have over current computers.

“We have always tried to make our computers more brain-like,” Thomas Hartung, a researcher at Johns Hopkins University’s Environmental Health and Engineering department and one of the paper’s authors, told Ars. “At least theoretically, the brain is essentially unmatched as a computer.”

The Commerce Department opened up its application process for companies vying for a share of chip funding to boost US competitiveness with China.

The Biden administration launched its massive effort to outcompete China in semiconductor manufacturing Tuesday, offering $39 billion in funding incentives for companies seeking to build manufacturing plants in the US.

On Tuesday, the Commerce Department opened the application process for companies jockeying for a share of the funding, which was authorized by the CHIPS and Science Act last year.


The $39 billion Chips for America program is Biden’s plan to outcompete China.

In an interview with EE Times, Classiq CEO Nir Minerbi said Classiq’s academic program is an essential part of its broader strategy to expand the platform’s reach and promote the quantum computing business.

“We believe that offering this program will give students the tools and knowledge they need to learn practical quantum software-development skills while also providing researchers with a streamlined means of developing advanced quantum computing algorithms capable of taking advantage of ever more powerful quantum hardware,” he said. “In addition, our program enables students and researchers to test, validate and run their quantum programs on real hardware, providing valuable real-world experience. Ultimately, we think that our academic program will have a significant impact on the quantum computing community by promoting education and research in the field—and helping to drive innovation and progress in the industry.”

Classiq and Microsoft are among the top companies developing quantum computing software. The quantum stack developed by the firms advances Microsoft’s vision for quantum programming languages, which was published in Nature in 2020.

A new paper has been released that outlines a kind of ‘roadmap’ for biocomputers – computers that draw memory and computing power from human neurons, or brain cells.

The crux of the new work is a term called ‘organoid intelligence’ – the idea that a small group of human neurons could begin to understand its environment, learn, and remember.

But to understand this, we first have to look at what an organoid is and how it is made.

OAKLAND, Calif. Feb 28 (Reuters) — Intel Corp (INTC.O) on Tuesday released a software platform for developers to build quantum algorithms that can eventually run on a quantum computer that the chip giant is trying to build.

The platform, called Intel Quantum SDK, would for now allow those algorithms to run on a simulated quantum computing system, said Anne Matsuura, Intel Labs’ head of quantum applications and architecture.

Quantum computing is based on quantum physics and, in theory, can perform certain calculations more quickly than conventional computers.
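
To make the idea of running quantum algorithms on a simulator concrete, here is a minimal sketch of the general workflow: build a small circuit, then execute it on a classical simulator while real hardware is still being built. It uses the open-source Qiskit library purely for illustration, since the article does not show the Intel Quantum SDK’s own interface; the library choice and the circuit are assumptions, not Intel’s API.

```python
# Illustrative only: Qiskit stands in for a generic quantum SDK here;
# the Intel Quantum SDK described above exposes its own interface.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Build a two-qubit circuit that prepares an entangled (Bell) state.
circuit = QuantumCircuit(2, 2)
circuit.h(0)                     # put qubit 0 into superposition
circuit.cx(0, 1)                 # entangle qubit 1 with qubit 0
circuit.measure([0, 1], [0, 1])  # read both qubits out

# Run the algorithm on a classical simulator, as developers do today
# while large-scale quantum hardware is still under development.
simulator = AerSimulator()
counts = simulator.run(circuit, shots=1000).result().get_counts()
print(counts)  # expect roughly half '00' and half '11'
```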

During almost two years of the COVID-19 pandemic, telemedicine and new ways of reaching people have grown and changed. In October 2021, NASA flight surgeon Dr. Josef Schmid, industry partner AEXA Aerospace CEO Fernando De La Pena Llaca, and their teams were the first humans “holoported” from Earth into space.

Using the Microsoft HoloLens Kinect camera and a personal computer with custom software from Aexa, ESA (European Space Agency) astronaut Thomas Pesquet had a two-way conversation with live images of Schmid and De La Pena placed in the middle of the International Space Station. This was the first holoportation handshake from Earth in space.


Holoportation is a type of capture technology that allows high-quality 3D models of people to be reconstructed, compressed and transmitted live anywhere in real time.
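
As a rough sketch of that capture-reconstruct-compress-transmit loop, the hypothetical Python below shows the shape of such a pipeline. None of the helper functions correspond to a real HoloLens, Kinect, or Aexa API; they are placeholders used only to illustrate the flow described above.

```python
# Hypothetical sketch of a holoportation-style pipeline; the capture and
# reconstruction helpers are placeholders, not a real HoloLens/Kinect API.
import socket
import zlib

def capture_depth_frame() -> bytes:
    """Placeholder: grab one RGB-D frame from a depth camera."""
    raise NotImplementedError

def reconstruct_3d_model(frame: bytes) -> bytes:
    """Placeholder: fuse the frame into a textured 3D mesh of the person."""
    raise NotImplementedError

def stream_loop(host: str, port: int) -> None:
    """Capture, reconstruct, compress, and transmit frames in real time."""
    with socket.create_connection((host, port)) as link:
        while True:
            frame = capture_depth_frame()
            mesh = reconstruct_3d_model(frame)
            packet = zlib.compress(mesh)  # shrink the model before sending
            # Length-prefix each packet so the receiver can reassemble frames.
            link.sendall(len(packet).to_bytes(4, "big") + packet)
```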

It can identify hidden objects with 96 percent accuracy.

MIT scientists have engineered an augmented reality headset with a form of X-ray vision: it combines computer vision and wireless perception to automatically locate items that are hidden from view.

There is one catch, though: the hidden items have to be labeled with RFID tags.


MIT researchers have built an augmented reality headset that gives the wearer X-ray vision.

Rare diseases affect 6–8% of the world’s population and, although we know that small changes in the patient’s DNA cause the majority of cases, most people wait several years before they are diagnosed and potentially treated. This hunt for an explanation is extremely distressing for patients and their families, and it costs healthcare systems large sums of money in medical investigations and treatments.

Background

Even for the simplest cases, where a single change in a patient’s DNA disrupts a gene and always causes the rare disease, identifying which of the three billion base pairs in each of our genomes has changed is a huge challenge. Prior to the completion of the human genome sequence in 2003, we did not even know what the normal state of affairs was. Even then, the available sequencing technology limited us to interrogating only small parts of a patient’s genome, directed by intelligent guesswork, with mixed results.
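
To make the scale of that search concrete, here is a minimal toy sketch of comparing a patient’s sequence against a reference to locate a single-base change. The sequences and the helper function are illustrative assumptions only; real variant calling works on roughly three billion aligned bases with far more sophisticated tools.

```python
# Toy example: find single-base differences between a patient sequence and a
# reference. Real pipelines align and compare ~3 billion bases per genome.
def find_single_base_changes(reference: str, patient: str):
    """Return (position, reference_base, patient_base) for every mismatch."""
    return [
        (i, ref_base, pat_base)
        for i, (ref_base, pat_base) in enumerate(zip(reference, patient))
        if ref_base != pat_base
    ]

reference = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
patient   = "ATGGCCATTGTAATGGGCCGATGAAAGGGTGCCCGATAG"  # one base differs

for position, ref_base, pat_base in find_single_base_changes(reference, patient):
    print(f"Variant at position {position}: {ref_base} -> {pat_base}")
```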