
Intel has been in the semiconductor chip manufacturing game for a while. Recently, it’s been looking to expand its foundries and production to the point where it could make cutting-edge chips for other companies, a territory we normally associate with TSMC and Samsung.

Currently, Intel is building and ramping up manufacturing in the United States, Mexico, and Germany — and these foundries will be doing a lot more than simply creating chips for the latest generation of Core processors. Earlier this year, Intel and Arm announced a partnership to build mobile SoCs on Intel’s 18A process node, and we’ve even heard from the likes of NVIDIA stating that it’s open to working with Intel to produce its hardware.

All of this makes the recent statement from Darren Grasby, the executive vice president for strategic partnerships and president of AMD EMEA, a little shocking. When asked whether Intel would succeed in its ambitious plans to build global foundries and develop chips for multiple companies, his words were harsh, to say the least.


Some harsh words from an AMD exec who doesn’t think Intel’s global foundry expansion and semiconductor manufacturing will drum up new customers.

In the year 2001, the spaceship Discovery is betrayed by its on-board computer, HAL, while on a mission to Jupiter. Nine years later, with the United States and Russia on the brink of war, the superpowers launch a joint mission to return to the Discovery in 2010: The Year We Make Contact. During the three-year voyage to Jupiter, world war breaks out on Earth, threatening to extend to the spaceship. But the ghostly presence of Dave Bowman (Keir Dullea) of the Discovery crew intervenes, warning that something grand, dangerous and wonderful is about to occur…

WASHINGTON, Oct 6 (Reuters) — Since beginning operations last year, the James Webb Space Telescope has provided an astonishing glimpse of the early history of our universe, spotting a collection of galaxies dating to the enigmatic epoch called cosmic dawn.

But the existence of what appear to be massive and mature galaxies during the universe’s infancy defied expectations — too big and too soon. That left scientists scrambling for an explanation while questioning the basic tenets of cosmology, the science of the origin and development of the universe. A new study may resolve the mystery without ripping up the textbooks.

The researchers used sophisticated computer simulations to model how the earliest galaxies evolved. These indicated that star formation in the first few hundred million years after the Big Bang (the event 13.8 billion years ago that initiated the universe) unfolded differently in those early galaxies than it does in large galaxies, like our Milky Way, that populate the cosmos today.

The method is still at a basic stage, but multiple such microscopes could be pooled to build a larger quantum computer.

Researchers at the IBS Center for Quantum Nanoscience (QNS) in Seoul, South Korea, have successfully demonstrated using a scanning tunneling microscope (STM) to perform quantum computation using electrons as qubits, a press release said.

Quantum computing is usually associated with platforms such as atom traps or superconductors, which help isolate the quantum states, or qubits, that serve as the basic unit of information. In many ways, everything in nature is quantum and can be used to perform quantum computations, as long as we can isolate its quantum states.
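To make that idea concrete, here is a minimal sketch in Python with NumPy (an illustration of the qubit concept, not code from the QNS experiment): a qubit is a two-component state vector that can sit in a superposition of 0 and 1, and measuring it yields classical outcomes with probabilities given by the Born rule.

```python
import numpy as np

# A qubit is a normalized 2-component complex vector; the basis states
# |0> and |1> play the role of the classical bit values 0 and 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Unlike a classical bit, a qubit can occupy a superposition. The
# Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement collapses the state; outcome probabilities are the
# squared magnitudes of the overlaps with the basis states (Born rule).
probs = np.array([abs(np.vdot(ket0, psi)) ** 2, abs(np.vdot(ket1, psi)) ** 2])
outcome = np.random.choice([0, 1], p=probs)
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}, measured: {outcome}")
```

The hard part the article alludes to is keeping such a state isolated: any stray interaction with the environment effectively performs an unwanted measurement and destroys the superposition.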

Tinkerers, developers, and general-purpose nerds will be happy to hear that Raspberry Pi CEO Eben Upton was wrong about the Raspberry Pi 5. When asked about the foundation’s fifth-generation single-board computer late last year, Upton said we should not expect to see it in 2023. But surprise, the Raspberry Pi 5 is launching this month with a big performance boost and a reasonable price.


The new model, which will be much faster, starts at just $60 with 4GB of RAM.

In a new Physical Review Letters study, scientists have presented a proof of concept for a randomness-free test of quantum correlations and non-projective measurements, offering a groundbreaking alternative to traditional quantum tests that rely on random inputs.

“Quantum correlation” is a fundamental phenomenon in quantum mechanics and one that is central to quantum applications like communication, cryptography, computing, and information processing.

Bell’s inequality, or Bell’s theorem, named after physicist John Stewart Bell, is the standard test used to determine the nature of such correlations. However, one of the challenges with using Bell’s theorem is its requirement of a random seed for selecting measurement settings.
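To see why those settings matter, consider the CHSH form of Bell’s theorem: two parties each pick between two analyzer angles (conventionally at random, which is precisely the seed requirement the new study aims to remove), and any local hidden-variable model keeps the combined correlation S at or below 2. The following sketch in Python uses the textbook singlet-state correlation, not anything from the paper, to show the quantum violation:

```python
import numpy as np

# CHSH test: S = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| <= 2 for any
# local hidden-variable model. Quantum mechanics predicts
# E(a,b) = -cos(a - b) for the singlet state, violating the bound.

def E(a, b):
    # Singlet-state correlation for analyzer angles a and b.
    return -np.cos(a - b)

a, a_prime = 0.0, np.pi / 2            # Alice's two measurement settings
b, b_prime = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement settings

S = abs(E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime))
print(f"S = {S:.3f} (classical bound 2, quantum maximum {2 * np.sqrt(2):.3f})")
```

In a real experiment the settings must be chosen freely and unpredictably on each trial, or a hidden-variable model could exploit the pattern; that is the requirement a randomness-free test has to replace by other means.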

The famous Copenhagen Interpretation favored by the founders of quantum mechanics is most definitely psi-epistemic. Niels Bohr, Werner Heisenberg, and others saw the state vector as being related to our interactions with the Universe. As Bohr said, “Physics is not about how the world is; it is about what we can say about the world.”

QBism is also definitively psi-epistemic, but it is not the Copenhagen Interpretation. Its epistemic focus grew organically from its founders’ work in quantum information science, which is arguably the most important development in quantum studies over the last 30 years. As physicists began thinking about quantum computers, they recognized that seeing the quantum in terms of information — an idea with strong epistemic grounding — provided new and powerful insights. By taking the information perspective seriously and asking, “Whose information?” the founders of QBism began a fundamentally new line of inquiry that, in the end, doesn’t require science fiction ideas like infinite parallel universes. That to me is one of its great strengths.

But, like all quantum interpretations, there is a price to be paid by QBism for its psi-epistemic perspective. The perfectly accessible, perfectly knowable Universe of classical physics is gone forever, no matter what interpretation you choose. We’ll dive into the price of QBism next time.

Quantum physicists have simulated super diffusion in quantum particles on a quantum computer, paving the way for deeper insights into condensed matter physics and materials science. This achievement, realized on a 27-qubit system programmed remotely from Dublin, emphasizes the potential of quantum computing in both commercial and fundamental physics inquiries.

Quantum physicists at Trinity, working alongside IBM Dublin, have successfully simulated super diffusion in a system of interacting quantum particles on a quantum computer.

This is the first step in performing highly challenging quantum transport calculations on quantum hardware and, as the hardware improves over time, such work promises to shed new light on condensed matter physics and materials science.
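As a rough picture of the physics being simulated (a classical toy model in Python, not the Trinity/IBM circuit code): in ordinary diffusion the width of an initially localized disturbance grows as t^(1/2), while superdiffusive transport in Heisenberg spin chains grows faster, as t^(2/3), corresponding to dynamical exponent z = 3/2. A heavy-tailed random walk is a convenient stand-in that reproduces the same scaling contrast:

```python
import numpy as np

# Toy contrast of diffusion vs. superdiffusion. This is NOT the quantum
# circuit run by the Trinity/IBM team; a Levy flight is used here only
# because it reproduces the t**(2/3) spreading (dynamical exponent
# z = 3/2) characteristic of superdiffusive Heisenberg spin chains.

rng = np.random.default_rng(seed=0)
walkers, steps = 1_000, 10_000
times = np.array([100, 1_000, 10_000])

# Ordinary random walk: unit steps, width grows ~ t**(1/2).
diffusive = np.cumsum(rng.choice([-1.0, 1.0], size=(walkers, steps)), axis=1)

# Levy flight: heavy-tailed step lengths (Pareto index 1.5),
# width grows ~ t**(2/3).
signs = rng.choice([-1.0, 1.0], size=(walkers, steps))
lengths = 1.0 + rng.pareto(1.5, size=(walkers, steps))
superdiffusive = np.cumsum(signs * lengths, axis=1)

for name, walk in (("diffusive", diffusive), ("superdiffusive", superdiffusive)):
    width = np.median(np.abs(walk[:, times - 1]), axis=0)  # robust spread
    exponent = np.log(width[-1] / width[0]) / np.log(times[-1] / times[0])
    print(f"{name:>14}: width ~ t^{exponent:.2f}")
```

On the actual hardware the analogous quantity is a spin correlation spreading through a chain of qubits; the growth exponent extracted from it is what distinguishes superdiffusion from ordinary diffusion.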

Physicists have performed the first quantum calculations to be carried out using individual atoms sitting on a surface.

The technique, described on 5 October in Science, controls titanium atoms by beaming microwave signals from the tip of a scanning tunnelling microscope (STM). It is unlikely to compete any time soon with the leading approaches to quantum computing, including those adopted by Google and IBM, as well as by many start-up companies. But the tactic could be used to study quantum properties in a variety of other chemical elements or even molecules, say the researchers who developed it.

At some level, everything in nature is quantum and can, in principle, perform quantum computations. The hard part is to isolate quantum states called qubits — the quantum equivalent of the memory bits in a classical computer — from environmental disturbances, and to control them finely enough for such calculations to be achieved.

With millions of tons of human waste, we could make mountains of graphene microchips.


A trio of researchers, two from the University of Chemistry and Technology, Prague, and one from the University of Toronto, has demonstrated that chicken feces can be used to make graphene a better catalyst. In their paper published in the journal ACS Nano, Lu Wang, Zdenek Sofer and Martin Pumera argue that researchers churning out papers describing newly found dopants for graphene are not contributing to the understanding of graphene’s electrocatalytic abilities.

Graphene has been found to have conductivity and strength characteristics that make it a desirable material for use in commercial products. Some have suggested it might also make an excellent catalyst if the right dopant can be found. To that end, researchers have been testing a wide range of materials as dopants for graphene.

In their paper, Pumera et al. argue that rather than simply testing materials one after another, researchers might make better use of their time by devising experiments designed to better understand the fundamentals of graphene’s electrocatalytic abilities. To drive their point home, they wondered whether any “crap” they tested would work as a possible dopant; to find out, they tested chicken crap. They prepared samples of graphene oxide using two different methods, then combined each with chicken feces and used thermal exfoliation on the results to make graphene.