Archive for the ‘computing’ category: Page 373

Mar 3, 2022

Dark energy: Neutron stars will tell us if it’s only an illusion

Posted in categories: computing, cosmology, mathematics

A huge amount of mysterious dark energy is needed to explain cosmological phenomena, such as the accelerated expansion of the Universe, using Einstein’s theory. But what if dark energy is just an illusion, and general relativity itself has to be modified? A new SISSA study, published in Physical Review Letters, offers a new approach to answering this question. Thanks to a huge computational and mathematical effort, the scientists produced the first-ever simulations of merging binary neutron stars in theories beyond general relativity that reproduce dark-energy-like behavior on cosmological scales. This allows Einstein’s theory to be compared with modified versions of it and, with sufficiently accurate data, may solve the dark energy mystery.

For about 100 years now, general relativity has been very successful at describing gravity in a variety of regimes, passing every experimental test on Earth and in the solar system. However, to explain cosmological observations such as the accelerated expansion of the Universe, we need to introduce dark components, such as dark matter and dark energy, which still remain a mystery.

Enrico Barausse, astrophysicist at SISSA (Scuola Internazionale Superiore di Studi Avanzati) and principal investigator of the ERC grant GRAMS (GRavity from Astrophysical to Microscopic Scales), questions whether dark energy is real or whether it may instead be interpreted as a breakdown of our understanding of gravity. “The existence of dark energy could be just an illusion,” he says. “The accelerated expansion of the Universe might be caused by some yet unknown modifications of general relativity, a sort of ‘dark gravity.’”
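As a schematic illustration of what “modifying general relativity” can mean here, theories of this kind are often built by adding a scalar field to the Einstein-Hilbert action; a k-essence-type action is one common example (a sketch of the general family, not necessarily the exact theory simulated in the study):

S = \int d^4x\,\sqrt{-g}\left[\frac{R}{16\pi G} + K(\phi, X)\right] + S_{\mathrm{m}}[g_{\mu\nu},\Psi], \qquad X \equiv -\frac{1}{2}\nabla_\mu\phi\,\nabla^\mu\phi

where R is the Ricci scalar, \phi is the extra scalar field, and matter \Psi couples to the metric g_{\mu\nu}. For suitable kinetic functions K, the scalar drives an accelerated cosmic expansion without a cosmological constant, mimicking dark energy on large scales, while the theory must still pass the solar-system and binary-pulsar tests that general relativity handles so well.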

Mar 3, 2022

Elon Musk claims Neuralink’s brain implants can take ‘photos’ of memories and help paraplegics walk again

Posted in categories: computing, Elon Musk, space travel

Elon Musk has a knack for accomplishing feats that others consider improbable. From blasting rockets into space to becoming the king of the EV industry, Musk is determined to make history.

His latest passion project is Neuralink—a company that is developing a brain implant that will link the human brain directly to computers. He claims this brain-computer interface (BCI) will enable humans to carry out actions through thought alone. One of Musk’s first goals: helping paraplegics regain their independence.

But it doesn’t stop there. The company’s technology, Musk hopes, will one day not only treat but cure brain disorders and even save memories so people can revisit them like photo albums.

Mar 3, 2022

Novel design greatly improves output from commercial circuit boards next to superconducting qubits

Posted in categories: computing, quantum physics

Researchers at the National Institute of Standards and Technology (NIST) have constructed and tested a system that allows commercial electronic components—such as microprocessors on circuit boards—to operate in close proximity to ultra-cold devices employed in quantum information processing. The design allows four times as much data to be output for the same number of connected wires.

In the rising excitement about quantum computing, it can be easy to overlook the physical fact that the data produced by manipulating quantum bits (qubits) at cryogenic temperatures, a few thousandths of a degree above absolute zero, still has to be initiated, read out, and stored using conventional electronics, which presently work only at room temperature, several meters away from the qubits. This separation has obstructed the development of quantum computing devices that outperform their classical counterparts.

That extra distance between the quantum computing elements and the external electronics requires extra time for signals to travel, which also causes signals to degrade. In addition, each (comparatively very hot) wire needed to connect the electronics to the cryogenic components adds heat, making it hard to maintain the ultracold temperature required for the quantum devices to work.
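The article does not spell out the encoding NIST used, but the four-to-one gain is the classic payoff of multiplexing: several logical channels share one physical line. A minimal time-division sketch in Python, purely to illustrate the wire-count arithmetic (the channel count and framing are assumptions, not NIST’s actual cryogenic scheme):

```python
# Illustrative only: interleave 4 logical channels onto 1 serial line
# (4x more data per wire), then recover them at the receiving end.

def tdm_multiplex(channels):
    """One frame per time step: one sample from each channel, in order."""
    assert len({len(c) for c in channels}) == 1, "channels must be equal length"
    stream = []
    for frame in zip(*channels):
        stream.extend(frame)
    return stream

def tdm_demultiplex(stream, n_channels):
    """Split the serial stream back into its logical channels."""
    return [stream[i::n_channels] for i in range(n_channels)]

qubit_readouts = [[0, 1, 1], [1, 0, 1], [0, 0, 1], [1, 1, 0]]  # 4 channels
line = tdm_multiplex(qubit_readouts)        # travels over a single wire
assert tdm_demultiplex(line, 4) == qubit_readouts
```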

Mar 3, 2022

Researchers show they can steal data during homomorphic encryption

Posted in categories: computing, encryption, mathematics, security

Homomorphic encryption is considered a next-generation data security technology, but researchers have identified a vulnerability that allows them to steal data even as it is being encrypted.

“We weren’t able to crack homomorphic encryption using mathematical tools,” says Aydin Aysu, senior author of a paper on the work and an assistant professor of computer engineering at North Carolina State University. “Instead, we used side-channel attacks. Basically, by monitoring power consumption in a device that is encoding data for homomorphic encryption, we are able to read the data as it is being encrypted. This demonstrates that even next-generation encryption technologies need protection against side-channel attacks.”

Homomorphic encryption is a way of encrypting data so that third parties cannot read it. However, it still allows third parties and third-party technologies to conduct operations using the data. For example, a user could use homomorphic encryption to upload sensitive data to a cloud computing system in order to perform analyses of the data. Programs in the cloud could perform the analyses and send the resulting information back to the user, but those programs would never actually be able to read the sensitive data.
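As a toy illustration of “computing on data you cannot read”: textbook RSA happens to be multiplicatively homomorphic, so multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. This Python sketch only demonstrates that property; practical homomorphic encryption schemes are lattice-based and far more capable, and these parameters are deliberately tiny and insecure:

```python
# Toy demo of a homomorphic property using textbook RSA. Insecure,
# illustration only -- real homomorphic encryption uses lattice schemes.

p, q = 61, 53
n = p * q                       # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                          # public exponent, coprime to phi
d = pow(e, -1, phi)             # private exponent (Python 3.8+)

def encrypt(m): return pow(m, e, n)
def decrypt(c): return pow(c, d, n)

c1, c2 = encrypt(7), encrypt(6)
c_prod = (c1 * c2) % n          # a third party multiplies ciphertexts only...
assert decrypt(c_prod) == 42    # ...yet the result decrypts to 7 * 6
```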

Mar 3, 2022

The benefits of peripheral vision for machines

Posted in category: computing

New research from MIT suggests that a certain type of computer vision model, one trained to be robust to imperceptible noise added to image data, encodes visual representations in a way similar to how humans use peripheral vision.
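The robustness referred to here is usually obtained by adversarial training: perturb each batch of images to maximize the loss, then train on the perturbed images. A minimal PyTorch sketch of one such step using the FGSM attack (the model, data, and epsilon are placeholders, and this is a generic recipe rather than the MIT authors’ exact setup):

```python
import torch
import torch.nn.functional as F

def adversarial_training_step(model, x, y, optimizer, eps=8 / 255):
    """One step of FGSM adversarial training; illustrative hyperparameters."""
    # 1) Craft an imperceptible perturbation that maximizes the loss.
    x_adv = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x_adv), y).backward()
    with torch.no_grad():
        x_adv = (x + eps * x_adv.grad.sign()).clamp(0, 1)  # stay a valid image
    # 2) Train the model to classify the perturbed images correctly.
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```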

Mar 3, 2022

Will the Pandemic Create a Generation of Lost Youth Who Have Seen Their Education Disrupted?

Posted in categories: biotech/medical, computing, education, internet

Governments need to ensure ubiquitous access to broadband Internet and quality desktops and laptops, as well as tutoring programs.

Mar 2, 2022

Small, diamond-based quantum computers could be in our hands within five years

Posted in categories: computing, quantum physics

Circa 2021


Small, affordable, ‘plug-and-play’ quantum computing is one step closer. An Australian startup has won $13 million to make its diamond-based computing cores shine. Now it needs to grow.

ANU research spinoff Quantum Brilliance has found a way to use synthetic diamonds to drive quantum calculations. Now it’s on a five-year quest to produce commercially viable Quantum Accelerators. The goal is a card that can be plugged into any existing computer system, much as graphics cards are today.

Mar 1, 2022

Illinois farmers push for right to repair their own equipment

Posted in categories: computing, food, sustainability

These days, new tractors and combines are more like big computers and require special tools to repair. Farmers say they’re having to travel farther and pay more to fix their machines and keep their harvest schedules on track. Jim Birge grew up farming in central Illinois and is now the Manager of the Sangamon County Farm Bureau in Springfield. He describes how new tractors and combines have gone high-tech, and how farmers no longer have access to the tools to fix them.

Mar 1, 2022

Digital Twins: The Virtual Future Of Healthcare

Posted in categories: biotech/medical, computing, neuroscience

While advancements in healthcare have come in leaps and bounds since the 20th century, perhaps none is more exciting than what digital twin technology could offer. This technology has the potential to revolutionize the healthcare industry, ultimately leading to improved research capabilities and patient outcomes.

Defined as the virtual representation of a physical object or system across its life cycle, a digital twin is a computer program that uses real-world data to create simulations that can predict the outcomes of a product or process. A concept initially utilized by NASA in the 1960s, this technology has grown exponentially in the last decade and is now expanding into the world of healthcare.

Beginning in 2014 with The Living Heart Project headed by Dassault Systèmes, healthcare research with digital twins has broadened to include organs such as the brain and lungs, as well as projects for virtual parts of the body. With these models, doctors have the potential to discover illnesses before they develop, experiment with treatments, and improve surgical outcomes. They allow clinicians to test treatments across a vast range of therapies, equipment, and interventions by comparing possible outcomes without taking any risks with patient safety. Ultimately, when digital twins are utilized, care can become more precise, targeted, and based on the most accurate data available.
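In software terms, a digital twin is a model kept in sync with live measurements and queried for what-if predictions. A deliberately simplified Python sketch (the one-compartment drug-clearance model and every number in it are illustrative assumptions, nothing like a clinical tool):

```python
# Toy "digital twin" of a patient: synced with real measurements, then used
# to compare candidate treatments in simulation. Illustrative only.

class PatientTwin:
    def __init__(self, clearance_rate=0.10):
        self.clearance_rate = clearance_rate   # fraction of drug cleared/hour
        self.drug_level = 0.0                  # current plasma concentration

    def sync(self, measured_level):
        """Update the twin from a real-world lab or sensor measurement."""
        self.drug_level = measured_level

    def simulate(self, dose_per_hour, hours):
        """Predict levels under a candidate dosing plan, with no patient risk."""
        level, trajectory = self.drug_level, []
        for _ in range(hours):
            level = level * (1 - self.clearance_rate) + dose_per_hour
            trajectory.append(round(level, 2))
        return trajectory

twin = PatientTwin()
twin.sync(measured_level=2.0)            # latest real measurement
for dose in (0.5, 1.0):                  # compare two regimens virtually
    print(dose, twin.simulate(dose, hours=6))
```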

Feb 28, 2022

Linus Torvalds prepares to move the Linux kernel to modern C

Posted in category: computing

We all know Linux is written in C. What you may not know is that it’s written in a long-outdated C dialect: the 1989 version of the C language standard, C89, also known as ANSI X3.159-1989, or ANSI C. Linus Torvalds has decided that enough is enough and will move Linux’s official C to the 2011 C11 standard.

