
The Mathematical Structure of Particle Collisions Comes Into View

And that’s where physicists are getting stuck.

Zooming in to that hidden center involves virtual particles — quantum fluctuations that subtly influence each interaction’s outcome. The fleeting existence of the quark pair above, like many virtual events, is represented by a Feynman diagram with a closed “loop.” Loops confound physicists — they’re black boxes that introduce additional layers of infinite scenarios. To tally the possibilities implied by a loop, theorists must turn to a summing operation known as an integral. These integrals take on monstrous proportions in multi-loop Feynman diagrams, which come into play as researchers march down the line and fold in more complicated virtual interactions.
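For a sense of what these integrals look like, here is the simplest textbook example, a one-loop "bubble" with a single virtual pair of mass m and external momentum p (a generic illustration, not one of the specific integrals the article has in mind):

```latex
% One-loop "bubble": sum over every possible loop momentum k
% circulating through a virtual pair of mass m, external momentum p.
I(p^2) = \int \frac{d^4 k}{(2\pi)^4}\,
  \frac{1}{\left(k^2 - m^2\right)\left((k+p)^2 - m^2\right)}
```

Each additional loop adds another four-dimensional integration and more propagator factors, which is why the difficulty grows so quickly with the number of loops.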

Physicists have algorithms to compute the probabilities of no-loop and one-loop scenarios, but many two-loop collisions bring computers to their knees. This imposes a ceiling on predictive precision — and on how well physicists can understand what quantum theory says.

Quantum computers to explore precision oncology

The most promising application in biomedicine is in computational chemistry, where researchers have long exploited a quantum approach. But the Fraunhofer Society hopes to spark interest among a wider community of life scientists, such as cancer researchers, whose research questions are not intrinsically quantum in nature.

“It’s uncharted territory,” says oncologist Niels Halama of the DKFZ, Germany’s national cancer center in Heidelberg. Working with a team of physicists and computer scientists, Halama is planning to develop and test algorithms that might help stratify cancer patients, and select small subgroups for specific therapies from heterogeneous data sets.

This is important for precision medicine, he says, but classical computing has insufficient power to find very small subgroups in the large, complex data sets that oncology, for example, generates. The time needed for such a task may stretch to many weeks, which is too long to be of use in a clinical setting, and too expensive. Moreover, the steady improvements in the performance of classical computers are slowing, thanks in large part to fundamental limits on chip miniaturization.
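To make the stratification task concrete, here is a minimal classical sketch: find a small, dense patient subgroup hidden in a high-dimensional data set. Everything in it is hypothetical toy data; it illustrates the problem Halama describes, not his team's still-in-development quantum algorithms:

```python
# Toy patient-stratification task: find a small, coherent subgroup
# hidden in a heterogeneous data set. Hypothetical data only.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# 1,000 "patients", 50 clinical/molecular features of background noise.
X = rng.normal(size=(1000, 50))
# Plant a tight subgroup: 20 patients (2%) sharing a molecular signature.
X[:20] = rng.normal(scale=0.1, size=(20, 50)) + 3.0

X = StandardScaler().fit_transform(X)

# Density-based clustering can surface small subgroups without fixing
# their number in advance; it is this kind of search that becomes
# prohibitively slow as data sets grow large and complex.
labels = DBSCAN(eps=3.0, min_samples=10).fit_predict(X)
for k in sorted(set(labels) - {-1}):
    print(f"subgroup {k}: {np.count_nonzero(labels == k)} patients")
```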

How we could Time Travel through a (special) black hole — Back to the PAST!

Get your SPECIAL OFFER for MagellanTV here: https://try.magellantv.com/arvinash — It’s an exclusive offer for our viewers! Start your free trial today. MagellanTV is a new kind of streaming service run by filmmakers with 3,000+ documentaries! Check out our personal recommendation and MagellanTV’s exclusive playlists: https://www.magellantv.com/genres/science-and-tech.

Chapters.
0:00 — You are a time traveler.
2:32 — Spacetime & light cone review.
6:15 — Flat Spacetime equations.
7:03 — Schwarzschild radius, metric.
8:42 — Light cone near a black hole.
10:15 — How to escape black hole.
10:39 — Kerr-Newman metric.
11:34 — How to remove the event horizon.
11:50 — What is a naked singularity.
12:20 — How to travel back in time.
13:26 — Problems.

Summary.
Time travel is nothing special. You're time traveling right now, into the future. Relativity shows that stronger gravity and higher speeds can slow time down enough to let you, in principle, travel far into the future. But can you travel back in time, to the past?

In this video I first do a quick review of light cones, world lines, events, and light-like, time-like, and space-like curves, so that you can follow the rest of the video.

A space-like world line means the object would have to travel faster than light. But accelerating anything with mass to the speed of light requires an infinite amount of energy, so this is not possible.
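This classification follows from the sign of the spacetime interval. In flat spacetime, using the (−, +, +, +) sign convention:

```latex
% Minkowski interval in flat spacetime, signature (-,+,+,+):
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
% ds^2 < 0 : time-like  (paths available to massive objects)
% ds^2 = 0 : light-like (paths of light rays)
% ds^2 > 0 : space-like (would require faster-than-light travel)
```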

Going faster than light can create scenarios that allow travel back in time. But since that is not physically possible, we need a clever manipulation of spacetime instead, which means solving Einstein's equations of general relativity.
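For reference, the two solutions named in the chapter list take these standard textbook forms. The Schwarzschild metric describes a non-rotating, uncharged black hole; the Kerr-Newman metric describes a rotating, charged one, whose event horizon disappears when spin and charge are large enough:

```latex
% Schwarzschild metric (non-rotating, uncharged), Schwarzschild radius r_s:
ds^2 = -\left(1 - \frac{r_s}{r}\right)c^2\,dt^2
       + \left(1 - \frac{r_s}{r}\right)^{-1}dr^2
       + r^2\,d\Omega^2,
\qquad r_s = \frac{2GM}{c^2}

% Kerr-Newman horizons (geometrized units, G = c = 1),
% for mass M, spin parameter a = J/M, charge Q:
r_\pm = M \pm \sqrt{M^2 - a^2 - Q^2}
% If a^2 + Q^2 > M^2, the square root is imaginary: there is no
% event horizon, leaving a naked singularity.
```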

Is God in Physics? Fine Tuning Scrutinized

Sign up for your FREE TRIAL to The GREAT COURSES PLUS here: http://ow.ly/5KMw30qK17T. Until 350 years ago, there was a distinction between what people saw on Earth and what they saw in the sky; there did not seem to be any connection.

Then, in 1687, Isaac Newton showed that planets move due to the same forces we experience here on Earth. If things could be explained with mathematics, then to many people this called into question the need for a God.

But in the late 20th century, arguments for God were resurrected. The standard model of particle physics and general relativity are accurate. But there are constants in these equations that have no explanation; they have to be measured. Many of them seem to be very finely tuned.

Scientists point out, for example, that the mass of a neutrino is about 2×10⁻³⁷ kg. It has been argued that if this mass were off by even one decimal place, life would not exist: if the mass were too high, the additional gravity would cause the universe to collapse, and if it were too low, galaxies could not form because the universe would have expanded too fast.

On closer examination, this argument has some problems. It exaggerates the fine tuning by using misleading units of measurement, making the tuning seem much more unlikely than it may be. The mass of the neutrino is expressed in kilograms, and using kilograms to measure something this small is like measuring a person's height in light years. A more appropriate unit for the neutrino would be electron volts or picograms.
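The conversion is easy to check: expressing the quoted mass as a rest energy via E = mc² gives a value of order a tenth of an electron volt, a far more natural scale:

```latex
E = mc^2 = \left(2\times10^{-37}\,\text{kg}\right)
           \left(2.998\times10^{8}\,\text{m/s}\right)^2
  \approx 1.8\times10^{-20}\,\text{J}
  \approx 0.11\,\text{eV}
\quad\text{(using } 1\,\text{eV} = 1.602\times10^{-19}\,\text{J)}
```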

Another point is that most of the constants could not really take any arbitrary value; they are constrained to hover near the values they actually have. The mass of a neutrino could not be the mass of a bowling ball, because such massive particles with the properties of a neutrino could not have been created during the Big Bang.

China unveils detailed goals for 5G-aided Industrial Internet of Things development

China’s Ministry of Industry and Information Technology (MIIT) on Saturday released its second batch of extended goals for promoting the usage of China’s 5G network and the Industrial Internet of Things (IIoT).

IIoT refers to the interconnection between sensors, instruments and other devices to enhance manufacturing efficiency and industrial processes. With a strong focus on machine-to-machine communication, big data and machine learning, the IIoT has been applied across many industrial sectors and applications.

The MIIT announced that 5G-enabled IIoT will be applied in the petrochemical, building materials, port, textile and home appliance sectors, as the 2021 China 5G + Industrial Internet Conference kicked off Saturday in Wuhan, central China's Hubei Province.

‘Deepfaking the mind’ could improve brain-computer interfaces for people with disabilities

Researchers at the USC Viterbi School of Engineering are using generative adversarial networks (GANs)—technology best known for creating deepfake videos and photorealistic human faces—to improve brain-computer interfaces for people with disabilities.

In a paper published in Nature Biomedical Engineering, the team successfully taught an AI to generate synthetic brain activity data. These data, known as spike trains, can be fed into BCI systems to improve their usability.

BCI systems work by analyzing a person's brain signals and translating them into commands, allowing the user to control devices such as computer cursors using only their thoughts. These devices can improve quality of life for people with motor dysfunction or paralysis, even those struggling with locked-in syndrome, when a person is fully conscious but unable to move or communicate.
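As a rough illustration of the underlying idea (and only that: this is not the USC team's architecture, and all layer sizes and names here are invented), a GAN pairs a generator that produces synthetic spike trains with a discriminator that tries to tell them apart from real recordings:

```python
# Minimal GAN sketch for synthetic spike-train data (PyTorch).
# Illustrative only; not the model from the Nature Biomedical
# Engineering paper. All sizes and names are hypothetical.
import torch
import torch.nn as nn

T = 100          # time bins per spike train
NOISE_DIM = 32   # latent noise dimension

# Generator: maps random noise to per-bin firing probabilities.
G = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, T), nn.Sigmoid(),
)
# Discriminator: scores a spike train as real vs. synthetic.
D = nn.Sequential(
    nn.Linear(T, 128), nn.ReLU(),
    nn.Linear(128, 1),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_spikes):
    """One adversarial update. real_spikes: (batch, T) binned spike counts."""
    batch = real_spikes.size(0)
    fake = G(torch.randn(batch, NOISE_DIM))

    # Discriminator: push real toward 1, synthetic toward 0.
    opt_d.zero_grad()
    loss_d = bce(D(real_spikes), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: try to make the discriminator call fakes real.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()
```

Once trained on real recordings, the generator can produce unlimited synthetic spike trains, the idea being that plentiful synthetic data can supplement the scarce real recordings BCI decoders are usually trained on.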

Why This Lab Is Slicing Human Brains Into Little Pieces

There’s a multibillion-dollar race going on to build the first complete map of the brain, something scientists are calling the “connectome.” It involves slicing the brain into thousands of pieces, and then digitally stitching them back together using a powerful AI algorithm.

Presented by Polestar.

#HelloWorld #Science #BloombergQuicktake.

About Hello World:

Meet the exotic, colorful, and endlessly entertaining characters that make up the technology industry beyond big tech. Follow Bloomberg's Ashlee Vance on a journey around the world to find the inventors, scientists and technologists shaping our future: https://youtube.com/playlist?list=PLqq4LnWs3olU-bP2R9uD8YXbt02JjocOk.


Researchers Find Human Learning Can be Duplicated in Synthetic Matter

Rutgers researchers and their collaborators have found that learning — a universal feature of intelligence in living beings — can be mimicked in synthetic matter, a discovery that in turn could inspire new algorithms for artificial intelligence (AI).

The study appears in the journal PNAS.

One of the fundamental characteristics of humans is the ability to continuously learn from and adapt to changing environments. But until recently, AI has been narrowly focused on emulating human logic. Now, researchers are looking to mimic human cognition in devices that can learn, remember and make decisions the way a human brain does.

Understanding Bias in AI: What Is Your Role, and Should You Care?

There are billions of people around the world whose online experience has been shaped by algorithms that rely on artificial intelligence (AI) and machine learning (ML). Some form of AI or ML is employed almost every time people go online, whether they are searching for content, watching a video, or shopping for a product. These technologies not only make consumption more efficient and accurate; in the online ecosystem, service providers also innovate upon and monetize behavioral data captured directly from a user's device, from website visits, or by third parties.

Advertisers are increasingly dependent on this data and the algorithms that adtech and martech employ to understand where their ads should be placed, which ads consumers are likely to engage with, which audiences are most likely to convert, and which publisher should get credit for conversions.

Additionally, the collection and better utilization of data helps publishers generate revenue, minimize data risks and costs, and provide relevant consumer-preference-based audiences for brands.
