Blog

Feb 21, 2023

The fungus in the HBO series The Last of Us turns humans into zombies. Should you be afraid?

Posted in categories: biological, neuroscience

The fungal pathogen that wipes out much of humanity in HBO’s latest series The Last of Us is real, but can the cordyceps fungus actually turn humans into zombies one day?

“It’s highly unlikely because these are organisms that have become really well adapted to infecting ants,” Rebecca Shapiro, assistant professor at University of Guelph’s department of molecular and cellular biology, told Craig Norris, host of CBC Kitchener-Waterloo’s The Morning Edition.

In the television series, the fungus infects the brain of humans and turns them into zombies. In real life, it can only infect ants and other insects in this manner.

Feb 21, 2023

Path To AGI, AI Alignment, Digital Minds | Nick Bostrom and Juan Benet | Breakthroughs in Computing

Posted in categories: neuroscience, robotics/AI

Protocol Labs founder Juan Benet speaks with Nick Bostrom, a Swedish-born philosopher with a background in theoretical physics, computational neuroscience, l…

Feb 21, 2023

Surprise! Colliding neutron stars create perfectly spherical ‘kilonova’ explosions

Posted in category: space

Kilonova explosions created when neutron stars collide and merge are perfectly spherical, not flattened discs as previously expected, a new study suggests.

Feb 21, 2023

James Webb Space Telescope spies baby stars dancing in swirling gas and dust (photos)

Posted in categories: physics, space

The James Webb Space Telescope (JWST) is still doing its job — and doing it very well. Released today, this image shows the arms of barred spiral galaxy NGC 1433 teeming with young stars that can be seen affecting the clouds of gas and dust around them. The image was taken as part of the Physics at High Angular resolution in Nearby Galaxies (PHANGS) collaboration, of which more than 100 researchers around the world are a part.

One of the James Webb Space Telescope’s first science programs is to image 19 spiral galaxies for PHANGS with its Mid-Infrared Instrument (MIRI), which is capable of seeing through gas and dust clouds that are impenetrable to other types of imaging.

Feb 21, 2023

When dinosaurs roamed Antarctica

Posted in category: futurism

Antarctica — icy, empty, desolate, cold — these are words you may use to describe it, but it hasn’t always been that way.

There was once a time when the great southern landmass was covered in forests and dinosaurs roamed free. How could such an icy wilderness once have been so warm that it could support Earth’s most gigantic creatures?

Feb 21, 2023

Telomeres Found To Encode Two Proteins, Potentially Transforming Cancer Research

Posted in categories: biotech/medical, genetics, life extension

Telomeres – the protective caps at the tips of chromosomes – can encode two proteins, something that was previously thought impossible, new research has suggested. The discovery of genetic information coding for these proteins, one of which is elevated in some human cancers, could have huge ramifications for the fields of health, medicine, and cell biology.

“Discovering that telomeres encode two novel signaling proteins will change our understanding of cancer, aging, and how cells communicate with other cells,” study author Jack Griffith, the Kenan Distinguished Professor of Microbiology and Immunology at the University of North Carolina at Chapel Hill, said in a statement.

“Based on our research, we think simple blood tests for these proteins could provide a valuable screen for certain cancers and other human diseases,” Griffith, who is also a member of the UNC Lineberger Comprehensive Cancer Center, added. “These tests also could provide a measure of ‘telomere health,’ because we know telomeres shorten with age.”

Feb 21, 2023

Best Prompt Engineering Tips for Beginners in 2023

Posted in category: robotics/AI

What is Prompt Engineering?

Prompt engineering is a concept in artificial intelligence, particularly in natural language processing (NLP). In prompt engineering, the description of the task is embedded explicitly in the input, for example as a question, instead of being given implicitly. Typically, prompt engineering involves converting one or more tasks into a prompt-based dataset and training a language model with “prompt-based learning”, also known as “prompt learning”. Prompt engineering may also refer to “prefix-tuning” or “prompt tuning”, methods in which a large, “frozen” pretrained language model is used and only the representation of the prompt is learned.

The development of the GPT-2 and GPT-3 language models, and later the ChatGPT tool, was crucial for prompt engineering. In 2021, multitask prompt engineering across several NLP datasets showed strong performance on novel tasks. Few-shot examples whose answers spell out a chain of thought give a better indication of a language model’s reasoning. Prepending text that invites a chain of reasoning, such as “Let’s think step by step,” to a zero-shot prompt may improve a language model’s performance on multi-step reasoning tasks. The release of various open-source notebooks and community-led image-synthesis efforts helped make these tools widely accessible.
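As a rough illustration of the prompting styles described above, here is a minimal Python sketch. The `query_model` function is a hypothetical placeholder for whatever model or API you actually use; it is not part of any real library, and the example question is invented.

```python
# Minimal sketch of zero-shot, zero-shot chain-of-thought, and few-shot prompting.
# `query_model` is a hypothetical stand-in for a real LM endpoint or local model.

def query_model(prompt: str) -> str:
    """Placeholder: send `prompt` to a language model and return its completion."""
    raise NotImplementedError("Wire this up to your own model or API.")

def zero_shot_prompt(question: str) -> str:
    # The task description is given explicitly in the input, with no examples.
    return f"Answer the question.\n\nQ: {question}\nA:"

def zero_shot_cot_prompt(question: str) -> str:
    # Same task, but with a cue that elicits step-by-step reasoning.
    return f"Q: {question}\nA: Let's think step by step."

def few_shot_cot_prompt(question: str, examples: list[tuple[str, str]]) -> str:
    # Few-shot: worked examples whose answers spell out a chain of thought.
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return f"{shots}\n\nQ: {question}\nA:"

if __name__ == "__main__":
    q = "A bakery sells muffins in boxes of 6. How many boxes are needed for 45 muffins?"
    # Inspect the prompt text itself; pass it to query_model() once that is wired up.
    print(zero_shot_cot_prompt(q))
```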

Feb 21, 2023

Meet LAMPP: A New AI Approach From MIT To Integrate Background Knowledge From Language Into Decision-Making Problems

Posted in categories: information science, robotics/AI

Common-sense priors are essential for making decisions under uncertainty in real-world settings. Consider the task of assigning labels to the scene in Fig. 1 of the paper. Once a few key elements are recognized, it becomes evident that the image shows a restroom, which helps resolve the labels of some of the harder objects: the curtain in the scene is a shower curtain rather than a window curtain, and the object on the wall is a mirror rather than a portrait. Beyond visual tasks, prior knowledge of which objects or events tend to co-occur is crucial for navigating new environments and understanding the actions of other agents. Such expectations are also essential for object categorization and reading comprehension.

Unlike robot demonstrations or segmented images, vast text corpora are easily accessible and cover practically all aspects of the human experience. Current machine learning models use task-specific datasets to learn the prior distribution of labels and decisions for most problem domains. When training data is skewed or sparse, this can lead to systematic mistakes, particularly on uncommon or out-of-distribution inputs. How might models be given broader, more adaptable prior knowledge? The researchers suggest using language models, which are learned distributions over natural-language strings, as task-general probabilistic priors.

LMs have been employed as sources of prior knowledge for tasks ranging from common-sense question answering to modeling scripts and stories to synthesizing probabilistic programs, in language processing and other text-generation settings. For encoding much of this information, such as the fact that plates are found in kitchens and dining rooms and that breaking eggs comes before whisking them, they frequently offer greater diversity and fidelity than small, task-specific datasets. It has also been proposed that such language supervision contributes to common-sense human knowledge in areas that are challenging to learn from first-hand experience.
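To make the idea concrete, here is a toy Python sketch of how a language-model prior could be combined with a perception model’s scores to choose a label. The numbers and helper functions are invented for illustration and are not the paper’s actual implementation; they only show the general likelihood-times-prior pattern.

```python
import math

# Toy example: a vision model is nearly ambivalent between two labels, but an
# LM-derived prior conditioned on the scene ("bathroom") breaks the tie.
# All probabilities below are made up for illustration.

def perception_log_likelihood(label: str) -> float:
    # Pretend the vision model finds both labels about equally plausible.
    return {"shower curtain": math.log(0.48), "window curtain": math.log(0.52)}[label]

def lm_log_prior(label: str, context: str) -> float:
    # Pretend a language model assigns these probabilities to seeing each
    # object in the given context (e.g. by scoring a sentence like
    # "In a bathroom, there is usually a <label>.").
    priors = {("shower curtain", "bathroom"): 0.9, ("window curtain", "bathroom"): 0.1}
    return math.log(priors[(label, context)])

def posterior_label(candidates: list[str], context: str) -> str:
    # Bayes' rule in log space: log posterior ∝ log likelihood + log prior.
    return max(candidates, key=lambda l: perception_log_likelihood(l) + lm_log_prior(l, context))

print(posterior_label(["shower curtain", "window curtain"], "bathroom"))
# -> "shower curtain": the language prior overrides the slightly higher visual score.
```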

Feb 21, 2023

Why is the sky blue?

Posted in category: futurism

What’s the scientific reason behind Earth’s sky appearing blue to the human eye?

Feb 21, 2023

Researchers at CERN break “The Speed of Light”

Posted in category: particle physics

Scientists said they recorded particles travelling faster than light – a finding that could overturn one of Einstein’s fundamental laws of the universe. Antonio Ereditato, spokesman for the international group of researchers, said that measurements taken over three years showed neutrinos pumped from CERN near Geneva to Gran Sasso in Italy had arrived 60 nanoseconds quicker than light would have done.
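For a sense of scale, here is a back-of-the-envelope Python calculation of what a 60-nanosecond lead would imply. The roughly 730 km CERN–Gran Sasso baseline is not stated in the post above and is assumed here from the experiment’s published setup.

```python
# Rough check: how large is a 60 ns early arrival relative to the light travel time?
# Assumes a ~730 km CERN-Gran Sasso baseline (not given in the post above).

c = 299_792_458.0      # speed of light, m/s
baseline_m = 730e3     # approximate CERN-Gran Sasso distance, m
early_s = 60e-9        # reported early arrival, 60 ns

light_time_s = baseline_m / c
fractional_excess = early_s / light_time_s

print(f"light travel time over the baseline: {light_time_s * 1e3:.2f} ms")
print(f"implied fractional speed excess: {fractional_excess:.1e}")  # roughly 2.5e-5
```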