Archive for the ‘innovation’ category: Page 76

Apr 8, 2022

A freeze-thaw molten salt battery for seasonal storage

Posted in categories: innovation, robotics/AI

It relies on a new “freeze-thaw” design. U.S. scientists have just published a study describing an aluminum-nickel molten salt battery that retains over 90% of its initial capacity for up to 12 weeks. With an energy density of 260 watt-hours per kilogram (Wh/kg), the battery pairs an aluminum anode with a nickel cathode, both immersed in a molten-salt electrolyte.
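
To put the 260 Wh/kg figure in perspective, here is a quick back-of-envelope calculation in Python; the household storage target is an assumption chosen purely for illustration, not a number from the study.

```python
# Sanity check on the reported 260 Wh/kg energy density: roughly how much
# battery mass would a modest seasonal store require? The storage target
# below is hypothetical.
energy_density_wh_per_kg = 260        # figure reported for the freeze-thaw cell
storage_target_kwh = 500              # assumed seasonal buffer for one household
mass_kg = storage_target_kwh * 1000 / energy_density_wh_per_kg
print(f"~{mass_kg:.0f} kg of cells")  # ~1923 kg, about two metric tons
```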



Apr 8, 2022

Amazon and Johns Hopkins announce new AI institute

Posted in categories: innovation, robotics/AI


Amazon and Johns Hopkins University (JHU) today announced the creation of the JHU + Amazon Initiative for Interactive AI (AI2AI).

The Amazon-JHU collaboration will focus on driving ground-breaking AI advances with an emphasis on machine learning, computer vision, natural language understanding, and speech processing. Sanjeev Khudanpur, an associate professor in the Department of Electrical and Computer Engineering, will serve as the founding director of the initiative.

Apr 7, 2022

Research places new limits on the bizarre behavior of neutrinos

Posted in categories: innovation, particle physics

In a laboratory under a mountain, physicists are using crystals far colder than frozen air to study ghostly particles, hoping to learn secrets from the beginning of the universe. Researchers at the Cryogenic Underground Observatory for Rare Events (CUORE) announced this week that they had placed some of the most stringent limits yet on the strange possibility that the neutrino is its own antiparticle. Neutrinos are deeply unusual particles, so ethereal and so ubiquitous that they regularly pass through our bodies without us noticing.

CUORE has spent the last three years patiently waiting to see evidence of a distinctive nuclear decay process, only possible if neutrinos and antineutrinos are the same particle. CUORE’s new data shows that this decay doesn’t happen for trillions of trillions of years, if it happens at all. CUORE’s limits on the behavior of these tiny phantoms are a crucial part of the search for the next breakthrough in particle and nuclear physics—and the search for our own origins.
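
To make “trillions of trillions of years” concrete, a back-of-envelope sketch: with enough atoms, even a decay whose half-life dwarfs the age of the universe yields a countable event rate, since expected decays go as N·ln(2)·t/T½ when T½ is much longer than the observation time t. The isotope mass and half-life below are illustrative assumptions, not CUORE’s published numbers.

```python
# Why a half-life of order 1e25 years is testable at all.
import math

N_A = 6.022e23            # Avogadro's number, atoms per mole
molar_mass_g = 129.9      # tellurium-130, the candidate isotope in CUORE's crystals
isotope_mass_g = 100e3    # assume 100 kg of Te-130 in the detector (hypothetical)
half_life_yr = 1e25       # illustrative half-life scale (hypothetical)

atoms = isotope_mass_g / molar_mass_g * N_A
decays_per_year = atoms * math.log(2) / half_life_yr
print(f"{atoms:.2e} atoms -> {decays_per_year:.0f} expected decays per year")
# ~32 decays/year: rare, but observable over a multi-year run if backgrounds are low.
```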

“Ultimately, we are trying to understand matter creation,” said Carlo Bucci, a researcher at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy and the spokesperson for CUORE. “We’re looking for a process that violates a fundamental symmetry of nature,” added Roger Huang, a postdoctoral researcher at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and one of the lead authors of the new study.

CUORE—Italian for “heart”—is among the most sensitive neutrino experiments in the world. Its new results are based on a data set, collected over the last three years, ten times larger than that of any other high-resolution search. CUORE is operated by an international research collaboration led by the Istituto Nazionale di Fisica Nucleare (INFN) in Italy and Berkeley Lab in the US. The detector itself sits under nearly a mile of solid rock at LNGS, a facility of the INFN. Nuclear physicists supported by the U.S. Department of Energy play a leading scientific and technical role in the experiment. CUORE’s new results were published today in Nature.

Apr 7, 2022

Google announces new tool for data storage and integration

Posted in categories: computing, innovation

Over the past decade, several metaphors and labels have evolved to describe the software that curates data storage. Some systems were called data warehouses; they generally offered stronger structure and compliance, but were often unable to manage the larger volumes of information from modern web applications. Another term, the “data lake,” referred to less structured collections engineered to scale easily, in part because they enforced fewer rules. Google wants BigLake to combine the control of the best data warehouses with the seemingly endless capacity of cloud storage.

“All of these organizations who try to innovate on top of the data lake found it to be, at the end of the day, just a data swamp,” said Kazmaier. “Our innovation at Google Cloud is that we take BigQuery and its unique architecture, its unique serverless model, its unique storage architecture and a unique compute architecture and [integrate it] with open-source file formats and open-source processing engines.”

The open-source architecture is intended to let customers adopt Google’s tools gradually, integrating them with existing data infrastructure. The open formats also simplify sharing data across tools, making the platform easier to adopt. A minimal sketch of the pattern appears below.
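
The snippet queries open-format files in cloud storage through BigQuery’s standard SQL surface. It assumes a BigLake or external table over Parquet files has already been defined; the project, dataset, and table names are hypothetical.

```python
# Querying Parquet files in object storage as if they were a native table.
# Assumes `my-project.analytics.events_parquet` is an existing external/BigLake
# table over gs://... Parquet files (hypothetical names throughout).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events_parquet`
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""
for row in client.query(sql).result():
    print(row.user_id, row.events)
```

The point of the design is that the same SQL works whether the bytes live in BigQuery’s native storage or in open formats on object storage.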

Apr 6, 2022

Breakthrough Discovery of New Model for “Global” DNA Repair

Posted in categories: biotech/medical, innovation

Breakthrough techniques in living cells upend field.

Two studies provide a radically new picture of how bacterial cells continually repair damaged sections (lesions) in their DNA.

Led by researchers from NYU Grossman School of Medicine, the work revolves around the delicacy of DNA molecules, which are vulnerable to damage by reactive byproducts of cellular metabolism, toxins, and ultraviolet light. Because damaged DNA can result in detrimental changes to the DNA code (mutations) and in cell death, cells evolved DNA repair machineries. A major unresolved question in the field, however, is how these machineries rapidly search for and find rare stretches of damage amid the “vast fields” of undamaged DNA.

Apr 5, 2022

Pathways Language Model (PaLM): Scaling to 540 Billion Parameters for Breakthrough Performance

Posted in categories: innovation, robotics/AI

In recent years, large neural networks trained for language understanding and generation have achieved impressive results across a wide range of tasks. GPT-3 first showed that large language models (LLMs) can be used for few-shot learning and can achieve impressive results without large-scale task-specific data collection or model parameter updating. More recent LLMs, such as GLaM, LaMDA, Gopher, and Megatron-Turing NLG, achieved state-of-the-art few-shot results on many tasks by scaling model size, using sparsely activated modules, and training on larger datasets from more diverse sources. Yet much work remains in understanding the capabilities that emerge with few-shot learning as we push the limits of model scale.
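
For readers unfamiliar with the term, “few-shot learning” here means conditioning the model on a handful of worked examples in the prompt, with no parameter updates. A minimal, model-agnostic sketch of the prompt format (the task and examples are invented):

```python
# Build a few-shot prompt: the model is asked to continue the pattern,
# with no gradient updates or task-specific fine-tuning involved.
examples = [
    ("The movie was a delight.", "positive"),
    ("I want my two hours back.", "negative"),
]
query = "The plot dragged, but the ending redeemed it."

prompt = "\n\n".join(f"Review: {text}\nSentiment: {label}" for text, label in examples)
prompt += f"\n\nReview: {query}\nSentiment:"
print(prompt)
```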

Last year Google Research announced our vision for Pathways, a single model that could generalize across domains and tasks while being highly efficient. An important milestone toward realizing this vision was to develop the new Pathways system to orchestrate distributed computation for accelerators. In “PaLM: Scaling Language Modeling with Pathways”, we introduce the Pathways Language Model (PaLM), a 540-billion-parameter, dense decoder-only Transformer model trained with the Pathways system, which enabled us to efficiently train a single model across multiple TPU v4 Pods. We evaluated PaLM on hundreds of language understanding and generation tasks, and found that it achieves state-of-the-art few-shot performance across most tasks, by significant margins in many cases.

Apr 3, 2022

Forget ray tracing — Nvidia calls path tracing one of the “largest breakthroughs for real-time graphics in many years”

Posted in category: innovation

That’s quite the claim.

Apr 1, 2022

Introducing the Museum of The Future — the chance to live in 2071!

Posted in categories: habitats, innovation

Another jewel in Dubai’s crown.

Dubai’s penchant for housing some of the world’s most magnificent creations is no secret. Beautiful buildings and jaw-dropping structures with breathtaking designs have helped the emirate build a solid identity as one of the top tourist destinations in the world. On the palindromic date of 22 February 2022, Dubai added yet another feather to its cap, unveiling the Museum of The Future — a standing tribute to science and technology that offers visitors an immersive experience of living in the future. It houses some of the world’s most futuristic technologies, ideas, and innovative products.

The spectacular structure of the Museum of The Future is perhaps one of the most complex designs ever willed into solid reality in the history of architecture. So much so that His Highness Sheikh Mohammed bin Rashid Al Maktoum, the ruler of Dubai, has already touted it as ‘the most beautiful building in the world’ in tribute to its marvelous design. The museum’s elliptical form has invited different symbolic interpretations: some say the ellipse represents humanity and the void the unknown future, while others have compared the structure to a human eye looking toward the future.

Mar 27, 2022

8 NASA concepts that could unlock new secrets about the universe

Posted in categories: innovation, space travel

Each year, the NASA Innovative Advanced Concepts (NIAC) program awards grants to researchers to develop the next generation of technology that will help us explore cosmic unknowns.


From the oceans of Europa to the atmosphere of Venus, these inventions funded by NASA could propel space exploration even further.

Mar 22, 2022

Microsoft Translator enhanced with Z-code Mixture of Experts models

Posted in categories: innovation, robotics/AI

Translator, a Microsoft Azure Cognitive Service, is adopting Z-code Mixture of Experts models, a breakthrough AI technology that significantly improves the quality of production translation models. As a component of Microsoft’s larger XYZ-code initiative to combine AI models for text, vision, audio, and language, Z-code supports the creation of AI systems that can speak, see, hear, and understand. This effort is a part of Azure AI and Project Turing, focusing on building multilingual, large-scale language models that support various production teams. Translator is using NVIDIA GPUs and Triton Inference Server to deploy and scale these models efficiently for high-performance inference. Translator is the first machine translation provider to introduce this technology live for customers.

Z-code MoE boosts efficiency and quality

Z-code models use a new architecture called Mixture of Experts (MoE), in which different parts of the model learn different tasks. The models learn to translate between multiple languages at the same time. The Z-code MoE model has a larger total parameter count but dynamically selects which parameters to use for a given input, which lets subsets of the parameters (the “experts”) specialize during training. At runtime, the model activates only the experts relevant to the input, which is far more computationally efficient than using all of the model’s parameters.
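
As an illustration of the routing idea (not Z-code’s actual production architecture, which this post does not spell out), here is a minimal top-k gated MoE layer in PyTorch; all sizes and names are hypothetical.

```python
# Minimal top-k Mixture-of-Experts layer: a learned gate scores the experts
# per token, and each token is processed by only its top-k experts.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)  # router: one score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                              # x: (tokens, d_model)
        weights, idx = F.softmax(self.gate(x), dim=-1).topk(self.k, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):                     # for each chosen expert slot
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e               # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

x = torch.randn(16, 512)                               # 16 tokens
print(TopKMoE()(x).shape)                              # torch.Size([16, 512])
```

Only k of the n_experts feed-forward blocks run for any given token, which is the source of the efficiency described above.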

Page 76 of 196