Archive for the ‘information science’ category: Page 50

Apr 27, 2023

Metamaterial Provides Underwater Stealth

Posted by in categories: genetics, information science

A lightweight structure made of rubber and metal layers can provide an object with underwater acoustic stealth over a broad frequency range.

An acoustic “cloak” could hide an underwater object from detection by sonar devices or by echolocating marine animals. Much like camouflage clothing allows figures to fade into a background, acoustic camouflage can make an object indistinguishable from the surrounding water. Underwater acoustic cloaks have previously been demonstrated, but they typically work over a narrow range of frequencies or are too bulky to be practical. Now Hao-Wen Dong at the Beijing Institute of Technology and colleagues demonstrate a lightweight, broadband cloak made of a thin shell of layered material. The cloak achieves acoustic stealth by both blocking the reflection of sonar pings off the surface and preventing the escape of sound generated from within the cloaked object [1].

Dong and colleagues designed a 4-cm-thick structure—combining an outer rubber layer and a “metamaterial” made of porous aluminum—which covered a steel plate. Using a genetic algorithm, they optimized the metamaterial’s elastic properties to tailor the interaction with underwater sound waves. Specifically, the metamaterial converts impinging longitudinal sound waves, which can travel long distances underwater, to transverse sound waves, which cannot propagate through water. These transverse waves get trapped in the rubber layer, where they get absorbed, eliminating reflected and transmitted waves simultaneously. The researchers built and tested a prototype cloak, confirming that it behaved as predicted. In particular, it absorbed 80% of the energy of incoming sound waves while offering 100-fold attenuation of acoustic noise produced on the side of the steel plate.
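
As a rough illustration of the genetic-algorithm step mentioned above, here is a minimal sketch in which a made-up scalar fitness stands in for the acoustic simulation; the parameterization, selection scheme, and fitness function are illustrative assumptions, not the authors' actual design loop.

```python
# Minimal genetic-algorithm sketch. A real implementation would replace the
# placeholder fitness with a full acoustic/elastic wave calculation and the
# parameter vector with the metamaterial's geometric and elastic properties.
import numpy as np

rng = np.random.default_rng(0)

def fitness(params):
    # Placeholder objective: reward low "reflection" and low "transmission".
    # In the real problem these would come from an acoustic solver evaluated
    # over the target frequency band.
    reflection = np.abs(np.sin(params)).sum()
    transmission = np.abs(np.cos(params)).prod()
    return -(reflection + transmission)

def evolve(pop_size=50, n_params=4, n_gen=100, mutation=0.1):
    # Each individual is a vector of (hypothetical) normalized design parameters.
    pop = rng.uniform(0.0, np.pi, size=(pop_size, n_params))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        # Truncation selection: keep the best half as parents.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        # Crossover: average random pairs of parents, then add Gaussian mutation.
        pairs = rng.integers(0, len(parents), size=(pop_size, 2))
        children = parents[pairs].mean(axis=1)
        children += rng.normal(0.0, mutation, size=children.shape)
        pop = children
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return best

print(evolve())
```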

Apr 27, 2023

A new quantum approach to solve electronic structures of complex materials

Posted by in categories: chemistry, computing, engineering, information science, quantum physics

If you know the atoms that compose a particular molecule or solid material, the interactions between those atoms can be determined computationally, by solving quantum mechanical equations—at least, if the molecule is small and simple. However, solving these equations, critical for fields from materials engineering to drug design, requires a prohibitively long computational time for complex molecules and materials.

Now, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Chicago’s Pritzker School of Molecular Engineering (PME) and Department of Chemistry have explored the possibility of solving these electronic structures using a quantum computer.

The research, which uses a combination of new computational approaches, was published online in the Journal of Chemical Theory and Computation. It was supported by Q-NEXT, a DOE National Quantum Information Science Research Center led by Argonne, and by the Midwest Integrated Center for Computational Materials (MICCoM).

Apr 24, 2023

Auto-GPT May Be The Strong AI Tool That Surpasses ChatGPT

Posted by in categories: information science, robotics/AI, transportation

Like many people, you may have had your mind blown recently by the possibilities of ChatGPT and other large language models, such as the new Bing or Google’s Bard.

For anyone who somehow hasn’t come across them (which is unlikely, as ChatGPT is reportedly the fastest-growing app of all time), here’s a quick recap:

Apr 23, 2023

On theoretical justification of the forward–backward algorithm for the variational learning of Bayesian hidden Markov models

Posted by in categories: computing, information science

The hidden Markov model (HMM) [1, 2] is a powerful model for sequential data and has been widely used in speech signal processing [3-5], computer vision [6-8], longitudinal data analysis [9], social networks [10-12] and so on. An HMM typically assumes that the system has K internal states and that the transitions between states form a Markov chain. The system state cannot be observed directly, so the hidden states and system parameters must be inferred from the observations. Because of these latent variables, the Expectation-Maximisation (EM) algorithm [13, 14] is often used to learn an HMM. The main difficulty is calculating the site marginal distributions and pairwise marginal distributions under the posterior distribution of the latent variables; the forward-backward algorithm was designed specifically to tackle this problem. Its derivation relies heavily on the HMM assumptions and on the probabilistic relationships between quantities, and therefore requires the parameters appearing in the posterior distribution to have explicit probabilistic meanings.
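
For concreteness, here is a minimal sketch of the scaled forward-backward recursions referred to above; the notation (pi, A, B) and the scaling scheme are standard textbook conventions rather than the paper's.

```python
# Scaled forward-backward recursions for a K-state HMM, computing the site
# marginals p(z_t | x_{1:T}) and pairwise marginals p(z_{t-1}, z_t | x_{1:T}).
import numpy as np

def forward_backward(pi, A, B):
    """pi: (K,) initial state distribution; A: (K, K) transition matrix;
    B: (T, K) observation likelihoods with B[t, k] = p(x_t | z_t = k)."""
    T, K = B.shape
    alpha = np.zeros((T, K))   # scaled forward messages p(z_t | x_{1:t})
    beta = np.zeros((T, K))    # scaled backward messages
    c = np.zeros(T)            # scaling factors p(x_t | x_{1:t-1})

    alpha[0] = pi * B[0]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[t + 1] * beta[t + 1]) / c[t + 1]

    gamma = alpha * beta                 # site marginals, shape (T, K)
    xi = np.zeros((T - 1, K, K))         # pairwise marginals
    for t in range(T - 1):
        xi[t] = alpha[t][:, None] * A * (B[t + 1] * beta[t + 1])[None, :] / c[t + 1]
    return gamma, xi
```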

A Bayesian HMM [15-22] further imposes priors on the parameters of the HMM, and the resulting model is more robust; Bayesian HMMs have been demonstrated to outperform plain HMMs in many applications. However, learning a Bayesian HMM is more challenging because the posterior distribution of the latent variables is intractable. Mean-field variational inference is therefore often used in the E-step of the EM algorithm, seeking the best approximation to the posterior within a factorised family. The variational-inference iteration likewise involves computing site marginal distributions and pairwise marginal distributions given the joint distribution of the state indicator variables. Existing works [15-23] directly apply the forward-backward algorithm to obtain these quantities without justification. This is not theoretically sound, and the result is not guaranteed to be correct, because the requirements of the forward-backward algorithm are not met in this case.
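
The following hedged sketch illustrates the reuse in question: in a typical variational E-step, the quantities fed into the recursions are exponentiated expected log-probabilities under Dirichlet variational posteriors, so they need not be normalized. The names (u_pi, u_A, B_tilde) and the Dirichlet setup are illustrative assumptions, not the paper's notation; forward_backward is the function from the sketch above.

```python
# Variational E-step sketch: the "parameters" passed to forward_backward are
# sub-normalized, exactly the situation whose justification the paper supplies.
import numpy as np
from scipy.special import digamma

def expected_log_dirichlet(u):
    """E_q[log theta] for theta ~ Dirichlet(u), applied row-wise."""
    u = np.atleast_2d(np.asarray(u, dtype=float))
    return digamma(u) - digamma(u.sum(axis=1, keepdims=True))

def variational_e_step(u_pi, u_A, B_tilde):
    """u_pi: (K,) and u_A: (K, K) Dirichlet parameters from the variational
    M-step; B_tilde: (T, K) exponentiated expected log-likelihoods."""
    pi_tilde = np.exp(expected_log_dirichlet(u_pi))[0]   # sub-normalized "pi"
    A_tilde = np.exp(expected_log_dirichlet(u_A))        # sub-normalized "A"
    # Same recursions as before, now with parameters that carry no direct
    # probabilistic meaning.
    return forward_backward(pi_tilde, A_tilde, B_tilde)
```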

In this paper, we prove that the forward-backward algorithm can be applied in more general cases where the parameters have no probabilistic meanings. The first proof converts the general case to an HMM and uses the correctness of the forward-backward algorithm on HMMs to establish the claim. The second proof is model-free: it derives the forward-backward algorithm in an entirely different way, without relying on HMM assumptions, using only matrix techniques to rewrite the desired quantities. This derivation therefore shows that no probabilistic requirements need to be imposed on the parameters of the forward-backward algorithm. In particular, it justifies that heuristically applying the forward-backward algorithm in the variational learning of Bayesian HMMs is theoretically sound and guaranteed to return the correct result.

Apr 23, 2023

Quantum circuit learning as a potential algorithm to predict experimental chemical properties

Posted by in categories: chemistry, information science, quantum physics

We introduce quantum circuit learning (QCL) as an emerging regression algorithm for chemo- and materials-informatics. The supervised model, which operates according to the rules of quantum mechanics, can fit linear and smooth non-linear functions from small datasets (100 records). Compared with conventional algorithms such as random forest, support vector machine, and linear regression, QCL can offer better predictions on some one-dimensional functions and experimental chemical databases. QCL could therefore make the virtual exploration of new molecules and materials more efficient through its superior prediction performance.
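
As an illustration of the kind of model the abstract describes, below is a minimal quantum-circuit-learning-style regression sketch written with the PennyLane simulator; the circuit layout, toy dataset, and hyperparameters are illustrative choices, not the settings used in the paper.

```python
# Toy QCL-style regression: angle-encode a scalar input, apply trainable
# entangling layers, and fit the Pauli-Z expectation to a smooth 1D target.
import pennylane as qml
from pennylane import numpy as np  # autograd-aware NumPy

n_qubits, n_layers = 2, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    # Angle-encode the scalar input on every qubit.
    for w in range(n_qubits):
        qml.RY(x, wires=w)
    # Trainable entangling layers act as the regression model.
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

def cost(weights, X, y):
    loss = 0.0
    for x, target in zip(X, y):
        loss = loss + (circuit(weights, x) - target) ** 2
    return loss / len(X)

# Toy one-dimensional dataset of 100 records, the small-data regime noted above.
X = np.linspace(-1.0, 1.0, 100)
y = np.sin(np.pi * X)

weights = np.array(np.random.uniform(0, 2 * np.pi, (n_layers, n_qubits)),
                   requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(60):
    weights = opt.step(lambda w: cost(w, X, y), weights)
```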

Apr 22, 2023

The Multiverse: Our Universe Is Suspiciously Unlikely to Exist—Unless It Is One of Many

Posted by in categories: alien life, information science, particle physics

But we expect that it’s in that first tiny fraction of a second that the key features of our universe were imprinted.

The conditions of the universe can be described through its “fundamental constants”, fixed quantities in nature such as the gravitational constant (called G) or the speed of light (called c). There are about 30 of these, representing the sizes and strengths of parameters such as particle masses, the forces, and the universe’s expansion. But our theories don’t explain what values these constants should have. Instead, we have to measure them and plug their values into our equations to accurately describe nature.

Apr 21, 2023

Artificial intelligence has improved the first-ever real photo of a supermassive black hole 6.5 billion times heavier than the Sun

Posted by in categories: cosmology, information science, robotics/AI

In 2017, the Event Horizon Telescope (EHT) collaboration made the observations behind the first-ever real image of a black hole, which was released in 2019. Six years after those observations, artificial intelligence has been used to improve the image.

Here’s What We Know

Researchers in the United States set out to sharpen the image, whose original version shows something resembling a “fuzzy donut”. They applied PRIMO, a machine-learning-based algorithm, to produce an improved version.

Apr 21, 2023

Giant orbital magnetic moment appears in a graphene quantum dot

Posted by in categories: computing, information science, particle physics, quantum physics

A giant orbital magnetic moment exists in graphene quantum dots, according to new work by physicists at the University of California, Santa Cruz in the US. As well as being of fundamental interest for studying systems with relativistic electrons – that is, those travelling at near-light speeds – the work could be important for quantum information science, since these moments could encode information.

Graphene, a sheet of carbon just one atom thick, has a number of unique electronic properties, many of which arise from the fact that it is a semiconductor with a zero-energy gap between its valence and conduction bands. Near where the two bands meet, the relationship between the energy and momentum of charge carriers (electrons and holes) in the material is described by the Dirac equation and resembles that of a photon, which is massless.

These bands, called Dirac cones, enable the charge carriers to travel through graphene at extremely high, “ultra-relativistic” speeds approaching that of light. This extremely high mobility means that graphene-based electronic devices such as transistors could be faster than any that exist today.
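
For reference, the textbook low-energy dispersion near a Dirac point (a standard result, not a quantity reported in this article) is linear in momentum, which is why the carriers behave like massless, ultra-relativistic particles:

```latex
% Low-energy dispersion of graphene near a Dirac point (standard textbook result):
% massless-Dirac-like, with Fermi velocity v_F of order 10^6 m/s (about c/300).
E_{\pm}(\mathbf{k}) \approx \pm \hbar\, v_F \,\lvert \mathbf{k} \rvert,
\qquad v_F \approx 10^{6}\ \mathrm{m\,s^{-1}} \approx \frac{c}{300}.
```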

Apr 20, 2023

Is deep learning a necessary ingredient for artificial intelligence?

Posted by in categories: information science, robotics/AI, transportation

The earliest artificial neural network, the Perceptron, was introduced approximately 65 years ago and consisted of just one layer. To tackle more complex classification tasks, however, more advanced architectures consisting of numerous feedforward (consecutive) layers were later introduced. These deep architectures are the essential component of current deep learning algorithms, which improve performance on analytical and physical tasks without human intervention and lie behind everyday automation products such as emerging self-driving-car technologies and autonomous chatbots.

The key question driving new research published today in Scientific Reports is whether efficient learning of non-trivial classification tasks can be achieved using brain-inspired shallow feedforward networks, while potentially requiring less computational complexity.
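
As a toy illustration of the comparison this question implies (not the experiments in the Scientific Reports paper), here is a minimal sketch contrasting a shallow one-hidden-layer network with a deeper feedforward stack using scikit-learn; the dataset and layer sizes are arbitrary illustrative choices.

```python
# Shallow vs. deep feedforward networks on a simple non-linear classification task.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=2000, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "shallow (1 hidden layer)": MLPClassifier(hidden_layer_sizes=(64,),
                                              max_iter=2000, random_state=0),
    "deep (3 hidden layers)": MLPClassifier(hidden_layer_sizes=(32, 32, 32),
                                            max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```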

Apr 20, 2023

What’s AGI, and Why Are AI Experts Skeptical?

Posted by in categories: information science, robotics/AI

ChatGPT and other bots have revived conversations on artificial general intelligence. Scientists say algorithms won’t surpass you any time soon.
