Archive for the ‘information science’ category: Page 17

Oct 1, 2023

How AI and Machine Learning Are Transforming Liver Disease Diagnosis and Treatment

Posted by in categories: biotech/medical, information science, mathematics, robotics/AI

AI can also help develop objective risk stratification scores, predict the course of disease or treatment outcomes in CLD or liver cancer, facilitate easier and more successful liver transplantation, and develop quality metrics for hepatology.


Artificial Intelligence (AI) is an umbrella term that covers all computational processes aimed at mimicking and extending human intelligence for problem-solving and decision-making. It is based on algorithms or arrays of mathematical formulae that make up specific computational learning methods. Machine learning (ML) and deep learning (DL) use algorithms in more complex ways to predict learned and new outcomes.



Sep 30, 2023

AI language models can exceed PNG and FLAC in lossless compression, says study

Posted by in categories: information science, robotics/AI

Effective compression is about finding patterns to make data smaller without losing information. When an algorithm or model can accurately guess the next piece of data in a sequence, it shows it’s good at spotting these patterns. This links the idea of making good guesses—which is what large language models like GPT-4 do very well—to achieving good compression.

In an arXiv research paper titled “Language Modeling Is Compression,” researchers detail their discovery that the DeepMind large language model (LLM) called Chinchilla 70B can perform lossless compression on image patches from the ImageNet image database to 43.4 percent of their original size, beating the PNG algorithm, which compressed the same data to 58.5 percent. For audio, Chinchilla compressed samples from the LibriSpeech audio data set to just 16.4 percent of their raw size, outdoing FLAC compression at 30.3 percent.

In this case, lower numbers in the results mean more compression is taking place. And lossless compression means that no data is lost during the compression process. It stands in contrast to a lossy compression technique like JPEG, which sheds some data and reconstructs some of the data with approximations during the decoding process to significantly reduce file sizes.
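
The link between good prediction and good compression can be made concrete with a toy sketch (an illustration of the general principle, not the paper's actual method; all function names here are hypothetical). An ideal entropy coder needs about −log2 p bits for a symbol the model assigns probability p, so a model that predicts the next symbol well yields a shorter code:

```python
import math
from collections import Counter

def ideal_code_length_bits(text, predict):
    """Sum of -log2 p(symbol | history): the length an ideal
    entropy coder would need given this predictive model."""
    bits = 0.0
    for i, ch in enumerate(text):
        p = predict(text[:i], ch)
        bits += -math.log2(p)
    return bits

def uniform_model(history, ch):
    # Knows nothing: every byte equally likely -> 8 bits per symbol.
    return 1 / 256

def adaptive_model(history, ch):
    # Laplace-smoothed frequency of ch in the history seen so far.
    counts = Counter(history)
    return (counts[ch] + 1) / (len(history) + 256)

text = "abababababababababababababababab"
raw_bits = len(text) * 8
print(raw_bits)                                      # → 256
print(ideal_code_length_bits(text, uniform_model))   # → 256.0
print(ideal_code_length_bits(text, adaptive_model))  # much smaller
```

The uniform model compresses nothing; the adaptive model, which learns the "ab" pattern, needs far fewer bits. An LLM plays the role of a vastly better `adaptive_model`.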

Sep 29, 2023

Quantum Material Exhibits “Non-Local” Behavior That Mimics Brain Function

Posted by in categories: computing, information science, mathematics, neuroscience, quantum physics

We often believe computers are more efficient than humans. After all, computers can solve a complex math equation in a moment and can also recall the name of that one actor we keep forgetting. However, human brains can process complicated layers of information quickly, accurately, and with almost no energy input: recognizing a face after only seeing it once or instantly knowing the difference between a mountain and the ocean. These simple human tasks require enormous processing and energy input from computers, and even then, with varying degrees of accuracy.

Creating brain-like computers with minimal energy requirements would revolutionize nearly every aspect of modern life. Funded by the Department of Energy, Quantum Materials for Energy Efficient Neuromorphic Computing (Q-MEEN-C) — a nationwide consortium led by the University of California San Diego — has been at the forefront of this research.

UC San Diego Assistant Professor of Physics Alex Frañó is co-director of Q-MEEN-C and thinks of the center’s work in phases. In the first phase, he worked closely with President Emeritus of University of California and Professor of Physics Robert Dynes, as well as Rutgers Professor of Engineering Shriram Ramanathan. Together, their teams were successful in finding ways to create or mimic the properties of a single brain element (such as a neuron or synapse) in a quantum material.

Sep 28, 2023

What’s a Qubit? 3 Ways Scientists Build Quantum Computers

Posted by in categories: information science, mobile phones, particle physics, quantum physics, supercomputing

A complete quantum computing system could be as large as a two-car garage when one factors in all the paraphernalia required for smooth operation. But the entire processing unit, made of qubits, would barely cover the tip of your finger.

Today’s smartphones, laptops and supercomputers contain billions of tiny electronic processing elements called transistors that are either switched on or off, signifying a 1 or 0, the binary language computers use to express and calculate all information. Qubits are essentially quantum transistors. They can exist in two well-defined states—say, up and down—which represent the 1 and 0. But they can also occupy both of those states at the same time, which adds to their computing prowess. And two—or more—qubits can be entangled, a strange quantum phenomenon where particles’ states correlate even if the particles lie across the universe from each other. This ability completely changes how computations can be carried out, and it is part of what makes quantum computers so powerful, says Nathalie de Leon, a quantum physicist at Princeton University. Furthermore, simply observing a qubit can change its behavior, a feature that de Leon says might create even more of a quantum benefit. “Qubits are pretty strange. But we can exploit that strangeness to develop new kinds of algorithms that do things classical computers can’t do,” she says.
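
The difference between a bit's two definite states and a qubit's superposition can be sketched with a toy simulation (illustrative only; the `measure` helper is a hypothetical name, and this models only single-qubit measurement statistics, not entanglement):

```python
import math
import random

# A single qubit as a pair of amplitudes (alpha, beta) for the
# states |0> and |1>; measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure(alpha, beta, trials=10000):
    """Fraction of simulated measurements that read 0."""
    p0 = abs(alpha) ** 2
    return sum(1 for _ in range(trials) if random.random() < p0) / trials

zero = (1, 0)   # classical-like: always reads 0
one = (0, 1)    # classical-like: always reads 1
# Equal superposition: reads 0 or 1 with 50/50 odds when measured.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

print(measure(*zero))            # → 1.0
print(measure(*one))             # → 0.0
print(round(measure(*plus), 1))  # close to 0.5
```

The first two states behave exactly like a transistor's on/off; the third has no classical analogue, and measuring it (as de Leon notes) disturbs it, forcing one outcome.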

Scientists are trying a variety of materials to make qubits. They range from nanosized crystals to defects in diamond to particles that are their own antiparticles. Each comes with pros and cons. “It’s too early to call which one is the best,” says Marina Radulaski of the University of California, Davis. De Leon agrees. Let’s take a look.

Sep 28, 2023

A new kind of chip for quantum technology

Posted by in categories: cybercrime/malcode, engineering, information science, quantum physics, supercomputing

Today, we are living in the midst of a race to develop a quantum computer, one that could be used for practical applications. This device, built on the principles of quantum mechanics, holds the potential to perform computing tasks far beyond the capabilities of today’s fastest supercomputers. Quantum computers and other quantum-enabled technologies could foster significant advances in areas such as cybersecurity and molecular simulation, impacting and even revolutionizing fields such as online security, drug discovery and material fabrication.

An offshoot of this technological race is building what is known in scientific and engineering circles as a “quantum simulator”—a special type of quantum computer, constructed to solve one equation or model for a specific purpose beyond the computing power of a standard computer. For example, in medicine, a quantum simulator could theoretically be built to help scientists simulate a specific, complex molecular interaction for closer study, deepening and speeding up drug development.

But just like building a practical, usable quantum computer, constructing a useful quantum simulator has proven to be a daunting challenge. The idea was first proposed by mathematician Yuri Manin in 1980. Since then, researchers have attempted to employ trapped ions, cold atoms and other quantum systems to build a quantum simulator capable of real-world applications, but to date, these methods are all still a work in progress.

Sep 28, 2023

New AI algorithm can detect signs of life with 90% accuracy. Scientists want to send it to Mars

Posted by in categories: information science, robotics/AI, space travel

A new AI method can distinguish between biotic and abiotic samples with 90% accuracy.

Sep 28, 2023

Scientists discover a 100-year-old math error, changing how humans see color

Posted by in categories: computing, information science, mapping, mathematics

In a press release, Bujack, who creates scientific visualizations at Los Alamos National Laboratory, called the current mathematical models of color perception incorrect, saying they require a “paradigm shift.”

A surprise finding

Being able to accurately model human color perception has a tremendous impact on automating image processing, computer graphics, and visualization. Bujack’s team first set out to develop algorithms that would automatically enhance color maps used in data visualization to make it easier to read them.

Sep 27, 2023

Imaging the elusive skyrmion: Neutron tomography reveals their shapes and dynamics in bulk materials

Posted by in categories: climatology, computing, information science

Scientists at the National Institute of Standards and Technology (NIST) with colleagues elsewhere have employed neutron imaging and a reconstruction algorithm to reveal for the first time the 3D shapes and dynamics of very small tornado-like atomic magnetic arrangements in bulk materials.

These collective atomic arrangements, called skyrmions—if fully characterized and understood—could be used to process and store information in a densely packed form that uses several orders of magnitude less energy than is typical now.

The conventional, semiconductor-based method of processing information in binary form (on or off, 0 or 1) employs electrical charge states that must be constantly refreshed by current which encounters resistance as it passes through transistors and connectors. That’s the main reason that computers get hot.

Sep 26, 2023

Two Black Holes Masquerading as One

Posted by in categories: cosmology, information science, quantum physics

Black holes may be less unique than previously thought, as the expansion due to a cosmological constant can hold apart a pair of holes and allow them to mimic a single black hole.

Black holes are astonishing objects that can pack the mass of Earth into a space the size of a pea. A remarkable attribute is their stunning simplicity, which is encapsulated in the celebrated uniqueness theorems [1]. Briefly stated, these theorems say that there is only one solution to Einstein’s equations of general relativity for a fully collapsed (nonevolving) system having fixed mass and angular momentum [2]. The implication is that all black holes that have settled down to equilibrium with the same mass and rotation are precisely the same: their entire behavior described by a single equation—the so-called Kerr solution—filling only a few lines of paper!
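
For reference, the Kerr solution mentioned above can be written (in a standard form, using Boyer–Lindquist coordinates with G = c = 1; this equation is supplied here for context, not taken from the article):

```latex
ds^2 = -\left(1 - \frac{2Mr}{\Sigma}\right)dt^2
       - \frac{4Mar\sin^2\theta}{\Sigma}\,dt\,d\phi
       + \frac{\Sigma}{\Delta}\,dr^2
       + \Sigma\,d\theta^2
       + \left(r^2 + a^2 + \frac{2Ma^2 r\sin^2\theta}{\Sigma}\right)\sin^2\theta\,d\phi^2,
\qquad
\Sigma = r^2 + a^2\cos^2\theta,\quad
\Delta = r^2 - 2Mr + a^2.
```

Only two parameters appear: the mass M and the spin parameter a = J/M. That is the entire point of the uniqueness theorems—every equilibrium black hole is fixed by these two numbers.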

But there is a catch. The uniqueness theorems make a number of assumptions, the key one being that the space around the black hole is “empty”—in other words, there is no energy that might influence the black hole. Such energy can arise from fields, for example, those of the standard model, or from a “cosmological constant,” which is a form of dark energy that might be behind the accelerated expansion of our Universe today. In a fascinating study, Óscar Dias from the University of Southampton, UK, and colleagues demonstrate that uniqueness is violated in the presence of a positive cosmological constant [3]. Specifically, they show that a pair of black holes whose mutual attraction is balanced by the cosmic expansion would look the same to a distant observer as a single isolated black hole. The results may lead to a rethinking of how simple black holes really are.

Sep 23, 2023

Step forward for massive DNA computer systems

Posted by in categories: biotech/medical, computing, information science

The group at Shanghai Jiao Tong University has demonstrated a DNA computer system using DNA integrated circuits (DICs) that can solve quadratic equations with 30 logic gates.

Published in Nature, the system integrates multiple layers of DNA-based programmable gate arrays (DPGAs). Using generic single-stranded oligonucleotides as a uniform transmission signal, it can reliably integrate large-scale DICs with minimal leakage and high fidelity for general-purpose computing.

To control the intrinsically random collision of molecules, the team designed DNA origami registers to provide directionality for the asynchronous execution of cascaded DPGAs. This was used to assemble a DIC that can solve quadratic equations with three layers of cascaded DPGAs comprising 30 logic gates and around 500 DNA strands.
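
The layered-cascade idea can be sketched in software (a purely illustrative analogy, not the paper's DNA chemistry; the `layer*` and `cascade` names are hypothetical): each "gate array" is a small function that consumes only the previous layer's outputs, and wiring three of them together solves a quadratic via the standard formula.

```python
import math

def layer1(a, b, c):
    # First layer of "gates": elementwise products.
    return b * b, 4 * a * c

def layer2(b_sq, four_ac):
    # Second layer: the discriminant b^2 - 4ac.
    return b_sq - four_ac

def layer3(a, b, disc):
    # Third layer: combine into the two roots of ax^2 + bx + c = 0.
    root = math.sqrt(disc)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

def cascade(a, b, c):
    # Each layer sees only the outputs of the layer before it,
    # mimicking the directional, asynchronous cascade of DPGAs.
    return layer3(a, b, layer2(*layer1(a, b, c)))

print(cascade(1, -3, 2))  # roots of x^2 - 3x + 2 → (2.0, 1.0)
```

In the actual device the "wiring" is done chemically, with origami registers enforcing the layer order that function composition enforces here.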
