Archive for the ‘computing’ category: Page 308

Apr 19, 2022

Dr. Erin Duffy, Ph.D. & Kevin Outterson, ESQ — Combating Antibiotic-Resistant Bacteria (CARB-X)

Posted in categories: biotech/medical, computing, health, law

Combating Antibiotic-Resistant Bacteria — Dr. Erin Duffy, Ph.D., Chief of Research & Development, and Kevin Outterson, ESQ., Executive Director, CARB-X.


The Combating Antibiotic-Resistant Bacteria Biopharmaceutical Accelerator (CARB-X — https://carb-x.org/) is a global non-profit partnership accelerating antibacterial products to address drug-resistant bacteria, a leading cause of death around the world. In 2019, 1.27 million deaths worldwide were attributed to resistant bacterial infections.

Apr 19, 2022

Study shows simple, computationally-light model can simulate complex brain cell responses

Posted in categories: biotech/medical, chemistry, computing, mathematics, neuroscience

The brain is inarguably the single most important organ in the human body. It controls how we move, react, think and feel, and enables us to have complex emotions and memories. The brain is composed of approximately 86 billion neurons that form a complex network. These neurons receive, process, and transfer information using chemical and electrical signals.

Learning how neurons respond to different signals can further the understanding of cognition and development and improve the management of brain disorders. But experimentally studying neuronal networks is a complex and occasionally invasive procedure. Mathematical models provide a non-invasive means of understanding them, but most current models are either too computationally intensive or cannot adequately simulate the different types of complex neuronal responses. In a recent study published in Nonlinear Theory and Its Applications, IEICE, a research team led by Prof. Tohru Ikeguchi of Tokyo University of Science analyzed some of these complex responses in a computationally simple neuron model, the Izhikevich neuron model.

“My laboratory is engaged in research on neuroscience and this study analyzes the basic mathematical properties of a neuron model. While we analyzed a single neuron model in this study, this model is often used in computational neuroscience, and not all of its properties have been clarified. Our study fills that gap,” explains Prof. Ikeguchi. The research team also included Mr. Yota Tsukamoto and doctoral student Ms. Honami Tsushima, both from Tokyo University of Science.
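
The Izhikevich model itself is compact enough to state and simulate in a few lines: dv/dt = 0.04v² + 5v + 140 − u + I and du/dt = a(bv − u), with a reset v ← c, u ← u + d whenever v reaches 30 mV. As a rough illustration of why the model is computationally light, here is a minimal forward-Euler sketch in Python; the parameter values are the textbook “regular spiking” set and the constant input current stands in for the sinusoidal inputs analyzed in the study, so none of the numbers below come from the paper itself.

```python
import numpy as np

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.25):
    """Forward-Euler simulation of the Izhikevich neuron model (times in ms).

    dv/dt = 0.04*v**2 + 5*v + 140 - u + I
    du/dt = a*(b*v - u)
    with the reset v <- c, u <- u + d whenever v reaches 30 mV.
    """
    steps = int(T / dt)
    v, u = c, b * c                          # start from the resting state
    v_trace, spike_times = np.empty(steps), []
    for k in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                        # spike detected: log it and reset
            spike_times.append(k * dt)
            v, u = c, u + d
        v_trace[k] = v
    return v_trace, spike_times

# One second of a "regular spiking" neuron driven by a constant input current.
trace, spikes = izhikevich()
print(f"{len(spikes)} spikes in 1000 ms of simulated time")
```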

Apr 18, 2022

A new quantum encryption breakthrough could lead to hacker-proof communication

Posted in categories: computing, encryption, internet, quantum physics

Scientists from Beijing set a new quantum secure direct communication (QSDC) world record of 102.2 km (64 miles), a massive leap over the previous record of 18 km (11 miles), according to The Eurasian Times.

The research could eventually lead to a massive quantum communications network that would be virtually hacker-proof due to the nature of the technology.

The researchers, who published their findings in a paper in Nature, demonstrated a transmission rate of 0.54 bits per second, much slower than communications using classical computing devices. Still, this was fast enough to encrypt phone calls and text messages over a distance of 30 km (19 miles).
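
For a sense of scale, a back-of-the-envelope calculation (an illustration, not a figure from the paper) shows what 0.54 bits per second implies for delivering a typical 256-bit symmetric key:

```python
rate_bps = 0.54   # reported QSDC transmission rate, bits per second
key_bits = 256    # size of a common symmetric encryption key (illustrative choice)

seconds = key_bits / rate_bps
print(f"~{seconds:.0f} s (~{seconds / 60:.1f} min) to deliver one 256-bit key")
# Roughly 474 s, or about 8 minutes: slow next to classical links, but enough
# to periodically refresh the keys that encrypt ordinary calls and messages.
```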

Apr 18, 2022

Simple, Computationally-Light Model Can Simulate Complex Brain Cell Responses

Posted in categories: biotech/medical, chemistry, computing, neuroscience

Summary: The Izhikevich neuron model allows the simulation of both periodic and quasi-periodic responses in neurons at lower computational cost.

Source: Tokyo University of Science.

The brain is inarguably the single most important organ in the human body. It controls how we move, react, think and feel, and enables us to have complex emotions and memories. The brain is composed of approximately 86 billion neurons that form a complex network. These neurons receive, process, and transfer information using chemical and electrical signals.

Apr 18, 2022

Scientists develop new computational approach to reduce noise in X-ray data

Posted in categories: computing, nanotechnology

Scientists from the National Synchrotron Light Source II (NSLS-II) and Computational Science Initiative (CSI) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have helped to solve a common problem in synchrotron X-ray experiments: reducing the noise, or meaningless information, present in data. Their work aims to improve the efficiency and accuracy of X-ray studies at NSLS-II, with the goal of enhancing scientists’ overall research experience at the facility.

NSLS-II, a DOE Office of Science user facility, produces X-ray beams for the study of a huge variety of samples, from potential new battery materials to plants that can remediate contaminated soil. Researchers from across the nation and around the globe come to NSLS-II to investigate their samples using X-rays, collecting huge amounts of data in the process. One of the many X-ray techniques available at NSLS-II to visiting researchers is X-ray photon correlation spectroscopy (XPCS). XPCS is typically used to study material behaviors that are time-dependent and take place at the nanoscale and below, such as the dynamics between and within structural features, like tiny grains. XPCS has been used, for example, to study magnetism in advanced computing materials and structural changes in polymers (plastics).

While XPCS is a powerful technique for gathering information, the quality of the data collected and the range of materials that can be studied are limited by the “flux” of the XPCS X-ray beam. Flux is a measure of the number of X-rays passing through a given area at a point in time, and low flux can leave too much “noise” in the data, masking the signal the scientists are seeking. Efforts to reduce this noise have been successful for certain experimental setups, but for some types of XPCS experiments, achieving a reasonable signal-to-noise ratio remains a big challenge.
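
The article does not spell out the new computational approach, but the quantity XPCS ultimately extracts is the intensity autocorrelation function g2(τ) = ⟨I(t)·I(t+τ)⟩ / ⟨I(t)⟩², and photon-starved measurements make that estimate noisy. The sketch below uses synthetic speckle data to show how reduced flux roughens the estimated g2 curve; it is a toy model, not the facility’s actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def g2(intensity, max_lag):
    """Intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I(t)>**2."""
    mean_sq = intensity.mean() ** 2
    return np.array([
        (intensity[:-lag] * intensity[lag:]).mean() / mean_sq if lag
        else (intensity ** 2).mean() / mean_sq
        for lag in range(max_lag)
    ])

# Synthetic slowly decorrelating speckle signal (a phase-diffusion toy model),
# detected with Poisson photon counting at two very different flux levels.
n_frames = 20_000
phase = np.cumsum(rng.normal(scale=0.15, size=n_frames))
ideal = 1.0 + np.cos(phase)                     # non-negative "true" intensity
for flux in (100.0, 1.0):                       # mean counts per frame: bright vs photon-starved
    counts = rng.poisson(flux * ideal).astype(float)
    curve = g2(counts, max_lag=200)
    roughness = np.std(np.diff(curve[100:]))    # jitter of the g2 tail as a noise proxy
    print(f"flux {flux:>5}: g2(0) = {curve[0]:.3f}, tail roughness = {roughness:.5f}")
```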

Apr 17, 2022

Graphene-hBN breakthrough to spur new LEDs, quantum computing

Posted in categories: computing, quantum physics

In a discovery that could speed research into next-generation electronics and LED devices, a University of Michigan research team has developed the first reliable, scalable method for growing single layers of hexagonal boron nitride on graphene.

The method, which uses the widely employed molecular-beam epitaxy process to produce large sheets of high-quality hBN, is detailed in a study in Advanced Materials.

Apr 17, 2022

If Fungi Could Talk: Study Suggests Fungi Could Communicate in Structure Comparable to Humans

Posted in category: computing

Mushrooms could be communicating in a structure that resembles human language, suggests a study published in Royal Society Open Science.

Professor Andrew Adamatzky, of the University of the West of England’s Unconventional Computing Laboratory, analysed the electrical signals in fungi and found patterns with a structural similarity to the English and Swedish languages. The hope is to better understand how information is transferred and processed in mycelium networks, and one day to create fungi-based computing devices.
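
One concrete way to make such a comparison, and roughly what coverage of the paper describes, is to group electrical spikes into “words” using the gaps between them and then compare word-length statistics with those of human languages. The sketch below shows only that grouping step; the spike train, the 30-second gap threshold, and the burst statistics are all invented for illustration and are not the study’s recordings or parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def spikes_to_words(spike_times, max_gap):
    """Group spikes into 'words': runs of spikes separated by no more than max_gap seconds."""
    word_lengths, current = [], 1
    for gap in np.diff(spike_times):
        if gap <= max_gap:
            current += 1
        else:
            word_lengths.append(current)
            current = 1
    word_lengths.append(current)
    return np.array(word_lengths)

# Synthetic spike train: bursts of a few spikes separated by long quiet stretches,
# standing in for an electrode recording from mycelium (not real data).
burst_starts = np.sort(rng.uniform(0, 3600, size=40))
spike_times = np.sort(np.concatenate([
    start + np.cumsum(rng.exponential(10.0, size=rng.integers(2, 9)))
    for start in burst_starts
]))

word_lengths = spikes_to_words(spike_times, max_gap=30.0)
lengths, counts = np.unique(word_lengths, return_counts=True)
print("word length -> count:", dict(zip(lengths.tolist(), counts.tolist())))
print(f"mean 'word' length: {word_lengths.mean():.2f} spikes")
```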

Apr 17, 2022

Mojo Vision’s New Contact Lens Brings Seamless Augmented Reality a Step Closer

Posted in categories: augmented reality, biotech/medical, computing

Around the rim of the lens is an array of other electronics, including a custom-designed chip with a radio that streams content to the display and a variety of sensors, including an accelerometer, gyroscope, and magnetometer for tracking the user’s eye movements. This eye tracking capability not only ensures that AR imagery holds still as the user looks around, but also makes it possible to control the device through eye movements alone.

Despite their efforts to pack as much into the lens as possible, it won’t be a stand-alone piece of equipment. Most of the computing power required to run AR applications will be contained in a companion device worn around the neck, which will stream the content to the lens wirelessly.

The lens also hasn’t yet been cleared by the FDA for human use, so early demonstrations involve looking through a lens on a stick held just in front of the eye. At present it can only produce images in green monochrome. But according to CNET, the device allows a user to select from a variety of apps arranged in a ring around the periphery of their field of vision using nothing more than their gaze. These apps make it possible to do everything from checking flight information to navigating with a compass and tracking fitness data like heart rate and lap count.
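
As a purely hypothetical sketch of that interaction (not Mojo Vision’s software; every name and threshold here is invented), gaze-based selection from a ring of apps could look something like this:

```python
import math

# Hypothetical app ring; the real device's apps and layout may differ.
APPS = ["flight info", "compass", "heart rate", "laps", "messages", "weather"]

def select_app(gaze_x, gaze_y, min_eccentricity=0.6):
    """Map a normalized gaze offset from the center of vision to the nearest ring item.

    Returns None while the user looks near the center, i.e. no selection is intended.
    """
    if math.hypot(gaze_x, gaze_y) < min_eccentricity:
        return None
    angle = math.atan2(gaze_y, gaze_x) % (2 * math.pi)           # 0..2*pi around the ring
    slot = round(angle / (2 * math.pi) * len(APPS)) % len(APPS)  # nearest app slot
    return APPS[slot]

print(select_app(0.0, 0.9))   # gaze toward the edge of vision -> an app is selected
print(select_app(0.1, 0.1))   # gaze near the center -> None (no selection)
```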

Apr 17, 2022

Quantum computing: The benefits of being quantum-ready

Posted in categories: computing, education, quantum physics

To fully embrace the benefits of quantum computing in the future, we need to focus on education and workforce development and become quantum-ready today.


The 13-year-old daughter of a friend visiting my workplace — the IBM Research lab in Zurich — seemed puzzled. She knew I worked in a research lab and that I work with computers, but the computers she knows don’t typically resemble the chandelier-like structure that hung from the ceiling in front of us.

Yet, it is a computer – a quantum computer. And while someone in their early teens right now can be excused for not knowing what a quantum computer is, I would very much like that to change.

Apr 16, 2022

New software enables diesel engines to run on alternative fuels

Posted in categories: computing, neuroscience

Illinois Tech designs new engine brains that could reduce emissions.