
The challenges of making AI work at the edge—that is, making it reliable enough to do its job and then justifying the additional complexity and expense of putting it in our devices—are monumental. Existing AI can be inflexible, easily fooled, unreliable and biased. In the cloud, it can be trained on the fly to get better—think about how Alexa improves over time. When it’s in a device, it must come pre-trained and be updated periodically. Yet improvements in chip technology in recent years have made real breakthroughs possible in how we experience AI, and commercial demand for this sort of functionality is high.


AI is moving from data centers to devices, making everything from phones to tractors faster and more private. These newfound smarts also come with pitfalls.
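To make the on-device constraint concrete, here is a minimal sketch of what edge inference typically looks like, assuming TensorFlow Lite and a hypothetical pre-trained, quantized classifier file named model.tflite (both are illustrative choices, not something named in the article): the model ships frozen, all computation happens locally, and the device only gets smarter when a new model file is pushed to it.

```python
# Minimal on-device inference sketch (assumes TensorFlow Lite is installed and
# that a hypothetical pre-trained, quantized model file "model.tflite" exists).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # hypothetical file
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a placeholder input with whatever shape and dtype the model expects
# (e.g. a single RGB image); a real device would pass camera or sensor data.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print("predicted class:", int(np.argmax(scores)))
```

There is no training loop here by design: unlike a cloud model that can learn on the fly, this one only changes when a replacement model file is delivered in a software update.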

Satellite imagery is becoming ubiquitous. Research has demonstrated that artificial intelligence applied to satellite imagery holds promise for automated detection of war-related building destruction. While these results are promising, monitoring in real-world applications requires high precision, especially when destruction is sparse and detecting destroyed buildings is equivalent to looking for a needle in a haystack. We demonstrate that exploiting the persistent nature of building destruction can substantially improve the training of automated destruction monitoring. We also propose an additional machine-learning stage that leverages images of surrounding areas and multiple successive images of the same area, which further improves detection significantly. This makes real-world applications feasible, and we illustrate this in the context of the Syrian civil war.

Existing data on building destruction in conflict zones rely on eyewitness reports or manual detection, which makes them generally scarce, incomplete, and potentially biased. This lack of reliable data imposes severe limitations for media reporting, humanitarian relief efforts, human-rights monitoring, reconstruction initiatives, and academic studies of violent conflict. This article introduces an automated method of measuring destruction in high-resolution satellite images using deep-learning techniques combined with label augmentation and spatial and temporal smoothing, which exploit the underlying spatial and temporal structure of destruction. As a proof of concept, we apply this method to the Syrian civil war and reconstruct the evolution of damage in major cities across the country. Our approach allows generating destruction data with unprecedented scope, resolution, and frequency—and makes use of the ever-higher frequency at which satellite imagery becomes available.
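As a rough illustration of the temporal-smoothing idea (a sketch under assumed inputs and thresholds, not the authors' actual pipeline): because destruction is persistent, per-date classifier scores for the same location can be smoothed over time, so that a single noisy detection is not enough to flag a building as destroyed, while sustained high scores are.

```python
# Toy temporal smoothing for per-location destruction scores.
# Assumptions (illustrative only): scores are classifier probabilities for one
# location ordered by acquisition date; the threshold and run length are made up.
import numpy as np

def smooth_destruction(scores, threshold=0.5, min_consecutive=2):
    """Label a location destroyed from the first date at which
    `min_consecutive` successive scores exceed `threshold`; once set,
    the label never reverts (the persistence assumption)."""
    labels = np.zeros(len(scores), dtype=bool)
    run = 0
    for i, s in enumerate(scores):
        run = run + 1 if s > threshold else 0
        if run >= min_consecutive:
            labels[i - min_consecutive + 1:] = True
            break
    return labels

# A one-off spike at the second date is ignored; sustained high scores
# from the fourth date onward are treated as genuine destruction.
scores = np.array([0.1, 0.9, 0.2, 0.8, 0.85, 0.9])
print(smooth_destruction(scores))  # [False False False  True  True  True]
```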


The most prevalent forms of alcohol-associated liver disease (ALD) are fatty liver, alcoholic hepatitis and cirrhosis. Corticosteroids are the only treatment option for alcoholic hepatitis, or chronic inflammation of the liver, despite little evidence of long-term efficacy and considerable adverse side effects.

Investigators at Cedars-Sinai and the University of California, San Diego (UCSD), found that a synthetic compound given orally protected the liver against injury in an animal model for alcoholic hepatitis. The study was recently published in the Proceedings of the National Academy of Sciences. Ekihiro Seki, MD, PhD, of Cedars-Sinai and Dennis A. Carson, MD, of UCSD are co-senior authors of the paper.

Realizing the promise of 5G Internet of Things (IoT) networks requires more scalable and robust communication systems—ones that deliver drastically higher data rates and lower power consumption per device.

Backscatter radios—passive sensors that reflect rather than radiate energy—are known for their low-cost, low-complexity, and battery-free operation, making them a potential key enabler of this future. They typically offer low data rates, however, and their performance depends strongly on the surrounding environment.

Researchers at the Georgia Institute of Technology, Nokia Bell Labs, and Heriot-Watt University have found a low-cost way for backscatter radios to support high-throughput communication and 5G-speed Gb/sec data transfer using only a single transistor, where previous designs required multiple expensive, stacked transistors.
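For readers unfamiliar with the underlying mechanism, the toy simulation below (an illustrative sketch with made-up parameters, not the researchers' single-transistor design) shows what backscatter communication means: the tag never generates a radio signal of its own, it only switches its antenna load so that an incident carrier is reflected either in phase or out of phase, and a nearby receiver recovers the bits from those reflections.

```python
# Toy simulation of backscatter BPSK load modulation.
# All parameters (carrier frequency, data rate, bit count) are illustrative.
import numpy as np

fc = 5.8e9                      # assumed carrier frequency (Hz)
fs = 20 * fc                    # simulation sample rate
bit_rate = 1e6                  # assumed tag data rate (bits/s)
bits = np.random.randint(0, 2, 16)

samples_per_bit = int(fs / bit_rate)
t = np.arange(len(bits) * samples_per_bit) / fs
carrier = np.cos(2 * np.pi * fc * t)           # wave illuminating the tag

# Load modulation: the reflection coefficient flips between +1 and -1,
# i.e. the tag reflects the carrier either in phase or 180 degrees out of phase.
reflection = np.repeat(2 * bits - 1, samples_per_bit)
backscattered = reflection * carrier           # signal seen by the receiver

# Coherent receiver: mix with the carrier and average over each bit period.
mixed = backscattered * carrier
decisions = mixed.reshape(len(bits), samples_per_bit).mean(axis=1) > 0
assert np.array_equal(decisions.astype(int), bits)
print("recovered bits:", decisions.astype(int))
```

Because the tag only toggles a load instead of driving a power amplifier, its energy cost per bit can be far below that of a conventional radio, which is what makes battery-free operation plausible.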

Tim Berners-Lee’s source code for the World Wide Web is the latest non-fungible token (NFT) to go up for sale.

Sotheby’s in New York is selling the program that paved the way for the internet we know today, more than 30 years after its creation.

The sale started June 23 and ends on Wednesday. Bidding had reached $2.8 million on Friday.

Flexible thinking is key to creativity – the ability to think of new ideas, make novel connections between ideas, and make new inventions. It also supports academic and work skills such as problem solving. That said, unlike working memory – how much information you can hold in mind at a given time – it is largely independent of IQ, or “crystallised intelligence”.


IQ is often hailed as a crucial driver of success, particularly in fields such as science, innovation and technology. In fact, many people have an endless fascination with the IQ scores of famous people. But the truth is that some of the greatest achievements by our species have primarily relied on qualities such as creativity, imagination, curiosity and empathy.

Many of these traits are embedded in what scientists call “cognitive flexibility” – a skill that enables us to switch between different concepts, or to adapt behaviour to achieve goals in a novel or changing environment. It is essentially about learning to learn and being able to be flexible about the way you learn. This includes changing strategies for optimal decision-making. In our ongoing research, we are trying to work out how people can best boost their cognitive flexibility.

This research identifies the circadian clock as a central regulator of glucose production during lung cancer progression and provides important insight toward the development of novel therapeutics that target REV-ERBα to suppress cancer cell growth.


New research from the University of California, Irvine reveals how the circadian regulation of glucose production in the liver is lost during lung cancer progression, and how the resulting increase in glucose production may fuel cancer cell growth.

The new study, titled “Glucagon regulates the stability of REV-ERBα to modulate hepatic glucose production in a model of lung cancer-associated cachexia,” published today in Science Advances, illustrates how the circadian clock is regulated under conditions of stress, such as during lung cancer progression and the cancer-associated tissue-wasting disease called cachexia.

“Our research shows that a critical circadian protein, REV-ERBα, controls glucose production in the liver. During lung cancer progression and specifically under conditions of cachexia, this circadian regulation is lost, resulting in increased glucose production from the liver,” said senior author Selma Masri, Ph.D., assistant professor in the Department of Biological Chemistry at UCI School of Medicine. “Based on our findings, we identified that lung tumors are able to provide instructive cues to the liver to increase glucose production, a major fuel source for cancer cells.”

The Large Hadron Collider has a lot of tasks ahead of it. Next stop: investigating the Big Bang.


The truth is, we don’t really know what the universe was like in those first moments, because it takes huge amounts of energy and precision to recreate and understand the cosmos on such short timescales in the lab.

But scientists at the Large Hadron Collider (LHC) at CERN in Switzerland aren’t giving up.

Now our LHCb experiment has measured one of the smallest differences in mass between two particles ever, which will allow us to discover much more about our enigmatic cosmic origins.
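The excerpt does not spell out which particles were compared; the measurement concerned the two mass eigenstates of the neutral charm (D0) meson, whose masses differ by only about 10^-38 grams, and it is this tiny difference that sets how quickly a particle oscillates into its antiparticle. As a rough, back-of-the-envelope illustration (taking a mass difference of order 10^-5 eV as an assumed order of magnitude, not the published value):

$$
T_{\mathrm{osc}} \;=\; \frac{2\pi\hbar}{\Delta m\,c^{2}}
\;\approx\; \frac{2\pi \times 6.6\times10^{-16}\ \mathrm{eV\,s}}{10^{-5}\ \mathrm{eV}}
\;\approx\; 4\times10^{-10}\ \mathrm{s},
$$

roughly a thousand times longer than the D0 lifetime of about 4 × 10^-13 s, which is why the meson barely begins to turn into its antiparticle before it decays and why the effect is so hard to observe.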

The “technology intelligence engine” uses A.I. to sift through hundreds of millions of documents online, then uses all that information to spot trends.


Build back better

Tarraf was fed up with incorrect predictions. He wanted a more data-driven approach to forecasting that could help investors, governments, pundits, and anyone else get a more accurate picture of the shape of tech-yet-to-come. Not only could this potentially help make money for his firm, but it could also, he suggested, illuminate some of the blind spots people have, which can lead to bias.

Tarraf’s technology intelligence engine uses natural language processing (NLP) to sift through hundreds of millions of documents — ranging from academic papers and research grants to startup funding details, social media posts, and news stories — in dozens of different languages. The futurist and science fiction writer William Gibson famously opined that the future is already here, it’s just not evenly distributed. In other words, tomorrow’s technology has already been invented, but right now it’s hidden away in research labs, patent applications, and myriad other silos around the world. The technology intelligence engine seeks to unearth and aggregate these scattered developments.
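The excerpt stops short of describing how the engine works internally, but the basic signal it hunts for is easy to sketch: count how often candidate technology terms appear in a dated stream of documents and flag the terms whose mentions are growing. Everything below (the documents, the terms, the growth test) is invented purely for illustration; it is not Tarraf’s system.

```python
# Toy "technology trend" detector over a dated document stream.
# Documents, terms, and the growth heuristic are all illustrative assumptions.
from collections import Counter, defaultdict

documents = [
    (2019, "New perovskite solar cell efficiency record reported"),
    (2020, "Startup raises funding for perovskite solar modules"),
    (2021, "Perovskite tandem cells approach commercial viability"),
    (2021, "Review: perovskite stability remains the key hurdle"),
    (2021, "Grant awarded for quantum sensing research"),
]
terms = ["perovskite", "quantum sensing"]

mentions = defaultdict(Counter)          # term -> {year: count}
for year, text in documents:
    for term in terms:
        if term in text.lower():
            mentions[term][year] += 1

for term, by_year in mentions.items():
    years = sorted(by_year)
    growth = by_year[years[-1]] - by_year[years[0]]   # crude growth signal
    status = "rising" if growth > 0 else "flat/declining"
    print(f"{term}: {dict(sorted(by_year.items()))} -> {status}")
```

A production system would obviously replace keyword matching with multilingual NLP, entity linking, and more robust trend statistics, but the shape of the problem (normalize documents, extract signals, track them over time) is the same.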