
Land cover classification (LCC) of heterogeneous mining areas is important for understanding the influence of mining activities on regional geo-environments. Hyperspectral remote sensing images (HSIs) provide rich spectral information that benefits LCC. Convolutional neural networks (CNNs) improve hyperspectral image classification through their powerful feature-learning ability; however, when pixel-wise spectra are used as inputs, CNNs cannot capture spatial relationships among pixels. To address this lack of spatial information, capsule networks use vectors to encode positional and transformational information. Herein, we combine a clustering-based band selection method with residual and capsule networks to create a deep model named ResCapsNet. We tested the robustness of ResCapsNet using Gaofen-5 imagery covering two heterogeneous study areas, in Wuhan City and Xinjiang Province, with spatially weakly dependent and spatially largely independent datasets, respectively. Compared with other methods, the model achieved the best performance, with averaged overall accuracies of 98.45% and 82.80% for the Wuhan study area and 92.82% and 70.88% for the Xinjiang study area. Four transfer learning methods were investigated for cross-training and prediction between the two areas and achieved good results. In summary, the proposed model can effectively improve the classification accuracy of HSIs in heterogeneous environments.
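The abstract does not spell out how the clustering-based band selection works, but the general idea is straightforward to sketch. Below is a minimal, hypothetical Python example of one common approach: bands are clustered by their image-wide responses and the most central band of each cluster is kept. The cube dimensions, `n_bands=30`, and the use of k-means are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of clustering-based band selection for a
# hyperspectral cube (illustrative only; not the paper's method).
import numpy as np
from sklearn.cluster import KMeans

def select_bands(cube: np.ndarray, n_bands: int, seed: int = 0) -> np.ndarray:
    """Cluster the spectral bands of an (H, W, B) cube and keep one
    representative band per cluster (the band nearest its centroid)."""
    h, w, b = cube.shape
    # Each band becomes one sample: a flattened H*W intensity vector.
    band_vectors = cube.reshape(h * w, b).T              # shape (B, H*W)
    km = KMeans(n_clusters=n_bands, random_state=seed, n_init=10)
    labels = km.fit_predict(band_vectors)
    selected = []
    for k in range(n_bands):
        members = np.where(labels == k)[0]
        dists = np.linalg.norm(
            band_vectors[members] - km.cluster_centers_[k], axis=1
        )
        selected.append(members[np.argmin(dists)])       # most central band
    return np.sort(np.array(selected))

# Example: reduce a simulated 330-band Gaofen-5-like cube to 30 bands.
cube = np.random.rand(64, 64, 330).astype(np.float32)
print(select_bands(cube, n_bands=30))
```

Picking the band closest to each centroid (rather than the centroid itself) keeps real, physically meaningful bands, which is why this family of methods pairs naturally with a downstream CNN or capsule classifier.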

Saar Yoskovitz is Co-Founder & CEO at Augury, a pioneer in AI-driven Machine Health and Process Health solutions for industrial sectors.

American manufacturers are at a crossroads, forced to choose between evolution and obsolescence. The tools that historically drove profitability and efficiency no longer deliver the same impact. Labor is hard to find and harder to keep: the National Association of Manufacturers projects that 2.1 million manufacturing roles will go unfilled by 2030. This hard truth is compounded by findings in Augury's State of Production Health report, in which 91% of manufacturers say the mass exodus of industry veterans will worsen the knowledge gap.

An alarming brain drain looms over the industrial sector. As tenured employees reach retirement age and fewer professionals line up to take their place, more manufacturers are turning to artificial intelligence (AI) to bridge the gap.

A self-replicating machine is a type of autonomous robot that is capable of reproducing itself autonomously using raw materials found in the environment, thus exhibiting self-replication in a way analogous to that found in nature. The concept has been examined by Homer Jacobson, Edward F. Moore, Freeman Dyson, John von Neumann, and Konrad Zuse, and in more recent times by K. Eric Drexler in his book on nanotechnology, Engines of Creation (which coined the term clanking replicator for such machines), and by Robert Freitas and Ralph Merkle in their review Kinematic Self-Replicating Machines. Proposed applications include mining moons and asteroid belts for ore and other materials, the creation of lunar factories, and even the construction of solar power satellites in space; the von Neumann probe is one theoretical example of such a machine. Von Neumann himself worked on what he called the universal constructor, a self-replicating machine that would be able to evolve and which he formalized in a cellular automata environment. Notably, von Neumann's self-reproducing automata scheme posited that open-ended evolution requires inherited information to be copied and passed to offspring separately from the self-replicating machine, an insight that preceded the discovery of the structure of the DNA molecule by Watson and Crick and of how it is separately translated and replicated in the cell.

In this sense, a self-replicating machine is an artificial self-replicating system that relies on conventional large-scale technology and automation. The concept, first proposed by von Neumann no later than the 1940s, has attracted a range of different approaches involving various types of technology. Certain idiosyncratic terms are occasionally found in the literature. For example, the term clanking replicator was once used by Drexler to distinguish macroscale replicating systems from the microscopic nanorobots or "assemblers" that nanotechnology may make possible, but the term is informal and rarely used by others in popular or technical discussions. Replicators have also been called "von Neumann machines" after John von Neumann, who first rigorously studied the idea.
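Von Neumann's key insight was the separation of "interpret the description to build the offspring's body" from "copy the description verbatim and hand it over": it is this split that anticipated the distinction between DNA translation and DNA replication. The toy Python sketch below (my own illustration, not von Neumann's actual cellular automaton) shows that separation; because the tape is copied uninterpreted, any variation written into it is heritable.

```python
# Toy illustration of von Neumann's constructor/copier separation
# (not his actual automaton): the tape is interpreted once to build
# the offspring's body, then copied verbatim and passed along.
from dataclasses import dataclass, field

@dataclass
class Machine:
    tape: str                         # inherited description ("genome")
    parts: list = field(default_factory=list)

    def construct(self) -> "Machine":
        """Universal-constructor step: interpret the tape to build a body."""
        child = Machine(tape="")
        child.parts = [f"part:{sym}" for sym in self.tape]  # "translation"
        return child

    def replicate(self) -> "Machine":
        child = self.construct()      # build offspring from the description
        child.tape = self.tape        # copy description separately, uninterpreted
        return child

parent = Machine(tape="ABBA")
parent.parts = [f"part:{s}" for s in parent.tape]
child = parent.replicate()
print(child.tape, child.parts)        # 'ABBA' ['part:A', 'part:B', ...]

# A variation written into the tape is inherited by all descendants.
mutant = Machine(tape="ACBA").replicate()
print(mutant.tape)                    # 'ACBA'
```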

Google DeepMind has announced an impressive grab bag of new products and prototypes that may just let it seize back its lead in the race to turn generative artificial intelligence into a mass-market concern.

Top billing goes to Gemini 2.0—the latest iteration of Google DeepMind’s family of multimodal large language models, now redesigned around the ability to control agents—and a new version of Project Astra, the experimental everything app that the company teased at Google I/O in May.

MIT Technology Review got to try out Astra in a closed-door live demo last week. It was a stunning experience, but there’s a gulf between polished promo and live demo.


To predict your longevity, you have two main options. You can rely on the routine tests and measurements your doctor likes to order for you, such as blood pressure, cholesterol levels, weight, and so on. Or you can go down a biohacking rabbit hole the way tech millionaire turned longevity guru Bryan Johnson did in his quest to live longer. Johnson's obsessive self-measurement protocol involves tracking more than a hundred biomarkers, ranging from the telomere length in his blood cells to the speed of his urine stream (which, at 25 milliliters per second, he reports, is in the 90th percentile of 40-year-olds).


Scientists crunched the numbers to come up with the single best predictor of how long you’ll live—and arrived at a surprisingly low-tech answer.

One year of treatment with the targeted drug olaparib improves long-term survival in women with high-risk, early-stage breast cancer with mutations in BRCA1 or BRCA2 genes, new results from a major clinical trial show.

Ten years after the first patient was recruited, new findings from the phase III OlympiA trial – presented at the San Antonio Breast Cancer Symposium (SABCS) 2024 – show that adding olaparib to standard treatment cuts the risk of cancer coming back by 35 per cent and the risk of death by 28 per cent.

After six years, 87.5 per cent of patients treated with the drug were still alive, compared with 83.2 per cent of those given placebo pills.
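As a quick sanity check on these figures (my own arithmetic, not the trial's statistical analysis), the six-year numbers imply an absolute survival gain of about 4.3 percentage points and a crude death-risk ratio of roughly 0.74, broadly consistent with the reported 28 per cent relative reduction, which comes from a time-to-event hazard-ratio analysis over the full follow-up:

```python
# Crude check of the reported six-year figures (illustrative only;
# the trial's 28% figure is a hazard ratio, which this simple ratio
# only approximates).
surv_olaparib, surv_placebo = 0.875, 0.832

abs_diff = surv_olaparib - surv_placebo               # absolute benefit
rr_death = (1 - surv_olaparib) / (1 - surv_placebo)   # crude risk ratio

print(f"Absolute survival gain: {abs_diff:.1%}")      # ~4.3 points
print(f"Crude relative risk of death: {rr_death:.2f}")  # ~0.74
```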

Professor Andrew Tutt of The Institute of Cancer Research, London, and King's College London is the global lead investigator and Chair of the Steering Committee for the OlympiA study; he was also involved in early laboratory research on PARP inhibitors such as olaparib and in their subsequent clinical development. The Breast International Group (BIG) coordinated the international OlympiA study across multiple partners, involving 671 study locations worldwide. BIG coordinated the trial's UK sites through the ICR Clinical Trials and Statistics Unit (ICR-CTSU).