
Brain cancer remains challenging to diagnose due to nonspecific symptoms and a lack of cost-effective tests. A new blood test that uses attenuated total reflection (ATR)-Fourier transform infrared (FTIR) spectroscopy in conjunction with machine learning may help advance the detection of brain cancer.

The patented technology, developed by a team at the University of Strathclyde, uses infrared light to produce a “bio-signature” of a blood sample and applies artificial intelligence to check for the signs of cancer.

The research is published in Nature Communications in a paper titled “Development of high-throughput ATR-FTIR technology for rapid triage of brain cancer.”
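The general approach pairs an infrared “bio-signature” with a trained classifier. As a rough illustration of that kind of workflow (not the Strathclyde team’s actual pipeline), the sketch below treats each serum spectrum as a feature vector and fits an off-the-shelf classifier to synthetic placeholder data; the array shapes and labels are assumptions made purely for the example.

```python
# Hedged sketch: classifying ATR-FTIR spectra with a standard ML pipeline.
# This is NOT the published method; it only illustrates the general idea of
# treating each spectrum as a feature vector for a classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Placeholder data: 200 synthetic "spectra", each with 500 absorbance values,
# and binary labels (1 = cancer, 0 = control). Real spectra would replace these.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))          # absorbance at 500 wavenumbers
y = rng.integers(0, 2, size=200)         # hypothetical diagnostic labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```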

One of the new products unveiled at CES this year is a new kind of home security system — one that includes drones to patrol your property, along with sensors designed to mimic garden lights and a central processor to bring it all together.

Sunflower Labs debuted its new Sunflower Home Awareness System, which includes the eponymous Sunflowers (motion and vibration sensors that look like simple garden lights but can populate a map to show you cars, people, and animals on or near your property in real time); the Bee (a fully autonomous drone that deploys and flies on its own, with onboard cameras to live-stream video); and the Hive (a charging station for the Bee, which also houses the brains of the operation for crunching all the data gathered by the component parts).

Roving aerial robots keeping tabs on your property might seem a tad dystopian, and perhaps even unnecessary, when you could maybe equip your estate with multiple fixed cameras and sensors for less money and with less complexity. But Sunflower Labs thinks its security system is an evolution of more standard fare because it “learns and reacts to its surroundings,” improving over time.

Drastic miniaturization of electronics and the ingression of next-generation nanomaterials into space technology have provoked a renaissance in interplanetary flights and near-Earth space exploration using small unmanned satellites and systems. As the next stage, NASA’s 2015 Nanotechnology Roadmap initiative called for new design paradigms that integrate nanotechnology and conceptually new materials to build advanced, deep-space-capable, adaptive spacecraft. This review examines the cutting edge and discusses the opportunities for integrating nanomaterials into the most advanced types of electric propulsion devices, taking advantage of their unique features to boost efficiency and service life. Finally, we propose a concept of an adaptive thruster.

A recent AI lecture from Stanford University.


What do web search, speech recognition, face recognition, machine translation, autonomous driving, and automatic scheduling have in common? These are all complex real-world problems, and the goal of artificial intelligence (AI) is to tackle these with rigorous mathematical tools.

In this course, you will learn the foundational principles that drive these applications and practice implementing some of these systems. Specific topics include machine learning, search, game playing, Markov decision processes, constraint satisfaction, graphical models, and logic. The main goal of the course is to equip you with the tools to tackle new AI problems you might encounter in life.
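To make one of the listed topics concrete, here is a minimal value-iteration sketch for a toy Markov decision process. The states, transition probabilities, and rewards are invented for illustration and are not taken from the course materials.

```python
# Minimal value iteration for a toy 3-state Markov decision process.
# The MDP (states, transitions, rewards) is invented for illustration only.
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.9

# P[a][s, s'] = probability of moving from s to s' under action a
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],   # action 0
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.0, 0.0, 1.0]],   # action 1
])
# R[s, a] = expected immediate reward for taking action a in state s
R = np.array([[0.0, 1.0], [0.0, 2.0], [5.0, 0.0]])

V = np.zeros(n_states)
for _ in range(200):                          # iterate the Bellman optimality update
    Q = R + gamma * np.einsum("ast,t->sa", P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

print("Optimal state values:", V)
print("Greedy policy:", Q.argmax(axis=1))
```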

Tesla vehicles are apparently going to talk to people not only inside the car but also outside. CEO Elon Musk even released a quick preview video.

It’s no secret that Tesla wants to use more artificial intelligence in its business.

Two years ago, Tesla hired Andrej Karpathy to lead its computer vision and AI team, which has been expanding ever since.

“A new model based on the blood-vessel network in a rat brain shows that the vessel position within its circulatory network does not influence the blood flow nor how nutrients are transported. Instead, transport is controlled mostly by the dilation of vessels. As well as providing new insights into the circulatory system, the model could lead to better artificial tissues and brain-scanning techniques – and might even improve the performance of solar panels.”

Nutrient flow in the brain is controlled by blood-vessel dilation, reveals network model
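The intuition behind dilation dominating transport can be sketched with a toy Poiseuille-type resistor-network picture of the vasculature, in which a vessel’s hydraulic conductance scales with the fourth power of its radius. The tiny four-node network below is invented for illustration and is not the rat-brain model used in the study.

```python
# Toy illustration of why vessel dilation dominates flow: in a Poiseuille-type
# resistor-network model, a vessel's conductance scales as radius**4, so a small
# radius change shifts flow far more than most rearrangements of the network.
# This 4-node network is invented and is NOT the rat-brain model in the study.
import numpy as np

def conductance(radius, length, viscosity=1.0):
    """Hagen-Poiseuille conductance of a cylindrical vessel."""
    return np.pi * radius**4 / (8.0 * viscosity * length)

# Edges: (node_i, node_j, radius, length); node 0 is inlet, node 3 is outlet.
edges = [(0, 1, 1.0, 10.0), (0, 2, 1.0, 10.0), (1, 3, 1.0, 10.0), (2, 3, 1.0, 10.0)]

def total_flow(edges, p_in=1.0, p_out=0.0, n=4):
    """Solve the pressure balance (Kirchhoff-style) and return inlet flow."""
    G = np.zeros((n, n))
    for i, j, r, L in edges:
        g = conductance(r, L)
        G[i, i] += g; G[j, j] += g
        G[i, j] -= g; G[j, i] -= g
    # Fix boundary pressures at inlet (node 0) and outlet (node 3).
    A, b = G.copy(), np.zeros(n)
    for node, p in [(0, p_in), (3, p_out)]:
        A[node, :] = 0.0; A[node, node] = 1.0; b[node] = p
    pressures = np.linalg.solve(A, b)
    # Inlet flow = sum of flows leaving node 0.
    return sum(conductance(r, L) * (pressures[0] - pressures[j])
               for i, j, r, L in edges if i == 0)

base = total_flow(edges)
dilated = total_flow([(0, 1, 1.2, 10.0)] + edges[1:])   # dilate one vessel by 20%
print(f"flow ratio after 20% dilation of one branch: {dilated / base:.2f}")
```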


Interesting research paper on a new nanobot technology. I’m watching for ways in which suitable substrates for mind uploading can be constructed, and DNA self-guided assembly has potential.

Here are some excerpts and a weblink to the paper:

“…Chemical approaches have opened synthetic routes to build dynamic materials from scratch using chemical reactions, ultimately allowing flexibility in design…”

“…As a realization of this concept, we engineered a mechanism termed DASH—DNA-based Assembly and Synthesis of Hierarchical materials—providing a mesoscale approach to create dynamic materials from biomolecular building blocks using artificial metabolism. DASH was developed on the basis of nanotechnology that uses DNA as a generic material ranging from nanostructures to hydrogels, for enzymatic substrates, and as linkers between nanoparticles…”

“…Next, to illustrate the potential uses of self-generated materials, we created various hybrid functional materials from the DASH patterns. The DASH patterns served as a versatile mesoscale scaffold for a diverse range of functional nanomaterials beyond DNA, ranging from proteins to inorganic nanoparticles, such as avidin, quantum dots, and DNA-conjugated gold nanoparticles (AuNPs) (Fig. 4D, figs. S37 and S38, and Supplementary Text). The generated patterns were also rendered functional with catalytic activity when conjugated with enzymes (figs. S39 and S40 and Supplementary Text). We also showed that the DNA molecules within the DASH patterns retained the DNA’s genetic properties and that, in a cell-free fashion, the materials themselves successfully produced green fluorescent proteins (GFPs) by incorporating a reporter gene for sfGFP (Fig. 4E and figs. S9 and S41) (40). The protein production capability of the materials established the foundation for future cell-free production of proteins, including enzymes, in a spatiotemporally controlled manner…”

Zume Pizza, the Mountain View company that used robots to make its pizzas, has made its last delivery.

In filings with the state Employment Development Department, Zume said it is cutting 172 jobs in Mountain View and eliminating another 80 jobs at its facility in San Francisco. Zume Chief Executive Alex Garden made the announcement in an email to company employees on Wednesday.

“With admiration and sadness, we are closing Zume Pizza today,” Garden said in his email. “Over the last four years this business has been our invention test bed and has been our inspiration for many of the growth businesses we have at Zume today.”

Analog machine learning hardware offers a promising alternative to its digital counterparts as a faster, more energy-efficient platform. Wave physics based on acoustics and optics is a natural candidate for building analog processors for time-varying signals. In a new report in Science Advances, Tyler W. Hughes and a research team in the departments of Applied Physics and Electrical Engineering at Stanford University identified a mapping between the dynamics of wave physics and computation in recurrent neural networks.

The map indicated the possibility of training physical wave systems to learn complex features in temporal data using the standard training techniques applied to neural networks. As proof of principle, they demonstrated an inverse-designed, inhomogeneous medium that performed English vowel classification on raw audio signals as the waveforms scattered and propagated through it. The scientists achieved performance comparable to a standard digital implementation of a recurrent neural network. The findings pave the way for a new class of analog machine learning platforms for fast and efficient information processing in the signal’s native domain.

The recurrent neural network (RNN) is an important machine learning model widely used to perform tasks including natural language processing and time series prediction. The team trained wave-based physical systems to function as an RNN and passively process signals and information in their native domain without analog-to-digital conversion. The work resulted in a substantial gain in speed and reduced power consumption. In the present framework, instead of implementing circuits to deliberately route signals back to the input, the recurrence relationship occurred naturally in the time dynamics of the physics itself. The device provided the memory capacity for information processing based on the waves as they propagated through space.
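The core analogy (a hidden state carried forward in time by a fixed update rule driven by the input signal) can be sketched by placing a vanilla RNN cell next to a finite-difference update of the 1D scalar wave equation, as below. This is only a conceptual illustration, not the inverse-designed medium or training code from the Stanford work; all array sizes and parameters are arbitrary assumptions.

```python
# Conceptual sketch of the RNN <-> wave-equation analogy: both carry a hidden
# state forward in time with a fixed update rule driven by the input signal.
# This is NOT the Stanford group's inverse-design or training code.
import numpy as np

def rnn_step(h, x, W_h, W_x):
    """Vanilla RNN update: h_t = tanh(W_h h_{t-1} + W_x x_t)."""
    return np.tanh(W_h @ h + W_x @ x)

def wave_step(u_now, u_prev, source, c, dt=1.0, dx=1.0):
    """Leapfrog update of the 1D scalar wave equation with an injected source.
    The pair (u_now, u_prev) plays the role of the RNN hidden state."""
    lap = np.roll(u_now, -1) - 2 * u_now + np.roll(u_now, 1)   # discrete Laplacian
    u_next = 2 * u_now - u_prev + (c * dt / dx) ** 2 * lap + source
    return u_next, u_now

rng = np.random.default_rng(0)
signal = rng.normal(size=100)              # a stand-in for a raw audio waveform

# RNN rollout over the signal
h = np.zeros(8)
W_h, W_x = 0.1 * rng.normal(size=(8, 8)), 0.1 * rng.normal(size=(8, 1))
for x_t in signal:
    h = rnn_step(h, np.array([x_t]), W_h, W_x)

# Wave rollout: inject the same signal at one grid point of a 64-cell medium
u_now, u_prev = np.zeros(64), np.zeros(64)
c = 0.5 * np.ones(64)                      # wave speed; the trainable quantity in the paper's setup
for x_t in signal:
    src = np.zeros(64); src[32] = x_t
    u_now, u_prev = wave_step(u_now, u_prev, src, c)

print("final RNN state norm:", np.linalg.norm(h))
print("final wave field norm:", np.linalg.norm(u_now))
```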