Findings suggest it is possible to treat soil and water on Mars for farming.
I have been invited to take part in a fairly large event in which a panel of experts and I (allow me not to count myself among them) will discuss Artificial Intelligence and, in particular, the concept of Super Intelligence.
As it happens, I recently came across a really interesting TED talk by Grady Booch, just in time to prepare my talk.
Whether you agree or disagree with Mr. Booch’s point of view, it is clear that today we are still living in the era of weak or narrow AI, very far from general AI and even further from a potential Super Intelligence. Still, Machine Learning offers us a great opportunity as of today: the opportunity to put algorithms to work alongside humans on some of our biggest challenges, such as climate change, poverty, and health and well-being.
Near-term quantum computers, quantum computers developed today or in the near future, could help to tackle some problems more effectively than classical computers. One potential application for these computers could be in physics, chemistry and materials science, to perform quantum simulations and determine the ground states of quantum systems.
Some quantum computers developed over the past few years have proved to be fairly effective at running quantum simulations. However, near-term quantum computing approaches are still limited by existing hardware components and by the adverse effects of background noise.
Researchers at 1QB Information Technologies (1QBit), University of Waterloo and the Perimeter Institute for Theoretical Physics have recently developed neural error mitigation, a new strategy that could improve ground state estimates attained using quantum simulations. This strategy, introduced in a paper published in Nature Machine Intelligence, is based on machine-learning algorithms.
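To make concrete what "determining the ground state of a quantum system" means, here is a minimal classical sketch: exact diagonalization of a toy two-qubit transverse-field Ising Hamiltonian. The model choice and the noise offset are illustrative assumptions, not details from the paper; real quantum hardware targets systems far too large for this brute-force approach, and error mitigation aims to close the gap between noisy estimates and the true energy.

```python
import numpy as np

# Pauli matrices
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron(*ops):
    """Tensor product of a sequence of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Two-qubit transverse-field Ising Hamiltonian: H = -Z⊗Z - g(X⊗I + I⊗X)
g = 1.0
H = -kron(Z, Z) - g * (kron(X, I) + kron(I, X))

# Exact ground-state energy by diagonalization (only feasible for tiny systems)
exact_E0 = np.linalg.eigvalsh(H)[0]   # ≈ -√5 ≈ -2.2361 for g = 1

# A noisy hardware estimate typically sits above the true energy; mitigation
# schemes such as the neural approach described above try to remove this bias.
noisy_E0 = exact_E0 + 0.15  # illustrative noise bias, not a measured value
print(f"exact E0 = {exact_E0:.4f}, noisy estimate = {noisy_E0:.4f}")
```

For this two-qubit model the answer can be checked by hand; the point of quantum simulation is to reach system sizes where no such check exists.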
A pair of researchers at MIT have found evidence suggesting that a new kind of computer could be built based on liquid crystals rather than silicon. In their paper published in the journal Science Advances, Žiga Kos and Jörn Dunkel outline a possible design for a computer that takes advantage of slight differences in the orientation of the molecules that make up liquid crystals and the advantages such a system would have over those currently in use.
Most modern computer screens are made using liquid crystal displays (LCDs). Such displays are made by growing crystals in a flat plane. These crystals are made up of rod-shaped molecules that line up in a parallel fashion (those that line up the wrong way are removed). The orientations of the molecules in LCDs are not all perfectly aligned, of course, but they are close enough to allow for sharp imagery.
In this new effort, Kos and Dunkel suggest it should be possible to take advantage of those slight misalignments to create a new way to hold and manipulate computer data. They note that such a computer could assign a unique value to each type of misalignment, with each site holding one unit of data. Thus, a computer using this approach would not be constrained to conventional binary bits; it could have a whole host of options, perhaps making it much faster than machines used today (depending on how quickly the orientations could be changed).
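The payoff of going beyond binary can be sketched with simple base-k arithmetic: if each cell can take k distinguishable states instead of 2, one cell stores log2(k) bits. The state count k and the packing scheme below are illustrative assumptions, not details from the paper.

```python
import math

def pack_digits(digits, k):
    """Pack a list of base-k digits into a single integer."""
    value = 0
    for d in digits:
        assert 0 <= d < k, "each cell must hold a valid base-k state"
        value = value * k + d
    return value

def unpack_digits(value, k, n):
    """Recover n base-k digits from an integer."""
    digits = []
    for _ in range(n):
        digits.append(value % k)
        value //= k
    return digits[::-1]

k = 8                  # hypothetical number of distinguishable orientations
digits = [7, 0, 5, 3]  # four k-ary cells
packed = pack_digits(digits, k)
assert unpack_digits(packed, k, len(digits)) == digits
print(f"{len(digits)} cells at k={k} hold {len(digits) * math.log2(k):.0f} bits")
```

With k = 8 each cell carries 3 bits, so four cells carry 12 bits, versus 4 bits for the same number of binary cells.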
Scientists are planning a “CAT scan” of a British Columbia volcano to help harness, as renewable energy, the underground heat that turns rock into magma.
“Canadians are often surprised to know there’s volcanoes in the country,” said Steve Grasby, a geologist with Natural Resources Canada. “But there are active volcanoes.”
Grasby and his colleagues are headed about 24 kilometres west of Whistler, B.C., to Mount Cayley, part of the same mountain chain as well-known volcanic peaks such as Mount St. Helens in Washington State.
Star Scientific has invented a catalyst that, in the presence of hydrogen and oxygen, heats to 700 degrees Celsius. That’s enough heat to drive a steam turbine for power generation.
Star Scientific is a 25-year-old research laboratory north of Sydney, Australia. The company is one of many trying to make existing power plants carbon-free. This includes old coal-fired thermal power stations, which remain among the biggest contributors to greenhouse gas emissions on the planet. Star has invented a patented non-polluting catalyst it calls HERO®, an acronym for Hydrogen Energy Release Optimizer. The catalyst uses hydrogen without producing combustion.
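A back-of-the-envelope calculation gives a sense of the hydrogen quantities involved in delivering process heat. It uses the standard lower heating value of hydrogen (about 120 MJ/kg); the plant sizes and conversion efficiency are illustrative assumptions, not figures from Star Scientific.

```python
# Lower heating value of hydrogen, a standard reference figure (~120 MJ/kg)
LHV_H2_MJ_PER_KG = 120.0

def hydrogen_kg_for_heat(thermal_mw, hours, efficiency=1.0):
    """Mass of hydrogen (kg) needed to deliver `thermal_mw` MW of heat
    for `hours` hours, at a given heat-conversion efficiency."""
    energy_mj = thermal_mw * hours * 3600.0  # 1 MWh = 3600 MJ
    return energy_mj / (LHV_H2_MJ_PER_KG * efficiency)

# Hypothetical example: a 10 MW thermal load running for one hour
print(f"{hydrogen_kg_for_heat(10, 1):.0f} kg of H2")  # → 300 kg
```

At perfect conversion, roughly 300 kg of hydrogen supplies 10 MWh of heat; any real process would need more, in inverse proportion to its efficiency.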
Mars Food Australia, the subsidiary of the global food giant, is using HERO® to help decarbonize its processes. The 18-month pilot project is the first step in developing alternative heat sources for the food industry. Bill Heague, General Manager of Mars in Australia, states: “Thermal energy is crucial to the business of cooking food and this technology has the capability to create limitless heat without any combustion and zero emissions.”
But this is only the beginning, because Star has plans to introduce the technology into legacy coal-fired power plants, retrofitting existing generators to run on green hydrogen. In an interview with Bloomberg Green, Andrew Horvath, Chairman of Star Scientific, states: “We think there are a lot of opportunities in existing steam turbines that have some longevity…Why would you throw them away? They’re already connected to the grid.” He cites the example of Japan, where 70% of its existing turbines still have 40 years of life left.
Brain cells with the same “birthdate” are more likely to wire together into cooperative signaling circuits that carry out many functions, including the storage of memories, a new study finds.
Led by researchers from NYU Grossman School of Medicine, the new study on the brains of mice developing in the womb found that brain cells (neurons) with the same birthdate showed distinct connectivity and activity throughout the animals’ adult lives, whether they were asleep or awake.
Published online August 22 in Nature Neuroscience, the findings suggest that evolution took advantage of the orderly birth of neurons—by gestational day—to form localized microcircuits in the hippocampus, the brain region that forms memories. Rather than attempting to create each new memory from scratch, the researchers suggest, the brain may exploit the stepwise formation of neuronal layers to establish neural templates, like “Lego pieces,” that match each new experience to an existing template as it is remembered.
People have been dreaming of robot butlers for decades, but one of the biggest barriers has been getting machines to understand our instructions. Google has started to close the gap by marrying the latest language AI with state-of-the-art robots.
Human language is often ambiguous. How we talk about things is highly context-dependent, and it typically requires an innate understanding of how the world works to decipher what we’re talking about. So while robots can be trained to carry out actions on our behalf, conveying our intentions to them can be tricky.
If they have any ability to understand language at all, robots are typically designed to respond to short, specific instructions. More opaque directions like “I need something to wash these chips down” are likely to go over their heads, as are complicated multi-step requests like “Can you put this apple back in the fridge and fetch the chocolate?”
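One way to see the difficulty is to sketch how a robot might map a free-form request onto a fixed set of skills. Real systems like Google's use large language models to score skills against an instruction; in this toy sketch a bag-of-words overlap stands in for that scoring, and the skill list is invented for illustration.

```python
# Hypothetical skill vocabulary: each skill lists words that suggest it.
SKILLS = {
    "bring a drink": {"drink", "soda", "water", "wash", "down", "thirsty"},
    "put apple in fridge": {"apple", "fridge", "put", "back"},
    "fetch chocolate": {"fetch", "chocolate", "get", "bring"},
}

def best_skill(request):
    """Pick the skill whose keyword set overlaps the request most."""
    words = set(request.lower().replace("?", "").split())
    return max(SKILLS, key=lambda s: len(SKILLS[s] & words))

print(best_skill("I need something to wash these chips down"))
# → bring a drink (matches "wash" and "down")
```

Even this crude matcher resolves the chips example, but it has no notion of multi-step requests or context; that gap is exactly what pairing language models with robot skill libraries is meant to close.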
The JWST provides an intriguing look at the early universe, but it’s not yet rewriting fundamental theories of the cosmos.