Forget government-issued food pyramids. Let an algorithm tell you how to eat.
Credit: Erik Blad
A startup called CogitAI has developed a platform that lets companies use reinforcement learning, the technique that gave AlphaGo mastery of the board game Go.
Gaining experience: AlphaGo, an AI program developed by DeepMind, taught itself to play Go by practicing. It’s practically impossible for a programmer to manually code in the best strategies for winning. Instead, reinforcement learning let the program figure out how to defeat the world’s best human players on its own.
Drug delivery: Reinforcement learning is still an experimental technology, but it is gaining a foothold in industry. DeepMind has talked of using it to optimize the performance of data centers and wind turbines. Amazon recently launched a reinforcement-learning platform, but it is aimed more at researchers and academics. CogitAI’s first commercial customers include those working in robotics for drug manufacturing. Its platform lets the robot figure out the optimal way to process drug orders.
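The core idea behind reinforcement learning, as described above, is that the agent discovers good strategies through trial and error rather than being hand-coded. A minimal sketch of tabular Q-learning, one of the simplest reinforcement-learning algorithms, is shown below on a toy five-state corridor; the environment and hyperparameters are illustrative inventions, not CogitAI's platform or AlphaGo's method.

```python
import random

random.seed(0)

# Toy 5-state corridor: start at state 0, reward 1.0 for reaching state 4.
N_STATES = 5
ACTIONS = (-1, +1)            # step left, step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(state):
    """Pick the highest-valued action, breaking ties at random."""
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

def step(state, action):
    """Environment dynamics: walls at both ends, reward at the far right."""
    nxt = min(N_STATES - 1, max(0, state + action))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

for _ in range(500):          # 500 practice episodes
    state, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit the current estimates, sometimes explore
        action = random.choice(ACTIONS) if random.random() < EPSILON else greedy(state)
        nxt, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward reward + discounted future value
        target = reward + GAMMA * max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (target - Q[(state, action)])
        state = nxt

# After practice, the learned policy steps right from every non-terminal state.
policy = {s: greedy(s) for s in range(N_STATES - 1)}
print(policy)
```

No winning strategy is ever programmed in: the agent starts with all value estimates at zero and converges on "always move toward the reward" purely from experience, which is the same principle, at vastly smaller scale, that let AlphaGo improve through self-play.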
“Monks don’t discuss the true meaning of the Heart Sutra to worshippers; they just read it like poetry,” Kohei Ogawa, a robotics professor at Osaka University who worked on Mindar, a robotic Buddhist priest, told The Diplomat. “But this doesn’t work. The monks are like robots.”
Androgynous Android
The Mindar android also bends gender, according to The Diplomat, with its human-like face and chest designed to evoke both male and female characteristics.
Finding the best light-harvesting chemicals for use in solar cells can feel like searching for a needle in a haystack. Over the years, researchers have developed and tested thousands of different dyes and pigments to see how they absorb sunlight and convert it to electricity. Sorting through all of them requires an innovative approach.
Now, thanks to a study that combines the power of supercomputing with data science and experimental methods, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Cambridge in England have developed a novel “design to device” approach to identify promising materials for dye-sensitized solar cells (DSSCs). DSSCs can be manufactured with low-cost, scalable techniques, allowing them to reach competitive performance-to-price ratios.
The team, led by Argonne materials scientist Jacqueline Cole, who is also head of the Molecular Engineering group at the University of Cambridge’s Cavendish Laboratory, used the Theta supercomputer at the Argonne Leadership Computing Facility (ALCF) to pinpoint five high-performing, low-cost dye materials from a pool of nearly 10,000 candidates for fabrication and device testing. The ALCF is a DOE Office of Science User Facility.
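At its heart, this kind of "design to device" pipeline is a large-scale ranking problem: compute a figure of merit for each candidate, then shortlist the best performers for lab fabrication. The sketch below illustrates the shape of such a screen with invented dye records and a deliberately crude merit function; the actual Argonne/Cambridge workflow involves far richer quantum-chemical calculations on Theta.

```python
# Illustrative screening pass: rank hypothetical dye candidates by a toy
# figure of merit. Names, peak wavelengths, and extinction coefficients
# are invented for demonstration, not data from the study.
candidates = [
    {"name": "dye_A", "peak_nm": 420, "extinction": 35000},
    {"name": "dye_B", "peak_nm": 550, "extinction": 62000},
    {"name": "dye_C", "peak_nm": 700, "extinction": 48000},
    {"name": "dye_D", "peak_nm": 530, "extinction": 71000},
]

def score(dye, target_nm=550, width_nm=150):
    """Toy merit: absorption strength, weighted by how close the
    absorption peak sits to the middle of the visible spectrum."""
    closeness = max(0.0, 1.0 - abs(dye["peak_nm"] - target_nm) / width_nm)
    return dye["extinction"] * closeness

# Keep only the top candidates for (hypothetical) device fabrication.
shortlist = sorted(candidates, key=score, reverse=True)[:2]
print([d["name"] for d in shortlist])
```

Scaled from four records to nearly 10,000 candidates and from a one-line score to first-principles calculations, this filter-and-rank pattern is what turns an intractable haystack search into a short fabrication list.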
Perceiving an object only visually (e.g. on a screen) or only by touching it can sometimes limit what we are able to infer about it. Human beings, however, have the innate ability to integrate visual and tactile stimuli, leveraging whatever sensory data is available to complete their daily tasks.
Researchers at the University of Liverpool have recently proposed a new framework to generate cross-modal sensory data, which could help to replicate both visual and tactile information in situations in which one of the two is not directly accessible. Their framework could, for instance, allow people to perceive objects on a screen (e.g. clothing items on e-commerce sites) both visually and tactually.
“In our daily experience, we can cognitively create a visualization of an object based on a tactile response, or a tactile response from viewing a surface’s texture,” Dr. Shan Luo, one of the researchers who carried out the study, told TechXplore. “This perceptual phenomenon, called synesthesia, in which the stimulation of one sense causes an involuntary reaction in one or more of the other senses, can be employed to make up for an inaccessible sense. For instance, when one grasps an object, our vision will be obstructed by the hand, but a touch response will be generated to ‘see’ the corresponding features.”
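The principle behind such cross-modal generation is to learn a mapping between paired examples of the two modalities, then use it to predict the missing one. The Liverpool framework uses far more sophisticated generative models; the sketch below is a deliberately simplified stand-in that fits a linear map between synthetic "visual" and "tactile" feature vectors by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired data: 'tactile' features generated from 'visual' ones
# through a hidden linear relation plus noise (all invented for illustration).
true_map = rng.normal(size=(4, 3))                 # hidden tactile <- visual relation
visual = rng.normal(size=(200, 3))                 # 200 samples, 3 visual features
tactile = visual @ true_map.T + 0.05 * rng.normal(size=(200, 4))

# Learn the cross-modal mapping from the paired training data.
learned, *_ = np.linalg.lstsq(visual, tactile, rcond=None)

# Predict the tactile response for a new, unseen visual input --
# i.e., "feel" an object we can only see.
new_visual = rng.normal(size=(1, 3))
predicted_tactile = new_visual @ learned
print(predicted_tactile.shape)
```

The same one-directional trick runs both ways: swapping the roles of the two feature sets yields a visual estimate from a tactile reading, which is the scenario Luo describes when the hand occludes the eye.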
Quartz’s Lyft story isn’t the most groundbreaking work of journalism in the world, but it’s an interesting proof of concept about how reporters can leverage new tools to pull interesting takeaways from otherwise dry public records — and, perhaps, a preview of things to come.
“This is taking [data journalism] to the next level where we’re trying to get journalists comfortable using computers to do some of this pattern matching, sorting, grouping, anomaly detection — really working with especially large data sets,” John Keefe, Quartz’s technical architect for bots and machine learning, told Digiday back when the Quartz AI Studio first launched.
READ MORE: Here’s what Lyft talks about as risk factors that other companies don’t [Quartz].
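The pattern-matching and anomaly-detection work Keefe describes reduces, in its simplest form, to asking which phrases in one filing are rare across peer filings. The sketch below shows that comparison on invented risk-factor phrases; it is an illustration of the idea, not Quartz's tooling or actual data from Lyft's S-1.

```python
# Illustrative anomaly detection on securities-filing risk factors.
# All phrases below are invented placeholders, not quotes from any filing.
lyft_risks = {"driver classification", "insurance costs", "autonomous vehicles",
              "bike and scooter sharing", "competition"}
peer_risks = [
    {"competition", "insurance costs", "regulation"},
    {"competition", "fuel prices", "regulation"},
    {"competition", "insurance costs", "data breaches"},
]

def unusual(target, peers, max_peer_mentions=0):
    """Return phrases that appear in at most `max_peer_mentions` peer filings --
    the candidates most likely to be newsworthy outliers."""
    return sorted(p for p in target
                  if sum(p in peer for peer in peers) <= max_peer_mentions)

print(unusual(lyft_risks, peer_risks))
```

Boilerplate risks like "competition" wash out because every peer mentions them, while company-specific phrases surface automatically, which is exactly the kind of grouping and outlier-spotting a reporter would otherwise do by reading hundreds of pages by hand.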
An Israeli spacecraft on its maiden mission to the moon has sent its first selfie back to Earth, mission chiefs said on Tuesday.
The image showing part of the Beresheet spacecraft with Earth in the background was beamed to mission control in Yehud, Israel – 23,360 miles (37,600km) away, the project’s lead partners said.
The partners, NGO SpaceIL and state-owned Israel Aerospace Industries, launched the unmanned Beresheet – Hebrew for Genesis – from Cape Canaveral in Florida on 22 February.