Houston-based ThirdAI, a company building tools to speed up deep learning technology without the need for specialized hardware like graphics processing units, brought in $6 million in seed funding.
Neotribe Ventures, Cervin Ventures and Firebolt Ventures co-led the investment, which will be used to hire additional employees and invest in computing resources, ThirdAI co-founder and CEO Anshumali Shrivastava told TechCrunch.
Shrivastava, who has a mathematics background, was always interested in artificial intelligence and machine learning, especially in rethinking how AI could be developed more efficiently. It was at Rice University that he looked into how to make that work for deep learning. He started ThirdAI in April with some Rice graduate students.
A company called Tombot thinks it's come up with a way to improve the quality of life for seniors facing challenges when it comes to being social: a robotic companion dog that behaves and responds like a real pup, but without all the responsibilities of maintaining a living, breathing animal. The company even enlisted the talented folks at the Jim Henson's Creature Shop to help make the robo-dog look as lifelike as possible. It's a noble effort, but it also raises lots of questions.
For starters, can robots actually be a good substitute for an animal companion? Replacing people with robots is a massive technological challenge, and one we're not even close to accomplishing. Every time a multi-million-dollar humanoid robot like Boston Dynamics' ATLAS takes a nasty spill, we're reminded that they're nowhere near ready to interact with the average consumer. But robotic animals are a different story. It's hard not to draw comparisons to a well-trained dog when seeing Boston Dynamics' SpotMini in action. And even though it still comes with a price tag that soars to hundreds of thousands of dollars, there are robotic pets available on the other end of the affordability spectrum. Sony's Aibo, originally released 20 years ago, was so popular and beloved that owners in Japan regularly held funerals for their robo-dogs when they stopped working and replacement parts were no longer available. In late 2017 Sony brought its Aibo line back from the dead, and despite a $2,900 price tag and questionable smarts, it's hard not to get drawn into interacting with the plastic pet as if it were a real puppy.
But Tombot isn't the first company to create a robotic pet specifically designed to serve as an attentive companion. For the past decade, a $5,000 robotic seal called Paro has been comforting seniors and those dealing with long-term illnesses like Alzheimer's. And a few years ago Hasbro introduced a ~$100 robotic cat and dog under its Joy For All line (which has since spun off into its own company called Ageless Innovation) that respond lovingly, or at least appear to, to physical interaction. As long as you don't need a robotic pet to fetch the paper, scare off intruders, or retrieve dead ducks, robots can effectively deliver at least some of the interactions and companionship a real-life pet can. Can Tombot actually deliver the next generation of robotic companion pets? Enlisting experts like the Jim Henson's Creature Shop's Creative Supervisor Peter Brooke and Animatronic Supervisor John Criswell was a good start. In addition to designing over-the-top Muppets, the studio has created lifelike animatronic animals for movies and TV, and with a deep understanding of how creatures move, they were able to deliver a design for a robotic dog that not only looks more like a real dog than a plush toy but moves like one as well.
I think SENS did this last year, but now AlphaFold 2 will make it easier and faster.
Hey, it's Han from WrySci HX, discussing how breakthroughs in the protein folding problem by AlphaFold 2 from DeepMind could combine with the SENS Research Foundation's approach of allotopic mitochondrial gene expression to fight aging damage.
It’s no secret that AI is everywhere, yet it’s not always clear when we’re interacting with it, let alone which specific techniques are at play. But one subset is easy to recognize: If the experience is intelligent and involves photos or videos, or is visual in any way, computer vision is likely working behind the scenes.
Computer vision is a subfield of AI, specifically of machine learning. If AI allows machines to “think,” then computer vision is what allows them to “see.” More technically, it enables machines to recognize, make sense of, and respond to visual information like photos, videos, and other visual inputs.
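As a rough illustration of what that means in practice, here is a minimal sketch of one common computer vision task, image classification with a model pretrained on ImageNet. It assumes PyTorch and torchvision are available, the image file name is a placeholder, and this is a generic example rather than any particular product's implementation.

```python
# Minimal computer-vision sketch: classify a photo with a pretrained model.
# Assumes PyTorch and torchvision are installed; "photo.jpg" is a placeholder.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a network pretrained on ImageNet (1,000 everyday object categories).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Standard ImageNet preprocessing: resize, crop, convert to tensor, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("photo.jpg").convert("RGB")  # placeholder input image
batch = preprocess(image).unsqueeze(0)          # add a batch dimension

with torch.no_grad():
    logits = model(batch)

print("Predicted ImageNet class index:", logits.argmax(dim=1).item())
```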
Over the last few years, computer vision has become a major driver of AI. The technique is used widely in industries like manufacturing, ecommerce, agriculture, automotive, and medicine, to name a few. It powers everything from interactive Snapchat lenses to sports broadcasts, AR-powered shopping, medical analysis, and autonomous driving capabilities. And by 2022 the global market for the subfield is projected to reach $48.6 billion annually, up from just $6.6 billion in 2015.
I would say this is probably aimed at a few things. It's a workaround for the national fight to raise the minimum wage. These kitchens will be out of sight and out of mind, so no one besides the workers will notice as they are gradually automated to 100% by around 2027. Delivery, too, will gradually be fully automated with long-distance drones and self-driving vehicles. And you can be sure every other chain is working on the same stuff.
The ‘ghost kitchens’ are coming to the UK, US and Canada.
A radical collaboration between a biologist and an engineer is supercharging efforts to protect grape crops. The technology they’ve developed, using robotics and AI to identify grape plants infected with a devastating fungus, will soon be available to researchers nationwide working on a wide array of plant and animal research.
The biologist, Lance Cadle-Davidson, Ph.D. ‘03, an adjunct professor in the School of Integrative Plant Science (SIPS), is working to develop grape varieties that are more resistant to powdery mildew, but his lab’s research was bottlenecked by the need to manually assess thousands of grape leaf samples for evidence of infection.
Powdery mildew, a fungus that attacks many plants including wine and table grapes, leaves sickly white spores across leaves and fruit and costs grape growers worldwide billions of dollars annually in lost fruit and fungicide costs.
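The article doesn't describe the team's software, but the general idea of replacing manual leaf scoring with an image classifier is easy to sketch. The snippet below is a hypothetical illustration, not the Cornell system: it assumes an "infected vs. healthy" classifier has already been trained elsewhere and saved as mildew_classifier.pt, and the folder and file names are placeholders.

```python
# Hypothetical sketch of automating leaf-infection scoring with an image
# classifier. NOT the actual research system; the model file, folder layout,
# and architecture choice are illustrative assumptions.
from pathlib import Path

import torch
from torchvision import models, transforms
from PIL import Image

# Assume a binary "infected vs. healthy" classifier was fine-tuned elsewhere
# and its weights saved to mildew_classifier.pt (placeholder file name).
model = models.resnet18(num_classes=1)
model.load_state_dict(torch.load("mildew_classifier.pt", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Score every leaf image in a folder instead of inspecting each one by hand.
for path in sorted(Path("leaf_samples").glob("*.jpg")):
    tensor = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        prob_infected = torch.sigmoid(model(tensor)).item()
    print(f"{path.name}: {prob_infected:.1%} likely infected")
```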
Three scientists on Tuesday won the Nobel Prize in Physics, including the first woman to receive the prestigious award in 55 years. The 9-million-Swedish-kronor award (about $1 million) will be shared by Arthur Ashkin of Bell Laboratories in Holmdel, N.J., honored for his invention of optical tweezers, and by Gérard Mourou of École Polytechnique in Palaiseau, France, and Donna Strickland of the University of Waterloo in Canada, honored for inventing chirped-pulse amplification, or CPA. CPA is a technique for creating ultrashort yet extremely high-energy laser pulses needed in a variety of applications. It is remarkable what can be achieved with lasers in research and in applications, and there are many good reasons for it, including their coherence, frequency stability, and controllability, but for some applications the thing that really matters is raw power. Article by Dr. Olivier Alirol, Physicist, Resonance Science Foundation Research Scientist.
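To see why CPA matters, a back-of-the-envelope calculation helps (the numbers below are illustrative assumptions, not figures from the prize citation). Peak power is roughly pulse energy divided by pulse duration, so an ultrashort pulse reaches enormous peak power; amplifying it directly would destroy the amplifier's optics. CPA therefore stretches the pulse in time, amplifies it at modest peak power, and only then recompresses it.

```latex
% Peak power is roughly pulse energy over pulse duration.
P_{\text{peak}} \approx \frac{E}{\tau}

% Illustrative numbers: a 1 J pulse compressed to 100 fs
P_{\text{peak}} \approx \frac{1\ \text{J}}{100 \times 10^{-15}\ \text{s}} = 10\ \text{TW}

% The same 1 J stretched to 1 ns during amplification stays at a far lower
% peak power, and is recompressed to ~100 fs only after the amplifier:
P_{\text{stretched}} \approx \frac{1\ \text{J}}{10^{-9}\ \text{s}} = 1\ \text{GW}
```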