
Israeli drone manufacturer Airobotics has partnered with Israeli solar farm services company Solar Drone to develop and supply a unique solar panel cleaning drone system. The fully automated system will include a drone docking station for automatic battery replacement and cleaning fluid replenishment, enabling the system to operate continuously.

While solar power systems are essentially maintenance-free, solar panels do require cleaning from time to time to function properly. Dirt, dust, mud, and bird droppings greatly reduce solar panel efficiency, cutting power output. Frequent cleaning is expensive and time-consuming, especially when panels are remote, difficult to access, or difficult to clean.

A new “drone-in-a-box”-type system is now being developed to do this job. A quadcopter is housed inside a weatherproof dock located near the solar panels. At regular intervals, the doors on top of the station will open, releasing the drone. The drone will then take off and fly up to the panels, using LiDAR sensors and mapping cameras for accurate positioning. Each panel will be sprayed with a cleaning fluid, and after completing the task, the drone will return to the docking station. If necessary, the robotic system will replace the discharged battery with a charged one and swap the cleaning fluid container for a full one.
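To make the cycle above concrete, here is a minimal Python sketch of the dock-launch-clean-return loop. The class names, thresholds, and per-panel consumption figures are illustrative assumptions of ours, not details of the Airobotics/Solar Drone system.

```python
from enum import Enum, auto

class DroneState(Enum):
    DOCKED = auto()
    CLEANING = auto()
    RETURNING = auto()

class CleaningDrone:
    """Toy model of the cleaning drone; all numbers are illustrative assumptions."""

    def __init__(self):
        self.battery = 100.0   # percent
        self.fluid = 100.0     # percent of the cleaning-fluid container
        self.state = DroneState.DOCKED

    def needs_service(self):
        # Return to the dock when battery or fluid runs low (arbitrary thresholds).
        return self.battery < 30.0 or self.fluid < 20.0

    def clean_panel(self):
        # Spraying one panel costs some battery and fluid (made-up figures).
        self.battery -= 5.0
        self.fluid -= 8.0

def run_cycle(drone, panels):
    """One scheduled sortie: launch, spray each panel, then return to the dock."""
    drone.state = DroneState.CLEANING
    for panel_id in panels:
        if drone.needs_service():
            break  # head back early rather than run out mid-flight
        drone.clean_panel()
        print(f"sprayed panel {panel_id} "
              f"(battery {drone.battery:.0f}%, fluid {drone.fluid:.0f}%)")
    drone.state = DroneState.RETURNING
    # Back at the dock: the station swaps the battery and fluid container if needed.
    if drone.needs_service():
        drone.battery, drone.fluid = 100.0, 100.0
    drone.state = DroneState.DOCKED

if __name__ == "__main__":
    drone = CleaningDrone()
    run_cycle(drone, panels=range(1, 13))
```

The point of the loop is the hand-off at the end: because the station can swap the battery and refill the fluid container automatically, the next sortie can start without any human intervention.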

Scientists from the Division of Mechanical Science and Engineering at Kanazawa University developed a prototype pipe maintenance robot that can unclog and repair pipes with a wide range of diameters. Using a cutting tool with multiple degrees of freedom, the machine is capable of manipulating and dissecting objects for removal. This work may be a significant step forward for the field of sewerage maintenance robots.

Various sewer pipes that are essential to building services require regular inspection, repair, and maintenance. Current robots that move inside pipes are designed primarily for visual surveying or inspection. Some robots have been developed for maintenance, but they cannot execute complicated tasks. Robots that can also clear blockages or perform complex tasks are therefore highly desirable, especially for pipes that are too narrow for humans to traverse. Now, a team of researchers at Kanazawa University has developed and tested a prototype with these capabilities. “Our robot can help civic and industrial workers by making their job much safer. It can operate in small pipes that humans either cannot access or are dangerous,” explains first author Thaelasutt Tugeumwolachot.

One of the main challenges in designing a robot of this kind is achieving a snug fit inside pipes of different sizes. Previous models could expand or contract their width by only about 60 percent. Here, the researchers used six foldable “crawler” arms arranged around the body of the robot. This adjustable locomotion mechanism allowed it to work in pipes with diameters between 15 and 31 cm, a range of over 100 percent. Another challenge is fitting a complex, robust arm mechanism into such a small space. The robot is equipped with a compact arm that enables complicated cutting movements, driven via a gear train by several motors inside the robot body.
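The “over 100 percent” figure follows directly from the quoted diameters; a quick check using the numbers above:

```python
# Adjustable range expressed as a percentage of the minimum working diameter,
# using the 15 cm and 31 cm figures quoted above.
d_min, d_max = 15, 31  # cm
expansion = (d_max - d_min) / d_min * 100
print(f"adjustable range: {expansion:.0f}%")  # about 107%, i.e. over 100 percent
```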

Human computing for labeling at scale

All this raises the question: How do you create labeled data at scale?

Manually labeling data for AI is an extremely labor-intensive process. It can take weeks or months to label even a few hundred samples this way, and accuracy suffers, particularly on niche labeling tasks. Additionally, remaining competitive requires continually updating datasets and building larger ones than competitors have.

Summary: A new machine-learning algorithm could help practitioners identify autism in children more effectively.

Source: USC

For children with autism spectrum disorder (ASD), receiving an early diagnosis can make a huge difference in improving behavior, skills and language development. But despite being one of the most common developmental disabilities, impacting 1 in 54 children in the U.S., it’s not that easy to diagnose.

More than a score of companies are pushing to be early winners in the race for self-driving taxis — robotaxis — with the potential that brings to capture the entire value chain of car transport from riders. They are all at different stages, and almost all of them want to convince the public and investors that they are far along.

To really know how far along a project is, you need the chance to look inside it and see the data only insiders see: just how well the vehicle is performing, and what it can and can't do. Most teams want to keep those details secret, though in time they will need to reveal them to convince the public, and eventually regulators, that they are ready to deploy.

Because they keep these details secret, those of us looking in from the outside can only scrape for clues. The biggest clues come when companies reach certain milestones, and when they take risks that tell us their own internal math has said it's OK to take that risk. Most teams announce successes and release videos of drives, but these offer only limited information because they can be cherry-picked. The best indicators are what they do, not what they say.

A new “common-sense” approach to computer vision enables artificial intelligence that interprets scenes more accurately than other systems do.

Computer vision systems sometimes make inferences about a scene that fly in the face of common sense. For example, if a robot were processing a scene of a dinner table, it might completely ignore a bowl that is visible to any human observer, estimate that a plate is floating above the table, or misperceive a fork to be penetrating a bowl rather than leaning against it.
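As a rough illustration of what such common-sense constraints can look like, the toy sketch below flags two of the implausible interpretations described above (a plate floating above the table, and a fork whose estimated volume overlaps a bowl) using crude axis-aligned bounding-box checks. This is an illustrative assumption of ours, not the method used by the system discussed here.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned 3D bounding box in metres; a toy stand-in for an estimated object pose."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float   # height of the bottom face above the table surface
    z_max: float

def floats_in_midair(obj: Box, support_z: float = 0.0, tol: float = 0.01) -> bool:
    # Common-sense rule 1: objects on a table should rest on it (or on another object).
    return obj.z_min > support_z + tol

def boxes_overlap(a: Box, b: Box) -> bool:
    # Common-sense rule 2 (crude proxy): rigid objects should not occupy the same volume.
    def overlap(lo1, hi1, lo2, hi2):
        return min(hi1, hi2) > max(lo1, lo2)
    return (overlap(a.x_min, a.x_max, b.x_min, b.x_max)
            and overlap(a.y_min, a.y_max, b.y_min, b.y_max)
            and overlap(a.z_min, a.z_max, b.z_min, b.z_max))

if __name__ == "__main__":
    plate = Box("plate", 0.00, 0.30, 0.00, 0.30, 0.15, 0.18)  # estimated 15 cm above the table
    fork  = Box("fork",  0.10, 0.20, 0.10, 0.20, 0.00, 0.05)
    bowl  = Box("bowl",  0.05, 0.25, 0.05, 0.25, 0.00, 0.10)
    print("plate floating:", floats_in_midair(plate))       # True -> implausible scene
    print("fork overlaps bowl:", boxes_overlap(fork, bowl))  # True -> needs closer inspection
```

A real system would reason about full 3D geometry and contact rather than bounding boxes, but the idea is the same: down-weight or reject scene interpretations that violate basic physical constraints.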

Move that computer vision system to a self-driving car and the stakes become much higher — for example, such systems have failed to detect emergency vehicles and pedestrians crossing the street.


You’re on the PRO Robots channel, and in this issue, on the eve of the New Year and Christmas, we’ve put together a selection of non-trivial gifts for you, from high-tech to simple but useful. See the top robots and gadgets you can buy right now for fun, for usefulness, or to feel like you’re in a futuristic movie. Have you started picking out presents for the New Year yet?

0:00 In this issue.
0:23 Robot vacuum cleaner ROIDMI EVE Plus.
1:13 CIRO Solar Robot Kit.
1:46 mBot robot construction kits by Makeblock.
2:30 Adeept PiCar Pro Robotics Kit.
2:50 Adeept RaspClaws Hexapod Spider Robot.
3:10 Copies of Spot and Unity robots.
3:20 Ultrasonic device for phone disinfection.
3:35 Projector for your phone.
3:54 Wireless Record Player.
4:15 Gadgets to Find Lost Things.
4:31 Compact Smart Security Camera.
4:50 Smart Change Jar.
5:11 Face Tracking Phone Holder.
5:27 Smart Garden.
5:53 Smart ring.
6:28 Smart Mug.

#prorobots #robots #robot #futuretechnologies #robotics

Tesla is allowing drivers — yes, the person behind the wheel who is ideally preoccupied with tasks such as “steering” — to play video games on its vehicles’ massive console touchscreens while driving.

“I only did it for like five seconds and then turned it off,” Tesla owner Vince Patton told The New York Times. “I’m astonished. To me, it just seems inherently dangerous.”

The feature has reportedly been available for some time. Given that the company is already facing fierce scrutiny for rolling out its still unfinished Full Self-Driving beta to customers, it’s not exactly a good look.