
Engie and Macquarie to build 150MW one-hour battery at site of shuttered Hazelwood coal generator.


French energy giant Engie and Macquarie’s Green Investment Group are to jointly fund the construction of a 150MW/150MWh big battery at the site of the now closed Hazelwood brown coal generator.

The announcement, which comes four years after Engie closed what was Australia’s dirtiest power station, continues the trend of using the sites of closed or ageing coal and gas plants to build battery storage to support the switch to 100 per cent renewables.

Construction has already begun on the Hazelwood Battery, which will be built and maintained over a 20-year period by US-based Fluence, using – for the first time in Australia – its sixth-generation Gridstack product and its AI-enabled bidding system.
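For context on what an automated bidding system is optimising, the toy sketch below dispatches a 150MW/150MWh (one-hour) battery against a hypothetical day-ahead price forecast, charging in the cheapest hour and discharging in the most expensive later hour. The numbers and the greedy one-cycle rule are assumptions for illustration only; this is not Fluence's bidding algorithm.

```python
# Toy battery arbitrage dispatch for illustration only; not Fluence's bidding system.

def daily_arbitrage(prices, energy_mwh=150.0, efficiency=0.9):
    """Pick one charge hour and one later discharge hour from a price forecast ($/MWh)."""
    charge_hour = min(range(len(prices) - 1), key=lambda h: prices[h])
    discharge_hour = max(range(charge_hour + 1, len(prices)), key=lambda h: prices[h])
    cost = prices[charge_hour] * energy_mwh                     # energy bought
    revenue = prices[discharge_hour] * energy_mwh * efficiency  # energy sold after losses
    return charge_hour, discharge_hour, revenue - cost

# Hypothetical price curve: cheap overnight, expensive in the evening peak.
forecast = [40, 35, 30, 28, 30, 45, 60, 80, 70, 55, 50, 48,
            47, 46, 50, 60, 90, 140, 180, 150, 100, 70, 55, 45]
print(daily_arbitrage(forecast))   # -> charge hour 3, discharge hour 18, profit in $
```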

The AMU-Bot navigates using sensors and removes weeds mechanically, without the need for chemicals. The LiDAR (light detection and ranging) scanners installed on the weed killer continuously emit laser pulses as the vehicle moves, which are reflected by objects in the surrounding area. This produces a 3D point cloud of the environment, which the mobile weed killer uses to find its way and to determine the position of plants or trees. “AMU-Bot is not yet able to classify all plants; however, it can recognize crops such as trees and shrubs in the rows of the tree nursery cultivations,” said team leader Kevin Bregler.
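For a concrete picture of that pipeline, the sketch below shows one way scan data could be turned into a point cloud and clustered to locate trunks or shrubs. It is purely illustrative: the 2D simplification, the DBSCAN clustering step and every function name are assumptions, not part of the actual AMU-Bot or Bosch navigation software.

```python
# Illustrative only: turn simulated 2D LiDAR scans into a point cloud and
# cluster it to locate plants. This is not the AMU-Bot's actual software.
import numpy as np
from sklearn.cluster import DBSCAN

def scan_to_points(ranges, angles, pose, max_range=8.0):
    """Convert one scan (ranges in m, angles in rad) into world-frame XY points."""
    keep = ranges < max_range                      # drop "no return" readings
    r, a = ranges[keep], angles[keep]
    x0, y0, heading = pose
    return np.column_stack([x0 + r * np.cos(a + heading),
                            y0 + r * np.sin(a + heading)])

def locate_plants(point_cloud, eps=0.3, min_points=5):
    """Treat each dense cluster in the accumulated cloud as a plant or trunk."""
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(point_cloud)
    return np.array([point_cloud[labels == k].mean(axis=0)
                     for k in set(labels) if k != -1])

# Toy usage: two scans taken as the vehicle drives forward along a row.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
ranges = np.full_like(angles, 10.0)                # mostly empty space
ranges[85:96] = 1.5                                # a trunk roughly straight ahead
cloud = np.vstack([scan_to_points(ranges, angles, pose=(0.0, 0.0, 0.0)),
                   scan_to_points(ranges, angles, pose=(0.2, 0.0, 0.0))])
print(locate_plants(cloud))                        # approximate trunk position
```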

The weeds in the spaces between the plants or trees are also reliably eliminated. To do this, the manipulator moves into the gaps between the crops. The weeds do not need to be collected separately and are left on the ground to dry out. Thanks to its caterpillar drive, the self-driving weed killer moves along the ground with ease and is extremely stable. Even holes in the ground created when saplings are removed do not pose a problem for AMU-Bot. The AMU-Bot platform is economical, robust, easy to use, and at the same time highly efficient.

The project is funded by the German Federal Office of Agriculture and Food. The AMU-Bot platform relies on the ingenious interaction of three sophisticated modules: caterpillar vehicle, navigation system, and manipulator. Bosch is responsible for the navigation and the sensor system, while KommTek developed the caterpillar drive. The Fraunhofer IPA designed the height-adjustable manipulator, including rotary harrows, and was responsible for overall coordination.

Eureka Robotics, a tech spin-off from Nanyang Technological University, Singapore (NTU Singapore), has developed a technology, called Dynamis, that makes industrial robots nimbler and almost as sensitive as human hands, able to manipulate tiny glass lenses, electronics components, or engine gears that are just millimeters in size without damaging them.

This proprietary force feedback technology developed by NTU scientists was previously demonstrated by the Ikea Bot, which assembled an Ikea chair in just 20 minutes. The breakthrough was first published in Science Robotics in 2018 and went viral online after the robot matched the dexterity of human hands in assembling furniture.
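As a rough illustration of what force feedback adds in this kind of delicate handling, the sketch below implements a generic admittance-style control loop: the measured contact force is compared with a small target force and the commanded velocity shrinks as contact builds, so a fragile part is never pressed harder than intended. The gains, loop rate and the robot/sensor interfaces are hypothetical; this is not the Dynamis software.

```python
# Generic admittance-style force control sketch; not Eureka Robotics' Dynamis code.
# The robot and sensor objects passed in are hypothetical placeholders.
import time

class AdmittanceController:
    def __init__(self, target_force_n=1.0, compliance=0.002, max_speed=0.01):
        self.target = target_force_n   # desired contact force, newtons
        self.compliance = compliance   # m/s of velocity per newton of force error
        self.max_speed = max_speed     # safety cap on commanded speed, m/s

    def step(self, measured_force_n):
        """Return a vertical velocity command from the latest force reading."""
        error = self.target - measured_force_n        # positive -> press a little harder
        velocity = self.compliance * error
        return max(-self.max_speed, min(self.max_speed, velocity))

def gentle_insert(robot, sensor, controller, duration_s=5.0, dt=0.01):
    """Lower the tool until the target contact force is reached and held."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        force = sensor.read_z_force()                  # hypothetical sensor call
        robot.move_z_velocity(controller.step(force))  # hypothetical robot call
        time.sleep(dt)
```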

NTU Associate Professor Pham Quang Cuong, co-founder of Eureka Robotics, said they have since upgraded the software technology, which will be made available for a large number of industrial robots worldwide by Denso Wave, a market leader in industrial robots that is part of the Toyota Group.

OAKLAND/LOS ANGELES, Calif., Dec 2 – Andy Chanley, the afternoon drive host at Southern California’s public radio station 88.5 KCSN, has been a radio DJ for over 32 years. And now, thanks to artificial intelligence technology, his voice will live on simultaneously in many places.

“I may be a robot, but I still love to rock,” says the robot DJ named ANDY, derived from Artificial Neural Disk-JockeY, in Chanley’s voice, during a demonstration for Reuters where the voice was hard to distinguish from a human DJ.

Our phones, speakers and rice cookers have been talking to us for years, but their voices have been robotic. Seattle-based AI startup WellSaid Labs says it has finessed the technology to create more than 50 real human voice avatars like ANDY so far; a producer simply types in text to create the narration.

The US scientists who created the first living robots say the life forms, known as xenobots, can now reproduce — and in a way not seen in plants and animals.

Formed from the stem cells of the African clawed frog (Xenopus laevis), from which they take their name, xenobots are less than a millimeter (0.04 inches) wide. The tiny blobs were first unveiled in 2020 after experiments showed that they could move, work together in groups and self-heal.

Now the scientists who developed them at the University of Vermont, Tufts University and Harvard University’s Wyss Institute for Biologically Inspired Engineering said they have discovered an entirely new form of biological reproduction, different from any animal or plant known to science.

Joshua Bongard, a computer scientist at the University of Vermont who co-authored the study, said they found that the xenobots, which were initially sphere-shaped and made from around 3,000 cells, could replicate. But this happened rarely and only in specific circumstances. The xenobots used “kinematic replication” — a process that is known to occur at the molecular level but has never before been observed at the scale of whole cells or organisms, Bongard said.

Circa 2018 #artificialintelligence #doctor


Abstract: Online symptom checkers have significant potential to improve patient care; however, their reliability and accuracy remain variable. We hypothesised that an artificial intelligence (AI) powered triage and diagnostic system would compare favourably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of an AI-powered triage and diagnostic system. Identical cases were evaluated by both the AI system and human doctors. Differential diagnoses and triage outcomes were evaluated by an independent judge, who was blinded to the source (AI system or human doctor) of each outcome. Independently of these cases, vignettes from publicly available resources were also assessed to provide a benchmark against previous studies and the diagnostic component of the MRCGP exam. Overall, we found that the Babylon AI-powered Triage and Diagnostic System was able to identify the condition modelled by a clinical vignette with accuracy comparable to that of human doctors (in terms of precision and recall). In addition, we found that the triage advice recommended by the AI system was, on average, safer than that of human doctors, when compared against the ranges of acceptable triage provided by independent expert judges, with only a minimal reduction in appropriateness.
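The abstract's headline comparison is made "in terms of precision and recall" over differential diagnoses. A minimal sketch of that kind of scoring, assuming each vignette comes with a set of acceptable conditions and each system returns a list of candidate diagnoses, might look like the following; the data structures and matching rule are assumptions, not Babylon's published evaluation protocol.

```python
# Illustrative precision/recall scoring of a differential diagnosis against a
# vignette's gold standard; not Babylon's actual evaluation pipeline.

def precision_recall(predicted, acceptable):
    """predicted: conditions returned by the doctor or AI system.
    acceptable: set of conditions judged correct for the vignette."""
    predicted_set = set(predicted)
    true_positives = len(predicted_set & acceptable)
    precision = true_positives / len(predicted_set) if predicted_set else 0.0
    recall = true_positives / len(acceptable) if acceptable else 0.0
    return precision, recall

# Hypothetical vignette whose gold standard accepts two conditions.
gold = {"migraine", "tension headache"}
ai_differential = ["migraine", "cluster headache", "tension headache"]
doctor_differential = ["migraine"]

print(precision_recall(ai_differential, gold))      # ~ (0.67, 1.0)
print(precision_recall(doctor_differential, gold))  # (1.0, 0.5)
```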


Bridging Technology And Medicine For The Modern Healthcare Ecosystem — Dr. Mona G. Flores, MD, Global Head of Medical AI, NVIDIA.


Dr. Mona Flores, MD, is the Global Head of Medical AI at NVIDIA (https://blogs.nvidia.com/blog/author/monaflores/), the American multinational technology company, where she oversees the company’s AI initiatives in medicine and healthcare to bridge the chasm between technology and medicine.

Dr. Flores first joined NVIDIA in 2018 with a focus on developing their healthcare ecosystem. Before joining NVIDIA, she served as the chief medical officer of digital health company Human-Resolution Technologies after a 25+ year career in medicine and cardiothoracic surgery.

To say we’re at an inflection point of the technological era may be an obvious declaration to some. How the opportunities at hand and the various technologies and markets will advance is nuanced, but a common theme is emerging: the pace of innovation is moving at a rate humankind has previously seen at only rare points in history. The invention of the printing press and the ascension of the internet come to mind as similar inflection points, but current innovation trends are being driven aggressively by machine learning and artificial intelligence (AI). In fact, AI is empowering rapid technology advances in virtually all areas, from the edge and personal devices to the data center and even chip design itself.

There is also a self-perpetuating effect at play: demand for intelligent machines and automation everywhere keeps ramping up, whether you consider driver-assist technologies in the automotive industry, recommenders and speech-recognition input in phones, or smart home technologies and the IoT. Spurring this voracious demand for tech is the fact that leading-edge OEMs, from big names like Tesla and Apple to scrappy start-ups, are now beginning to realize great gains in silicon and system-level development beyond the confines of Moore’s Law alone.