
Exyn Brings Level 4 Autonomy to Drones

Fully autonomous exploration and mapping of the unknown is a cutting-edge capability for commercial drones.


Drone autonomy keeps getting more impressive, but we're reaching the point where improving on existing capabilities is significantly harder. Companies like Skydio are selling (for cheap!) commercial drones that have no problem dynamically path planning around obstacles at high speeds while tracking you, which is pretty amazing, and those same drones can also autonomously create 3D maps of structures. In both of these cases, there's a human indirectly in the loop, either saying "follow me" or "map this specific thing." In other words, the level of autonomous flight is very high, but there's still some reliance on a human for high-level planning. Which, for what Skydio is doing, is totally fine and the right way to do it.
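Skydio's onboard planner is proprietary, but grid-based graph search gives a flavor of what "dynamically path planning around obstacles" involves at its simplest. Below is a minimal A* sketch over a 2D occupancy grid; the grid, the 4-connected moves, and the Manhattan heuristic are illustrative assumptions, not Skydio's actual algorithm (which plans in 3D, in real time, from depth cameras).

```python
# Minimal A* path planner on a 2D occupancy grid (illustrative only).
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
    open_heap = [(h(start), start)]
    came_from = {start: None}
    g = {start: 0}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:  # walk parents back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
    return None  # goal unreachable

obstacle_grid = [[0, 0, 0, 0],
                 [1, 1, 1, 0],
                 [0, 0, 0, 0]]
print(astar(obstacle_grid, (0, 0), (2, 0)))  # routes around the wall row
```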

TSMC Uses AMD’s EPYC Chips to Make Chips

The silicon Ouroboros.


TSMC produces chips for AMD, but it also now uses AMD’s processors to control the equipment that it uses to make chips for AMD (and other clients too). Sounds like a weird circulation of silicon, but that’s exactly what happens behind the scenes at the world’s largest third-party foundry.

Hundreds of companies use AMD EPYC-based machines for important, sometimes business-critical, workloads. Yet when it comes to mission-critical work, Intel Xeon (and even Intel Itanium and mainframes) still rules the world. Luckily for AMD, that has begun to change: TSMC has announced that it is now using EPYC-based servers for its mission-critical fab control operations.

“For automation with the machinery inside our fab, each machine needs to have one x86 server to control the operation speed and provision of water, electricity, and gas, or power consumption,” said Simon Wang, Director of Infrastructure and Communication Services Division at TSMC.

Crew-1 Mission | Return

SpaceX and NASA are targeting Saturday, May 1 at 8:35 p.m. EDT (00:35 UTC on May 2) for Dragon to autonomously undock from the International Space Station (ISS) and splash down off the coast of Florida on Sunday, May 2 at approximately 2:57 a.m. EDT (06:57 UTC), completing its first six-month operational mission to the Station.

A series of departure burns will move Dragon away from the orbiting laboratory, after which the vehicle will jettison its trunk to shed mass and save propellant for the deorbit burn. Once that burn is complete, Dragon will re-enter Earth's atmosphere and deploy its two drogue and four main parachutes in preparation for a soft water landing.
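The return follows a fixed sequence of events, each gated on the one before it. As a purely illustrative sketch (the phase names follow the paragraph above; this is not SpaceX flight software), the ordering can be expressed as a simple state machine:

```python
# Dragon's return as an ordered sequence of phases. Illustrative only.
from enum import Enum

class Phase(Enum):
    DOCKED = 1
    DEPARTURE_BURNS = 2
    TRUNK_JETTISON = 3   # shed mass before the deorbit burn
    DEORBIT_BURN = 4
    REENTRY = 5
    DROGUE_DEPLOY = 6    # two drogue parachutes
    MAIN_DEPLOY = 7      # four main parachutes
    SPLASHDOWN = 8

def advance(current: Phase) -> Phase:
    """Step to the next phase; no phase may be skipped or reordered."""
    if current is Phase.SPLASHDOWN:
        raise ValueError("mission already complete")
    return Phase(current.value + 1)

phase = Phase.DOCKED
while phase is not Phase.SPLASHDOWN:
    phase = advance(phase)
    print(phase.name)
```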

Aboard the spacecraft will be NASA astronauts Mike Hopkins, Victor Glover, Shannon Walker, and JAXA astronaut Soichi Noguchi, who flew to the space station on Dragon six months ago when Falcon 9 launched the spacecraft from historic Launch Complex 39A (LC-39A) at Kennedy Space Center in Florida on Sunday, November 15, 2020.

Upon splashdown, the Dragon and the astronauts will be quickly recovered and returned to Cape Canaveral and Houston respectively. Once the mission is complete, Dragon will be inspected and refurbished for future human spaceflight missions.

The Army Wants to Give Its Robots Living Muscle Tissue

It’s exactly as weird as it sounds.


The U.S. Army is looking into using animal muscle tissue as a means to move robots.

The Army Research Laboratory believes its bots could use real muscle, which allows most living things to move and manipulate their environments, instead of mechanical arms, wheels, tracks, and other systems to travel across the battlefield. The concept, which some might find disturbing, is an example of the new field of “biohybrids.”

A tactile sensing foot to increase the stability of legged robots

In order to effectively navigate real-world environments, legged robots should be able to move swiftly and freely while maintaining their balance. This is particularly true for humanoid robots, robots with two legs and a human-like body structure.

Building robots that are stable on their legs while walking can be challenging. In fact, legged robots typically have unstable dynamics, due to their pendulum-like structure.
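The "pendulum-like structure" point is worth unpacking: near upright, a legged robot behaves like an inverted pendulum, whose lean angle grows exponentially unless something pushes back. The toy simulation below (parameters and feedback gains are illustrative assumptions, not the HKUST researchers' model) shows the open-loop fall and a simple PD correction:

```python
# Why legged robots are unstable: an inverted pendulum diverges without
# feedback. Parameters and gains are illustrative, not the HKUST model.
import math

g, L, dt = 9.81, 1.0, 0.001  # gravity (m/s^2), leg length (m), timestep (s)

def simulate(kp=0.0, kd=0.0, theta0=0.01, steps=3000):
    """Euler-integrate theta'' = (g/L)*sin(theta) - kp*theta - kd*theta_dot."""
    theta, theta_dot = theta0, 0.0
    for _ in range(steps):
        theta_ddot = (g / L) * math.sin(theta) - kp * theta - kd * theta_dot
        theta_dot += theta_ddot * dt
        theta += theta_dot * dt
    return theta

print(f"open loop after 3 s:   lean = {simulate():+.3f} rad (fallen over)")
print(f"PD feedback after 3 s: lean = {simulate(kp=50, kd=10):+.3f} rad (upright)")
```

A tactile foot fits into exactly this loop: better contact sensing means earlier, more accurate estimates of the lean state that the feedback terms act on.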

Researchers at Hong Kong University of Science and Technology recently developed a computer vision-based robotic foot with tactile sensing capabilities. When integrated at the end of a robot's legs, the artificial foot can improve the robot's balance and stability during locomotion.

AI Challenges For The Health IT Industry: Should We Expect Electronic Doctors?

Yes, but they won't be trusted until 2035.


Current trends in AI use in healthcare lead me to posit that this market will significantly grow in the coming years. So, should leaders in healthcare expect the emergence of a fully automated electronic physician, sonographer or surgeon as a replacement for the human healthcare professional? Can the development of AI in healthcare help overcome the difficulties the industry faces today? To figure all this out, I would like to analyze the current challenges of using AI in healthcare.

Let’s discuss two promising examples: the application of AI in diagnosis and reading images, and the use of robotic systems in surgery.

Diagnostic Robots: Accuracy And Use For Treatment Recommendations

The success of AI in diagnosis is supported by the results of its application in a number of medical studies — for example, in optical coherence tomography (OCT), which requires serious qualifications to interpret. Google’s AI-based DeepMind Health system, for instance, demonstrated 94% diagnostic accuracy across more than 50 types of eye disease in an early trial. Nevertheless, the system operates in conjunction with human experts.
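A single "accuracy" number also hides the trade-offs clinicians actually weigh. A quick illustrative calculation (the confusion-matrix counts below are invented, not DeepMind's data) shows how accuracy, sensitivity, and specificity can diverge on the same results:

```python
# Accuracy vs. sensitivity vs. specificity on a made-up confusion matrix
# (counts are invented for illustration; not DeepMind's published data).
tp, fn = 90, 10    # diseased scans: correctly flagged vs. missed
tn, fp = 850, 50   # healthy scans: correctly cleared vs. false alarms

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # overall hit rate
sensitivity = tp / (tp + fn)                   # how many sick eyes are caught
specificity = tn / (tn + fp)                   # how many healthy eyes are cleared

print(f"accuracy={accuracy:.1%}, sensitivity={sensitivity:.1%}, "
      f"specificity={specificity:.1%}")
# accuracy=94.0%, sensitivity=90.0%, specificity=94.4%
```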

Artificial Intelligence Algorithm Helps Unravel the Physics Underlying Quantum Systems

Protocol to reverse engineer Hamiltonian models advances automation of quantum devices.

Scientists from the University of Bristol’s Quantum Engineering Technology Labs (QETLabs) have developed an algorithm that provides valuable insights into the physics underlying quantum systems — paving the way for significant advances in quantum computation and sensing, and potentially turning a new page in scientific investigation.

In physics, systems of particles and their evolution are described by mathematical models, requiring the successful interplay of theoretical arguments and experimental verification. Even more complex is the description of systems of particles interacting with each other at the quantum mechanical level, which is often done using a Hamiltonian model. The process of formulating Hamiltonian models from observations is made even harder by the nature of quantum states, which collapse when attempts are made to inspect them.
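The Bristol protocol does considerably more than this (it searches over candidate model structures, not just parameters), but the core task, inferring a Hamiltonian from measurement statistics despite state collapse, can be sketched in toy form. The single-qubit model and grid search below are illustrative assumptions, not QETLabs' algorithm:

```python
# Toy Hamiltonian learning: recover a qubit's Rabi frequency from simulated
# projective measurements. Illustrative only; the QETLabs protocol also
# searches over model *structure*, not just a single parameter.
import numpy as np

true_omega = 2.0  # hidden Rabi frequency we pretend not to know

def excited_prob(omega, t):
    """P(|1>) at time t for H = (omega/2)*sigma_x, starting from |0> (Rabi formula)."""
    return np.sin(omega * t / 2.0) ** 2

times = np.linspace(0.0, 3.0, 30)
rng = np.random.default_rng(0)
# Each time point gets 200 projective shots: collapse means we only ever
# observe outcome frequencies, never the quantum state itself.
data = rng.binomial(200, excited_prob(true_omega, times)) / 200.0

# "Learning": grid-search the candidate omega that best explains the data.
candidates = np.linspace(0.1, 4.0, 400)
errors = [np.sum((excited_prob(w, times) - data) ** 2) for w in candidates]
best = candidates[int(np.argmin(errors))]
print(f"true omega = {true_omega}, recovered omega = {best:.3f}")
```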

Kleos Space develops tool for in-space manufacturing of large structures

SAN FRANCISCO — Kleos Space is conducting a six-month test of technology for in-space manufacturing of large 3D carbon fiber structures that could be used to construct solar arrays, star shades and interferometry antennas.

The company, which has operations in Luxembourg, the United States, and the United Kingdom, is best known for radio frequency reconnaissance satellites. In the background, however, Kleos has been designing and developing an in-space manufacturing technology called Futrism to robotically produce a carbon-fiber I-beam, more than 100 meters long, with embedded fiber-optic cables.

“It’s something that we have linked to our roadmap for RF, because it’s something that could deploy very large antennas for RF reconnaissance,” Kleos CEO Andy Bowyer told SpaceNews. “However, it’s useful for a whole range of other applications as well that we are very keen to work with partners on. We firmly believe that manufacturing in space is the future.”

Computer vision inches toward ‘common sense’ with Facebook’s latest research

Machine learning is capable of doing all sorts of things as long as you have the data to teach it how. That’s not always easy, and researchers are always looking for a way to add a bit of “common sense” to AI so you don’t have to show it 500 pictures of a cat before it gets it. Facebook’s newest research takes a big step toward reducing the data bottleneck.

The company’s formidable AI research division has been working for years on how to advance and scale computer vision algorithms, and it has made steady progress, generally shared with the rest of the research community. One interesting development Facebook has pursued in particular is what’s called “semi-supervised learning.”

Generally when you think of training an AI, you think of something like the aforementioned 500 pictures of cats — images that have been selected and labeled (which can mean outlining the cat, putting a box around the cat or just saying there’s a cat in there somewhere) so that the machine learning system can put together an algorithm to automate the process of cat recognition. Naturally if you want to do dogs or horses, you need 500 dog pictures, 500 horse pictures, etc. — it scales linearly, which is a word you never want to see in tech.
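Facebook's approach operates at enormous scale, but a much simpler relative, pseudo-labeling, captures the basic idea of squeezing signal out of unlabeled data. Here is a minimal scikit-learn sketch; the toy dataset, 50-label budget, and 0.95 confidence threshold are illustrative choices, not Facebook's method:

```python
# Pseudo-labeling: train on a small labeled set, then let the model label
# the unlabeled pool itself, keeping only its confident guesses. Toy data
# and thresholds are illustrative; not Facebook's (much larger) pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled, unlabeled = np.arange(50), np.arange(50, 2000)  # only 50 labels!

model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
for _ in range(5):  # a few rounds of self-training
    probs = model.predict_proba(X[unlabeled])
    confident = probs.max(axis=1) > 0.95           # keep only sure predictions
    pseudo_y = probs.argmax(axis=1)[confident]
    X_train = np.vstack([X[labeled], X[unlabeled][confident]])
    y_train = np.concatenate([y[labeled], pseudo_y])
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print(f"accuracy on the unlabeled pool: {model.score(X[unlabeled], y[unlabeled]):.1%}")
```

The appeal is exactly the scaling argument above: the 50 hand-labeled examples stay fixed while the model bootstraps from the other 1,950 for free.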
