
I know it’s an old movie (and it was an even older book before that), but I want to look at the physics of the special submarine drive in The Hunt for Red October. In the story, the Russians build a so-called “caterpillar drive” using hydro-magneto power instead of the traditional propeller. This new drive is way quieter than the traditional type—so quiet that it could sneak up on the United States and blow it up. Spoiler alert: It doesn’t.

Here is the cool part: This magnetohydrodynamic drive, which turns water into a sort of rotor, is a real thing. (Although technically in the book version this drive is something other than magnetohydrodynamic. Quibbles.) In fact, it’s pretty simple to build. All you really need is a battery, a magnet, and some wires. Oh, also this will have to operate in salt water, so you might need some salt. Here is the basic setup.
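The thrust such a drive produces follows directly from the Lorentz force: current driven through the salt water, crossed with the magnetic field, pushes the water backward. Here is a back-of-envelope sketch in Python; the current, field strength, and electrode spacing are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope Lorentz force for a DIY magnetohydrodynamic thruster.
# All numbers below are illustrative assumptions, not measured values.

def mhd_thrust(current_a: float, field_t: float, electrode_gap_m: float) -> float:
    """Force (N) on the water column: F = I * L * B, with current perpendicular to field."""
    return current_a * electrode_gap_m * field_t

# Assumed setup: 20 A driven through salt water across a 5 cm electrode gap,
# inside a 0.3 T field from a strong neodymium magnet.
force = mhd_thrust(current_a=20.0, field_t=0.3, electrode_gap_m=0.05)
print(f"Thrust: {force:.2f} N")  # 20 * 0.05 * 0.3 = 0.30 N
```

A fraction of a newton won't sneak a submarine anywhere, which is why the real appeal of the drive is silence, not power.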


If scientists could find a way to control the process for making semiconductor components on a nanometric scale, they could give those components unique electronic and optical properties—opening the door to a host of useful applications.

Researchers at the Laboratory of Microsystems, in EPFL’s School of Engineering, have taken an important step towards that goal with their discovery of semiconducting nanotubes that assemble automatically in solutions of metallic nanocrystals and certain ligands. The tubes have between three and six walls that are perfectly uniform and just a few atoms thick—making them the first such nanostructures of their kind.

What’s more, the nanotubes are photoluminescent: they can absorb light of a specific wavelength and then emit intense light of a different color, much like quantum wells. That means they could serve, for example, as catalysts in photoreduction reactions; in initial experiments they stripped the color from several organic dyes. The researchers’ findings have made the cover of ACS Central Science.
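Photoluminescence means absorbing a photon at one wavelength and re-emitting at a longer one, losing a little energy in between (a Stokes shift). The wavelengths below are illustrative, not values from the study.

```python
# Convert absorption/emission wavelengths to photon energies: E = h*c / lambda.
# Example wavelengths are assumptions for illustration only.

H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given wavelength in nanometers."""
    return H * C / (wavelength_nm * 1e-9) / EV

absorbed = photon_energy_ev(400.0)  # blue-violet excitation (assumed)
emitted = photon_energy_ev(550.0)   # green emission (assumed)
print(f"Absorbed: {absorbed:.2f} eV, emitted: {emitted:.2f} eV, "
      f"Stokes loss: {absorbed - emitted:.2f} eV")
```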

Honeywell Quantum Solutions has demonstrated record-breaking high-fidelity quantum operations on its trapped-ion qubits, a major step towards producing the world’s most powerful quantum computer. Honeywell is targeting an operational trapped-ion quantum computer by the end of 2019.

Currently the leading trapped-ion quantum computer comes from the startup IonQ. D-Wave Systems sells commercial quantum annealing systems with 2000 qubits, and Google, IBM, Intel and Rigetti have built superconducting quantum computers with 16–72 qubits.
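Why gate fidelity matters more than raw qubit counts: errors compound multiplicatively, so the rough chance a circuit of n gates runs error-free is the per-gate fidelity raised to the nth power. The fidelity values below are illustrative, not Honeywell’s published numbers.

```python
# Rough probability that an n-gate circuit executes with no gate errors.
# Fidelity values are illustrative assumptions, not published results.

def circuit_success(gate_fidelity: float, n_gates: int) -> float:
    """Approximate error-free probability for a circuit: fidelity ** n_gates."""
    return gate_fidelity ** n_gates

for f in (0.99, 0.999, 0.9999):
    print(f"fidelity {f}: 1000-gate success ~ {circuit_success(f, 1000):.4f}")
```

At 99% fidelity a 1000-gate circuit almost never succeeds, while at 99.99% it succeeds about nine times in ten, which is why each extra "nine" of fidelity is worth far more than a handful of extra qubits.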

For a robot to be able to “learn” sign language, it is necessary to combine different areas of engineering such as artificial intelligence, neural networks and artificial vision, as well as underactuated robotic hands. “One of the main new developments of this research is that we united two major areas of Robotics: complex systems (such as robotic hands) and social interaction and communication,” explains Juan Víctores, one of the researchers from the Robotics Lab in the Department of Systems Engineering and Automation of the UC3M.

The first thing the scientists did as part of their research was to specify, in simulation, the position of each phalanx needed to depict particular signs from Spanish Sign Language. They then attempted to reproduce each position with the robotic hand, trying to make the movements similar to those a human hand could make. “The objective is for them to be similar and, above all, natural. Several approaches were tested to model this adaptation, and this allowed us to choose the one that could perform the gestures in a way that is comprehensible to people who communicate with sign language,” the researchers explain.
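The simulate-then-reproduce step above can be sketched as mapping a set of target phalanx angles onto whatever range the robot hand can actually reach. Everything here is hypothetical: the joint names, target angles, and limits are made up for illustration, not taken from the UC3M system.

```python
# Sketch of adapting simulated phalanx angles to a robotic hand's joint limits.
# Joint names, angles, and limits are hypothetical illustrations.

SIGN_TARGET = {  # target angles in degrees for one hypothetical sign
    "thumb_proximal": 40, "index_proximal": 95, "index_distal": 80,
    "middle_proximal": 95, "ring_proximal": 95, "pinky_proximal": 95,
}

# Assume the underactuated hand can only flex each joint between 0 and 90 degrees.
ROBOT_LIMITS = {joint: (0, 90) for joint in SIGN_TARGET}

def adapt_to_robot(target: dict, limits: dict) -> dict:
    """Clamp each simulated phalanx angle into the robot hand's reachable range."""
    return {j: min(max(a, limits[j][0]), limits[j][1]) for j, a in target.items()}

pose = adapt_to_robot(SIGN_TARGET, ROBOT_LIMITS)
print(pose["index_proximal"])  # 95 degrees clamped to the 90-degree limit -> 90
```

The interesting research problem is exactly what simple clamping cannot solve: choosing the reachable pose that remains legible as the intended sign.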

Finally, the scientists verified that the system worked by interacting with potential end-users. “The users who have been in contact with the robot have reported 80 percent satisfaction, so the response has been very positive,” says another of the researchers from the Robotics Lab, Jennifer J. Gago. The experiments were carried out with TEO (Task Environment Operator), a robot for home use developed in the Robotics Lab of the UC3M.

A vegetable-picking robot that uses machine learning to identify and harvest a commonplace, but challenging, agricultural crop has been developed by engineers.

The ‘Vegebot’, developed by a team at the University of Cambridge, was initially trained to recognise and harvest iceberg lettuce in a lab setting. It has now been successfully tested in a variety of field conditions in cooperation with G’s Growers, a local fruit and vegetable co-operative.

Although the prototype is nowhere near as fast or efficient as a human worker, it demonstrates how the use of robotics in agriculture might be expanded, even for crops like iceberg lettuce which are particularly challenging to harvest mechanically. The results are published in The Journal of Field Robotics.
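The harvest decision the article implies has two stages: first localize a lettuce in the camera image, then judge whether it is healthy and mature enough to cut. A minimal sketch of that gating logic, with threshold values that are pure assumptions rather than anything from the Cambridge system:

```python
# Sketch of a two-stage harvest decision: detect, then judge maturity.
# Both thresholds are assumed values for illustration only.

def should_harvest(detection_conf: float, maturity_score: float,
                   conf_thresh: float = 0.8, maturity_thresh: float = 0.7) -> bool:
    """Cut only when the detector is confident AND the crop looks ready."""
    return detection_conf >= conf_thresh and maturity_score >= maturity_thresh

print(should_harvest(0.92, 0.75))  # True: confident detection, mature crop
print(should_harvest(0.92, 0.40))  # False: detected, but not ready to cut
```

Iceberg lettuce is hard precisely because the second stage is unforgiving: heads mature unevenly and grow flat to the ground, so a wrong "ready" call wastes the crop.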