News-Medical speaks to David Dambman from Biosero about the emerging importance of automation in scientific research and why centralized scheduling software is an essential first step for any laboratory looking to automate its workflows.

Why has automation become so critical to advancing scientific research?

There are many reasons why automation is useful in scientific research. First and foremost, automation is about being able to walk away from your experiments and spend time analyzing your results, rather than carrying out mundane tasks such as transferring liquids from one plate to another.


Training bigger neural networks can be challenging when faced with accelerator memory limits. The datasets used by machine learning models today are very large; a standard image classification dataset such as hashtagged Instagram photos contains millions of images, and as image quality increases, so does the memory required to process them. Yet the memory available on even a high-end NVIDIA GPU today is only 32 GB.

Training therefore involves a trade-off between the memory allocated to a model's parameters and the memory needed to store its activations, and it is easy to see why the accelerator's memory limit is so quickly reached.
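One standard way around this limit is activation checkpointing: keep activations only at a few layer boundaries and recompute the rest during the backward pass, trading extra compute for memory. A minimal sketch using PyTorch's torch.utils.checkpoint.checkpoint_sequential follows; the layer sizes and segment count are illustrative, not taken from the article.

```python
# Minimal sketch of activation checkpointing in PyTorch; sizes are
# hypothetical and chosen only to illustrate the memory/compute trade-off.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

# A deep stack of layers whose stored activations would dominate memory.
model = nn.Sequential(
    *[nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()) for _ in range(16)]
)
x = torch.randn(32, 1024, requires_grad=True)

# Split the stack into 4 segments: only activations at segment boundaries
# are kept; the rest are recomputed during the backward pass.
out = checkpoint_sequential(model, 4, x)
out.sum().backward()  # memory saved at the cost of extra forward compute
```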


Intel and others are investing $13 million in Untether AI, a startup that’s working on a novel type of chip for artificial intelligence that promises to perform neural-network calculations at warp speed.

Speedup: Untether, based in Toronto, Canada, has already developed a prototype device that transfers data between different parts of the chip 1,000 times more quickly than a conventional AI chip. That’s an impressive achievement, but it should be treated cautiously since the prototype is far larger than an actual chip—and because other factors will contribute to the overall performance of the finished device.

Bottleneck: One of the key challenges with modern chips is shuttling data from memory to the units used to perform logical operations. This is especially problematic as the amount of data that chips need to process increases, as is the case with AI applications such as face or voice recognition. Untether uses what’s known as “near-memory computing” to reduce the physical distance between memory and the processing tasks, which speeds up data transfer and lowers power consumption.
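A rough host-side illustration of this bottleneck (generic, not a model of Untether's hardware): an operation that touches each byte of data only once is limited by memory bandwidth, while one that reuses data heavily is limited by arithmetic throughput. Near-memory computing attacks the first case by shortening the distance the bytes travel.

```python
# Generic demonstration of memory-bound vs. compute-bound work; it does not
# model Untether's chip, only the data-movement bottleneck described above.
import time
import numpy as np

a = np.random.rand(4096, 4096)
b = np.random.rand(4096, 4096)

t0 = time.perf_counter()
c = a + b          # one arithmetic op per element: bandwidth-limited
t1 = time.perf_counter()
d = a @ b          # thousands of ops per element: compute-limited
t2 = time.perf_counter()

# The add performs far fewer operations yet runs nowhere near
# proportionally faster, because its time goes into moving data
# between memory and the arithmetic units.
print(f"elementwise add: {t1 - t0:.3f} s")
print(f"matrix multiply: {t2 - t1:.3f} s")
```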


On its final flyby of Saturn’s largest moon in 2017, NASA’s Cassini spacecraft gathered radar data revealing that the small liquid lakes in Titan’s northern hemisphere are surprisingly deep, perched atop hills and filled with methane.

The new findings, published April 15 in Nature Astronomy, are the first confirmation of just how deep some of Titan’s lakes are (more than 300 feet, or 100 meters) and of their composition. They provide new information about the way liquid methane rains on, evaporates from and seeps into Titan—the only planetary body in our solar system other than Earth known to have stable liquid on its surface.

Scientists have known that Titan’s hydrologic cycle works similarly to Earth’s—with one major difference. Instead of water evaporating from seas, forming clouds and rain, Titan does it all with methane and ethane. We tend to think of these hydrocarbons as a gas on Earth, unless they’re pressurized in a tank. But Titan is so cold that they behave as liquids, like gasoline at room temperature on our planet.
