
Computer chip designers, materials scientists, biologists and other researchers now have an unprecedented level of access to the world of nanoscale materials, thanks to 3D visualization software that connects directly to an electron microscope. The software enables researchers to see and manipulate 3D visualizations of nanomaterials in real time.

Developed by a University of Michigan-led team of engineers and software developers, the capabilities are included in a new beta version of tomviz, an open-source 3D data visualization tool that’s already used by tens of thousands of researchers. The new version reinvents the visualization process, making it possible to go from microscope samples to 3D visualizations in minutes instead of days.

In addition to generating results more quickly, the new capabilities enable researchers to see and manipulate 3D visualizations during an ongoing experiment. That could dramatically speed research in fields like microprocessors, electric vehicle batteries, lightweight materials and many others.

A paradigm shift away from the 3D mathematical model that Schrödinger and others developed to describe how we see color could result in more vibrant computer displays, TVs, textiles, printed materials, and more.

New research corrects a significant error in the 3D mathematical space developed by the Nobel Prize-winning physicist Erwin Schrödinger and others to describe how your eye distinguishes one color from another. This incorrect model has been used by scientists and industry for more than 100 years. The study has the potential to boost scientific data visualizations, improve televisions, and recalibrate the textile and paint industries.

“The assumed shape of color space requires a paradigm shift,” said Roxana Bujack, a computer scientist with a background in mathematics who creates scientific visualizations at Los Alamos National Laboratory. Bujack is lead author of the paper on the mathematics of color perception by a Los Alamos team. It was published in the Proceedings of the National Academy of Sciences.
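The "shape" at issue is the metric structure of color space. The century-old model treats perceptual color space as Riemannian, so along a straight path the perceived difference between two colors equals the sum of the small differences between intermediate steps. A minimal sketch of that assumption, using CIELAB coordinates and the classic CIE76 ΔE (a plain Euclidean distance) as the metric (these are standard choices, not specifics from the article):

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colors (the CIE76 color-difference formula)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two colors in CIELAB (L*, a*, b*) and the midpoint of the straight line between them.
c1 = (50.0, 10.0, 10.0)
c2 = (50.0, 50.0, 10.0)
mid = tuple((a + b) / 2 for a, b in zip(c1, c2))

# In a Euclidean metric (and along geodesics in any Riemannian one),
# distances are additive: d(c1, c2) == d(c1, mid) + d(mid, c2).
full = delta_e_cie76(c1, c2)
halves = delta_e_cie76(c1, mid) + delta_e_cie76(mid, c2)
assert math.isclose(full, halves)
```

The Los Alamos study reports that human judgments of large color differences violate exactly this additivity: a big jump is perceived as smaller than the sum of the small steps along the way, so no Riemannian metric can reproduce how we actually perceive color differences.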

The idea of conversing with mourners at your own funeral may sound like the plot from the latest episode of Black Mirror.

But it could become a reality, thanks to a Los Angeles-based startup, which has developed a ‘holographic conversational video experience’.

StoryFile creates a digital clone of the subject by using 20 synchronised cameras to record them answering a series of questions.

The U.S. Department of Energy (DOE) has published a request for information from computer hardware and software vendors to assist in the planning, design, and commissioning of next-generation supercomputing systems.

The DOE request calls for computing systems in the 2025–2030 timeframe that are five to 10 times faster than those currently available and/or able to perform more complex applications in “data science, artificial intelligence, edge deployments at facilities, and science ecosystem problems, in addition to traditional modelling and simulation applications.”

U.S. and Slovakia-based company Tachyum has responded with a proposal for a 20 exaFLOP system. It would be based on Prodigy, the company's flagship product, described as the world's first "universal" processor. According to Tachyum, the chip integrates 128 64-bit compute cores running at 5.7 GHz, combining the functionality of a CPU, GPU, and TPU in a single device with a homogeneous architecture. This allows Prodigy to deliver up to 4x the performance of the highest-performing x86 processors for cloud workloads, 3x that of the highest-performing GPU for HPC, and 6x for AI applications.
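A back-of-envelope sketch of how the quoted chip figures relate to a 20 exaFLOP target. The cores-per-chip and clock numbers come from the article; the FLOPs-per-core-per-cycle figure is a labeled placeholder assumption (real throughput varies widely with vector width, precision, and fused multiply-add support), so the result only illustrates the scaling arithmetic, not Tachyum's actual system size:

```python
cores_per_chip = 128   # 64-bit compute cores per Prodigy chip (from the article)
clock_hz = 5.7e9       # 5.7 GHz clock (from the article)

# NOT given in the article: assume 64 FLOPs per core per cycle as a
# placeholder (e.g. wide vector units with fused multiply-add).
flops_per_core_per_cycle = 64

peak_flops_per_chip = cores_per_chip * clock_hz * flops_per_core_per_cycle
target_flops = 20e18   # the proposed 20 exaFLOP system

chips_needed = target_flops / peak_flops_per_chip
print(f"assumed peak per chip: {peak_flops_per_chip / 1e12:.1f} TFLOPS")
print(f"chips for 20 EFLOPS:   {chips_needed:,.0f}")
```

Under this placeholder assumption each chip peaks at roughly 47 TFLOPS, so a 20 exaFLOP machine would need on the order of hundreds of thousands of chips; the exercise mainly shows why per-chip throughput, not core count alone, dominates exascale system sizing.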