
In a study published in Cell Reports Physical Science (“Electro-Active Polymer Hydrogels Exhibit Emergent Memory When Embodied in a Simulated Game-Environment”), a team led by Dr Yoshikatsu Hayashi demonstrated that a simple hydrogel — a type of soft, flexible material — can learn to play the 1970s computer game ‘Pong’. Interfaced with a computer simulation of the classic game via a custom-built multi-electrode array, the hydrogel showed improved performance over time.

Dr Hayashi, a biomedical engineer at the University of Reading’s School of Biological Sciences, said: “Our research shows that even very simple materials can exhibit complex, adaptive behaviours typically associated with living systems or sophisticated AI.

“This opens up exciting possibilities for developing new types of ‘smart’ materials that can learn and adapt to their environment.”
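The closed-loop setup described in the study, in which the game state is delivered to the gel as electrical stimulation and the gel’s response is read back to move the paddle, can be pictured with a toy simulation. The Python sketch below is purely illustrative and is not the authors’ model or apparatus: the hydrogel is stood in for by a hypothetical row of leaky accumulators whose values drift toward recent stimulation, giving the paddle a smoothed estimate of where the ball has been.

import random

# Toy closed-loop "embodied Pong" sketch (hypothetical; not the paper's model).
# A 1-D ball bounces between two walls. The "medium" is a row of N_CELLS leaky
# accumulators: each step, the cell nearest the ball receives a stimulus and all
# cells decay. The paddle follows the accumulators' centre of mass, i.e. a
# smoothed short-term record of recent ball positions.

N_CELLS = 8      # hypothetical stimulation/readout sites
LEAK = 0.9       # per-step decay of accumulated "charge"
GAIN = 0.15      # increment applied to the stimulated cell
STEPS = 5000

def run() -> None:
    cells = [0.0] * N_CELLS
    ball, vel = random.random(), 0.013
    hits = misses = 0

    for step in range(STEPS):
        # Ball dynamics with reflection at the walls.
        ball += vel
        if ball <= 0.0 or ball >= 1.0:
            vel = -vel
            ball = min(max(ball, 0.0), 1.0)

        # Stimulate the cell nearest the ball; let every cell leak.
        cells = [c * LEAK for c in cells]
        cells[min(int(ball * N_CELLS), N_CELLS - 1)] += GAIN

        # Paddle position = centre of mass of the accumulated charge.
        total = sum(cells) or 1.0
        paddle = sum((i + 0.5) / N_CELLS * c for i, c in enumerate(cells)) / total

        # Score a rally every 100 steps: a hit if the paddle is near the ball.
        if step % 100 == 99:
            if abs(paddle - ball) < 1.0 / N_CELLS:
                hits += 1
            else:
                misses += 1

    print(f"rallies returned: {hits}/{hits + misses}")

if __name__ == "__main__":
    run()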

While large language models (LLMs) have demonstrated remarkable capabilities in extracting data and generating connected responses, there are real questions about how these artificial intelligence (AI) models reach their answers. At stake are the potential for unwanted bias and the generation of nonsensical or inaccurate “hallucinations,” both of which can lead to false data.

That’s why SMU researchers Corey Clark and Steph Buongiorno are presenting a paper at the upcoming IEEE Conference on Games, scheduled for August 5–8 in Milan, Italy. There, they will share GAME-KG, a framework they created whose name stands for “Gaming for Augmenting Metadata and Enhancing Knowledge Graphs.”

The research is published on the arXiv preprint server.
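The framework’s name suggests the general idea: using games to gather metadata that augments a knowledge graph. The exact schema and pipeline are detailed in the researchers’ paper; the Python snippet below is only an illustrative sketch of that general pattern, storing a graph as subject–relation–object triples and attaching hypothetical provenance fields (such as a player ID and a confidence score) to each edge.

from dataclasses import dataclass, field

# Illustrative only: a minimal triple store with per-edge provenance metadata,
# sketching the general pattern of enriching a knowledge graph with
# annotations gathered through gameplay. Field names here
# (source, player_id, confidence) are hypothetical, not GAME-KG's actual schema.

@dataclass
class Edge:
    subject: str
    relation: str
    obj: str
    metadata: dict = field(default_factory=dict)

class KnowledgeGraph:
    def __init__(self):
        self.edges: list[Edge] = []

    def add_triple(self, subject, relation, obj, **metadata):
        self.edges.append(Edge(subject, relation, obj, dict(metadata)))

    def neighbors(self, subject):
        return [e for e in self.edges if e.subject == subject]

kg = KnowledgeGraph()
# An LLM-extracted triple, later annotated by a player inside a game task.
kg.add_triple("Ada Lovelace", "collaborated_with", "Charles Babbage",
              source="llm_extraction", player_id="p042", confidence=0.9)

for edge in kg.neighbors("Ada Lovelace"):
    print(edge.relation, edge.obj, edge.metadata)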

Recent advances in the field of artificial intelligence (AI) and computing have enabled the development of new tools for creating highly realistic media, virtual reality (VR) environments and video games. Many of these tools are now widely used by graphic designers, animated film creators and video game developers worldwide.

One aspect of virtual and digitally created environments that can be difficult to reproduce realistically is fabric. While various computational tools already exist for digitally designing realistic fabric-based items (e.g., scarves, blankets, pillows, clothes), creating and editing realistic renderings of these fabrics in real time can be challenging.

Researchers at Shandong University and Nanjing University recently introduced a lightweight artificial neural network for the real-time rendering of woven fabrics. Their network, presented in a paper published as part of the Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH) Conference Papers ‘24, works by encoding a fabric’s patterns and parameters as a small latent vector, which a decoder later interprets to produce realistic representations of various fabrics.
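In other words, a fabric’s pattern and material parameters are compressed once into a compact latent code, and a small decoder is then queried at render time to recover reflectance. The Python sketch below only illustrates that encoder–decoder split; the layer sizes, the 8-dimensional latent, and the (view direction, light direction) query inputs are placeholder assumptions, not the architecture from the paper.

import torch
import torch.nn as nn

# Schematic encoder-decoder for neural fabric rendering (placeholder sizes,
# not the paper's architecture). The encoder maps a fabric's pattern/material
# parameters to a small latent vector; the decoder is queried per shading point
# with that latent plus view and light directions to predict RGB reflectance.

LATENT_DIM = 8   # assumed small latent size

class FabricEncoder(nn.Module):
    def __init__(self, param_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(param_dim, 64), nn.ReLU(),
            nn.Linear(64, LATENT_DIM),
        )

    def forward(self, fabric_params: torch.Tensor) -> torch.Tensor:
        return self.net(fabric_params)          # (batch, LATENT_DIM)

class FabricDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Query = latent code + view direction (3) + light direction (3).
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + 6, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 3),                   # RGB reflectance
        )

    def forward(self, latent, view_dir, light_dir):
        return self.net(torch.cat([latent, view_dir, light_dir], dim=-1))

# Usage sketch: encode once per fabric, then decode per shading query.
encoder, decoder = FabricEncoder(), FabricDecoder()
fabric_params = torch.randn(1, 32)              # stand-in woven-pattern descriptor
latent = encoder(fabric_params)
rgb = decoder(latent, torch.randn(1, 3), torch.randn(1, 3))
print(rgb.shape)                                # torch.Size([1, 3])

Because the latent code can be computed once per fabric and reused across frames, only the small decoder needs to run per shading query, which is what makes this kind of split plausible for real-time rendering.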