Jun 2, 2021
Intel’s image-enhancing AI is a step forward for photorealistic game engines
Posted by Genevieve Klien in categories: entertainment, robotics/AI
For reference, we can go back to the HRNet paper. The researchers used a dedicated Nvidia V100, a massive and extremely expensive GPU designed for deep learning workloads. With no memory limitations and no hindrance from other in-game computations, the V100's inference time was 150 milliseconds per input frame, or roughly 7 frames per second, not nearly enough for smooth gameplay.
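As a rough illustration (not from the article itself), here is a minimal Python sketch of the arithmetic behind that claim: converting a per-frame latency into frames per second, with the reported 150 ms figure compared against the usual 30 fps and 60 fps frame-time budgets.

```python
# Minimal sketch: per-frame latency (ms) -> frames per second.
# The 150 ms value is the V100 inference time quoted above for HRNet;
# the 33.3 ms and 16.7 ms budgets correspond to 30 fps and 60 fps gameplay.

def fps_from_latency_ms(latency_ms: float) -> float:
    """Convert a per-frame processing time in milliseconds to frames per second."""
    return 1000.0 / latency_ms

if __name__ == "__main__":
    cases = [
        ("HRNet on V100 (reported)", 150.0),
        ("30 fps budget", 1000.0 / 30),
        ("60 fps budget", 1000.0 / 60),
    ]
    for label, latency in cases:
        print(f"{label}: {latency:.1f} ms/frame -> {fps_from_latency_ms(latency):.1f} fps")
```

Running it shows 150 ms per frame works out to about 6.7 fps, which is where the "~7 fps" figure comes from.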
Developing and training neural networks
Continue reading “Intel’s image-enhancing AI is a step forward for photorealistic game engines” »