
Siren is a digital human that showcases the potential of 3D/4D scanning technology for digitizing human appearance and motion with unprecedented realism.

Epic Games is the developer of the popular Unreal Engine. The company recently launched the highly anticipated MetaHuman Creator, a tool for building high-fidelity digital humans for applications including video games, film and more. MetaHuman Creator is powered by advanced motion capture and rendering technology, enabling lifelike characters that can be customized and animated for a wide range of uses.

The advancement of digital humans is disrupting our world in profound and unpredictable ways. As they become increasingly intelligent, empathetic and capable, they will reshape businesses, society and human relationships at every level.

Passengers of Singapore Airlines can now stay connected to free internet at an altitude of 12,000 meters.

In simpler times, during a flight one could switch off one's phone, read a good or a bad book, enjoy a glass of questionable wine, watch a movie in a different language on the in-flight entertainment system, or simply nod off. One could even dare to converse with a fellow passenger (gasp).




But we live in a digitally demanding age, and while many would find it convenient to stay connected to the world outside the airplane, until about five years ago flying meant going off the grid (sort of), if only for a couple of hours.

Daniel Lidar, the Viterbi Professor of Engineering at USC and Director of the USC Center for Quantum Information Science & Technology, and Dr. Bibek Pokharel, a Research Scientist at IBM Quantum, have demonstrated a quantum speedup in the context of a “bitstring guessing game.” They handled strings up to 26 bits long, significantly larger than previously possible, by effectively suppressing the errors typically seen at this scale. (A bit is a binary digit, either zero or one.) Their paper is published in the journal Physical Review Letters.

Quantum computers promise to solve certain problems with an advantage that grows as the problems grow in complexity. However, they are also highly prone to errors, or noise. The challenge, says Lidar, is “to obtain an advantage in the real world where today’s quantum computers are still ‘noisy.’” This noise-prone condition of current quantum computers is termed the “NISQ” (Noisy Intermediate-Scale Quantum) era, a term adapted from the RISC architecture used to describe classical computing devices. Thus, any present demonstration of quantum speed advantage necessitates noise suppression.

The more unknown variables a problem has, the harder it usually is for a computer to solve. Scholars can evaluate a computer’s performance by playing a type of game with it to see how quickly an algorithm can guess hidden information. For instance, imagine a version of the TV game Jeopardy, where contestants take turns guessing a secret word of known length, one whole word at a time. The host reveals only one correct letter for each guessed word before changing the secret word randomly.
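The guessing game described here is closely related to the Bernstein–Vazirani problem, where the “host” is a parity oracle over a secret bitstring. Below is a toy classical simulation sketching why the classical baseline scales with the string length: learning an n-bit secret takes n oracle calls classically, whereas the quantum algorithm can recover the whole string in a single call. The parity oracle and the unit-string query strategy are illustrative assumptions, not the paper’s exact experimental setup.

```python
import random

def oracle(secret: str, query: str) -> int:
    """Host's answer: the parity of secret AND query, i.e. s.x mod 2."""
    return sum(int(s) & int(q) for s, q in zip(secret, query)) % 2

def classical_solve(secret: str) -> tuple[str, int]:
    """Classical baseline: query one unit string per bit position.
    Each call reveals exactly one bit of the secret, so n bits need
    n oracle calls. (Quantumly, one oracle call suffices.)"""
    n = len(secret)
    bits, calls = [], 0
    for i in range(n):
        query = "0" * i + "1" + "0" * (n - 1 - i)  # probe position i
        bits.append(str(oracle(secret, query)))
        calls += 1
    return "".join(bits), calls

# A 26-bit secret, matching the largest strings handled in the experiment.
secret = "".join(random.choice("01") for _ in range(26))
guess, calls = classical_solve(secret)
assert guess == secret and calls == 26
```

The point of the experiment was that, without error suppression, noise erases the quantum advantage long before 26 bits; with it, the single-query quantum strategy beats this linear classical cost.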


Discussed in this episode:

Doug Casey on the Shocking 2025 Deagel Forecast… War, Population Reduction and the Collapse of the West

When gaming and AI wholly collide… it’ll hopefully look this good but sound much better.

At Computex 2023 in Taipei, Nvidia CEO Jensen Huang just gave the world a glimpse of what it might be like when gaming and AI collide — with a graphically breathtaking rendering of a cyberpunk ramen shop where you can actually talk to the proprietor.

Seriously: instead of clicking on dialogue options, the demo imagines that you could hold down a button, say something in your own voice, and get a spoken answer from a video game character. Nvidia’s calling it a “peek at the future of games.”


Generative AI technologies are revolutionizing how games are conceived, produced, and played. Game developers are exploring how these technologies impact 2D and 3D content-creation pipelines during production. Part of the excitement comes from the ability to create gaming experiences at runtime that would have been impossible using earlier solutions.

The creation of non-playable characters (NPCs) has evolved as games have become more sophisticated. The number of pre-recorded lines has grown, the number of options a player has to interact with NPCs has increased, and facial animations have become more realistic.
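The shift from pre-recorded lines to runtime generation can be sketched as follows. Everything here is a hypothetical illustration, not Nvidia’s demo: `generate_reply` is a stand-in for whatever speech/language pipeline produces the answer, and the sample lines are invented.

```python
# Classic approach: the NPC can only say lines an author wrote in advance.
PRERECORDED = {
    "hello": "Welcome to the ramen shop.",
    "menu": "Today we have shoyu and miso ramen.",
}

def scripted_npc(player_line: str) -> str:
    """Match the player's input against a fixed table of recorded lines."""
    return PRERECORDED.get(player_line.lower(), "...")

def generative_npc(player_line: str, generate_reply) -> str:
    """Generative approach: synthesize a reply at runtime from a character
    description plus the player's free-form (e.g. spoken) input.
    `generate_reply` is a placeholder for any text-generation backend."""
    context = "You are a ramen shop owner in a cyberpunk city."
    return generate_reply(context, player_line)
```

The design difference is that the scripted NPC’s response space is fixed at authoring time, while the generative NPC’s response space is open-ended, which is what makes the push-to-talk interaction in the demo possible.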

The company is also focusing on advertising and its core segment of gaming.

Chipmaker Nvidia has unveiled a slew of artificial intelligence (AI) products in its bid to stay ahead of the game and join the trillion-dollar valuation club alongside the likes of Apple, Microsoft, and Amazon. The announcement comes on the heels of a market rally in Nvidia stock, which rose over 25 percent last week.

Once known for making chips for gaming geeks, Nvidia is now at the core of the AI frenzy that has gripped the world, as its graphics processing units (GPUs) have become a critical component of today’s AI tools. The company’s A100 and H100 chips became household names after tools like ChatGPT took off last year.

A group of computer scientists from the University of Toronto wants to make it easier to film how-to videos.

The team of researchers has developed Stargazer, an interactive robot that helps university instructors and other content creators produce engaging tutorial videos demonstrating physical skills.

For those without access to a cameraperson, Stargazer can capture dynamic instructional videos and address the constraints of working with static cameras.