Three billionaires are locked in a race to reach space, but some future visions look more promising than others.
The growing population of avatars that use AI smarts to interact with us is a major clue.
In the fictional worlds of film and TV, artificial intelligence has been depicted as so advanced that it is indistinguishable from humans. But what if we’re actually getting closer to a world where AI is capable of thinking and feeling?
Tech company UneeQ is embarking on that journey with its “digital humans.” These avatars act as visual interfaces for customer service chatbots, virtual assistants, and other applications. UneeQ’s digital humans appear lifelike not only in terms of language and tone of voice, but also because of facial movements: raised eyebrows, a tilt of the head, a smile, even a wink. They transform a transaction into an interaction: creepy yet astonishing, human, but not quite.
What lies beneath UneeQ’s digital humans? Their 3D faces are modeled on actual human features. Speech recognition enables the avatar to understand what a person is saying, and natural language processing is used to craft a response. Before the avatar utters a word, specific emotions and facial expressions are encoded within the response.
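UneeQ's stack is proprietary, but the flow described above — transcribe what the person said, craft a reply, then attach emotion and facial cues before the avatar speaks — follows a common pattern. Here is a minimal sketch in Python; every name (`AvatarResponse`, `respond`, the emotion labels) is hypothetical, and a canned rule stands in for the real dialogue model:

```python
from dataclasses import dataclass

@dataclass
class AvatarResponse:
    text: str          # what the avatar will say
    emotion: str       # tone tag, e.g. "friendly", "apologetic"
    expression: str    # facial cue for the 3D renderer to play

def respond(transcript: str) -> AvatarResponse:
    """Turn a speech-recognition transcript into a reply annotated
    with emotion and expression, before any audio is rendered.
    A real system would call an NLP/dialogue model here."""
    if "refund" in transcript.lower():
        return AvatarResponse("I'm sorry about that. Let me help.",
                              emotion="apologetic",
                              expression="raised_eyebrows")
    return AvatarResponse("Happy to help! What can I do for you?",
                          emotion="friendly",
                          expression="smile")

print(respond("I want a refund").emotion)  # apologetic
```

The point of the structure is that emotion and expression travel with the text, so the face animation and the spoken line stay in sync.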
The scientists found key differences among the FRBs: some were one-off bursts, while others repeated rapidly, according to CNN. That led them to believe the two categories are given off by fundamentally different cosmic sources, they said in research presented Wednesday at a meeting of the American Astronomical Society. The next step, of course, is to figure out what those sources actually are.
Thanks to just a year's worth of observations that greatly expanded the known number of FRBs, the scientists now have much more to work with as they try to figure out what's causing them. The new data also highlight that FRBs, once thought to be rare occurrences, appear to be common phenomena in the grand scheme of things.
“That’s kind of the beautiful thing about this field — FRBs are really hard to see, but they’re not uncommon,” MIT physicist and CHIME member Kiyoshi Masui said in a press release. “If your eyes could see radio flashes the way you can see camera flashes, you would see them all the time if you just looked up.”
Stefan Thomas really could have used a quantum computer this year.
The German-born programmer and crypto trader forgot the password to unlock his digital wallet, which contains 7,002 bitcoin, now worth $265 million. Quantum computers, which could one day solve certain problems millions of times faster than traditional computers, might have helped him crack the code.
Though quantum computing is still very much in its infancy, governments and private-sector companies such as Microsoft and Google are working to make it a reality. Within a decade, quantum computers could be powerful enough to break the cryptographic security that protects cell phones, bank accounts, email addresses and — yes — bitcoin wallets.
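To make the threat concrete: for brute-force search, a quantum computer running Grover's algorithm needs only on the order of the square root of the number of candidates a classical machine must try. A small illustration (the function names are mine, and this only counts queries, not real-world runtime):

```python
import math

def classical_tries(alphabet_size: int, length: int) -> int:
    """Worst-case candidates a classical brute-force search checks
    for a password of the given length over the given alphabet."""
    return alphabet_size ** length

def grover_queries(alphabet_size: int, length: int) -> int:
    """Approximate oracle queries Grover's algorithm needs: ~sqrt(N)."""
    return math.isqrt(alphabet_size ** length)

# A 10-character alphanumeric password (62 symbols: a-z, A-Z, 0-9).
print(classical_tries(62, 10))  # ~8.4e17 candidates classically
print(grover_queries(62, 10))   # ~9.2e8 quantum queries
```

Breaking the public-key cryptography behind bitcoin wallets would rely on Shor's algorithm instead, which offers an even more dramatic speedup for factoring and discrete logarithms.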
Scientists using sound recordings from underwater nuclear bomb detectors have discovered the distinct song of a previously unknown population of pygmy blue whales in the Indian Ocean.
Summary: A new deep neural network can accurately predict a healthy person’s brain age based on EEG data collected from a sleep study.
Source: AASM
A study shows that a deep neural network model can accurately predict the brain age of healthy patients based on electroencephalogram data recorded during an overnight sleep study, and EEG-predicted brain age indices display unique characteristics within populations with different diseases.
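A "brain age index" of this kind is typically the gap between the model's EEG-predicted brain age and the patient's chronological age; the trained network itself is not public, so a placeholder stands in for it here:

```python
def brain_age_index(predicted_age: float, chronological_age: float) -> float:
    """Difference between EEG-predicted brain age and actual age.
    Positive values suggest the brain 'looks older' on EEG than
    the patient's chronological age."""
    return predicted_age - chronological_age

# A model predicting age 52 for a 45-year-old yields an index of +7.
print(brain_age_index(52.0, 45.0))  # 7.0
```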
The accelerating effort to understand the mathematics of quantum field theory will have profound consequences for both math and physics.
Awesome cameras everywhere.
Watch the #ISOCELLUnroll 2021 event introducing the new #ISOCELL JN1, #Samsung’s 50MP image sensor with 0.64μm pixels. Equipped with innovative pixel technologies, the ISOCELL JN1 delivers awesome detail and colors in an ultra-slim package.
00:00 ISOCELL Unroll
00:31 Opening
01:08 Welcome speech
02:56 Introduction: ISOCELL JN1
04:27 Awesome detail: 50MP with 0.64μm pixels
07:22 Awesome light: ISOCELL 2.0 and Tetrapixel
08:59 Awesome colors: Smart-ISO and HDR
10:13 Awesome focus: Double Super PD
12:06 Q&A
14:10 Closing remarks
A team of researchers affiliated with institutions in Singapore, China, Germany and the U.K. has developed an insect-computer hybrid system for use in search operations after disasters strike. They have written a paper describing their system, now posted on the arXiv preprint server.
Because of the frequency of natural disasters such as earthquakes, fires and floods, scientists have been looking for better ways to help victims trapped in rubble; people climbing over wreckage is both hazardous and inefficient. The researchers noted that small creatures such as insects move much more easily under such conditions and set about the task of using a type of cockroach as a searcher to assist human efforts.
The system they came up with merges microtechnology with the natural skills of a live Madagascar hissing cockroach. These cockroaches are known for their dark brown and black body coloring and, of course, for the hissing sound they make when upset. They are also one of the few wingless cockroaches, which made them a good candidate for carrying a backpack.
The backpack created by the researchers consisted of five interconnected circuit boards hosting an IR camera, a communications chip, a CO2 sensor, a microcontroller, flash memory, a digital-to-analog converter (DAC) and an IMU. The electronics-filled backpack was then affixed to the back of a cockroach. The researchers also implanted electrodes in the cockroach's cerci, the paired antenna-like appendages at the rear of its abdomen. In its normal state, the cockroach uses its cerci to sense what is around it and uses that information to make decisions about turning left or right. With the electrodes in place, the backpack could send very small jolts of electricity to the right or left cercus, inducing the cockroach to turn in a desired direction.
Testing involved placing the cockroach in a given spot and having it attempt to find a person lying in the vicinity. A general destination was preprogrammed into the hardware and then the system was placed into a test scenario, where it moved autonomously, using cues from its sensors to make its way to the person serving as a test victim. The researchers found their system was able to locate the test human 94% of the time. They plan to improve their design with the goal of using the system in real rescue operations.
Google claims that it has developed artificial intelligence software that can design computer chips faster than humans can.
The tech giant said in a paper published Wednesday in the journal Nature that its new AI can produce, in less than six hours, a chip design that would take human engineers months.
The AI has already been used to develop the latest iteration of Google’s tensor processing unit chips, which are used to run AI-related tasks, Google said.