
The University of Bristol is part of an international consortium of 13 universities working in partnership with Facebook AI to advance egocentric perception. As a result of this initiative, we have built the world’s largest egocentric dataset using off-the-shelf, head-mounted cameras.


Progress in the fields of artificial intelligence (AI) and augmented reality (AR) requires learning from the same data humans process to perceive the world. Our eyes allow us to explore places, understand people, manipulate objects and enjoy activities—from the mundane act of opening a door to the exciting interaction of a game of football with friends.

Egocentric 4D Live Perception (Ego4D) is a massive-scale dataset that compiles 3,025 hours of footage from the wearable cameras of 855 participants in nine countries: the UK, India, Japan, Singapore, Saudi Arabia, Colombia, Rwanda, Italy, and the US. The data captures a wide range of activities from the ‘egocentric’ perspective, that is, from the viewpoint of the person carrying out the activity. The University of Bristol is the only UK representative in this diverse, international effort, contributing 270 hours from 82 participants who captured footage of their chosen activities of daily living, such as practicing a musical instrument, gardening, grooming their pet, or assembling furniture.

“In the not-too-distant future you could be wearing smart AR glasses that guide you through a recipe or how to fix your bike—they could even remind you where you left your keys,” said Principal Investigator at the University of Bristol and Professor of Computer Vision, Dima Damen.

Lizards can regrow severed tails, making them the closest relatives to humans that can regenerate a lost appendage. But rather than recreating the original tail, which includes a spinal column and nerves, lizards regrow an imperfect cartilage tube. Now, for the first time, a USC-led study in Nature Communications describes how stem cells can help lizards regenerate better tails.

“This is one of the only cases where the regeneration of an appendage has been significantly improved through stem cell-based therapy in any reptile, bird or mammal, and it informs efforts to improve wound healing in humans,” said the study’s corresponding author Thomas Lozito, an assistant professor of orthopaedic surgery and stem cell biology and regenerative medicine at the Keck School of Medicine of USC.

Oct. 13, 2021 — In 1998, researchers including Mark Kubinec of UC Berkeley performed one of the first simple quantum computations using individual molecules. They used pulses of radio waves to flip the spins of two nuclei in a molecule, with each spin’s “up” or “down” orientation storing information in the way that a “0” or “1” state stores information in a classical data bit. In those early days of quantum computers, the combined orientation of the two nuclei – that is, the molecule’s quantum state – could only be preserved for brief periods in specially tuned environments. In other words, the system quickly lost its coherence. Control over quantum coherence is the missing step to building scalable quantum computers.
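The spin-as-bit analogy in that early experiment can be made concrete with a few lines of linear algebra. The sketch below is a hypothetical illustration, not a model of the 1998 NMR setup: it represents the “up” and “down” spin states as the |0⟩ and |1⟩ basis vectors, uses a Pauli-X (bit-flip) operation to play the role of the radio pulse, and combines two spins into the joint state of the molecule.

```python
import numpy as np

up = np.array([1, 0], dtype=complex)     # "up" spin, i.e. the |0> state
down = np.array([0, 1], dtype=complex)   # "down" spin, i.e. the |1> state
X = np.array([[0, 1],
              [1, 0]], dtype=complex)    # bit-flip (Pauli-X) operation

flipped = X @ up                          # a pulse flips |0> into |1>
print(np.allclose(flipped, down))         # True

# Two nuclei together form a 4-dimensional joint state, built with the
# Kronecker (tensor) product: the "combined orientation" in the text.
joint = np.kron(up, down)                 # the |01> state of the molecule
print(joint)                              # [0.+0.j 1.+0.j 0.+0.j 0.+0.j]
```

Losing coherence, in these terms, means the joint state degrades into a classical mixture before a computation can finish, which is why preserving it is the central challenge the paragraph describes.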

Now, researchers are developing new pathways to create and protect quantum coherence. Doing so will enable exquisitely sensitive measurement and information processing devices that function at ambient or even extreme conditions. In 2018, Joel Moore, a senior faculty scientist at Lawrence Berkeley National Laboratory (Berkeley Lab) and professor at UC Berkeley, secured funds from the Department of Energy to create and lead an Energy Frontier Research Center (EFRC) – called the Center for Novel Pathways to Quantum Coherence in Materials (NPQC) – to further those efforts. “The EFRCs are an important tool for DOE to enable focused inter-institutional collaborations to make rapid progress on forefront science problems that are beyond the scope of individual investigators,” said Moore.

Through the NPQC, scientists from Berkeley Lab, UC Berkeley, UC Santa Barbara, Argonne National Laboratory, and Columbia University are leading the way to understand and manipulate coherence in a variety of solid-state systems. Their threefold approach focuses on developing novel platforms for quantum sensing; designing two-dimensional materials that host complex quantum states; and exploring ways to precisely control a material’s electronic and magnetic properties via quantum processes. The solution to these problems lies within the materials science community. Developing the ability to manipulate coherence in realistic environments requires in-depth understanding of materials that could provide alternate quantum bit (or “qubit”), sensing, or optical technologies.

Let’s talk about strapping guns to the backs of robots. I’m not a fan (how’s that for taking a stand?). When MSCHF did it with Spot back in February, it was a thought experiment, an art exhibit and a statement about where society might be headed with autonomous robotics. And most importantly, of course, it was a paintball gun. Boston Dynamics clearly wasn’t thrilled with the message it was sending, noting:

“Today we learned that an art group is planning a spectacle to draw attention to a provocative use of our industrial robot, Spot. To be clear, we condemn the portrayal of our technology in any way that promotes violence, harm, or intimidation.”

It’s precisely the sort of thing the company tries to get out in front of. After decades of killer-robot science fiction, it doesn’t take much to make people jump any time an advanced robot enters the picture. It’s the automaton version of Rule 34 (in staunch defiance of Asimov’s First Law of Robotics): if a robot exists, someone has tried to weaponize it.

The math is pretty basic. How many satellites are going to go up over the next decade? How many solar panels will they need? And how many are being manufactured that fit the bill? Turns out the answers are: a lot, a hell of a lot, and not nearly enough. That’s where Regher Solar aims to make its mark, by bringing the cost of space-quality solar panels down by 90% while making an order of magnitude more of them. It’s not exactly a modest goal, but fortunately the science and market seem to be in favor, giving the company something of a tailwind. The question is finding the right balance between cost and performance while remaining relatively easy to manufacture. Of course, if there was an easy answer there, someone would already be doing that.



Intech Company is the ultimate source of the latest AI news. It monitors trusted websites and collects the best pieces of AI information.

We spent 24 hours exploring 575 underground bunkers in the middle of South Dakota that are being converted into the world’s largest prepper community. Get Surfshark VPN at https://surfshark.deals/karaandnate and enter our code KARAANDNATE for 83% off and 3 extra months free!

Vlog 765 | #vanlife at xPoint, South Dakota, USA | State 27/50 | Filmed April 8, 2021.

📸 Follow us on Instagram for behind the scenes content: @karaandnate https://www.instagram.com/karaandnate/

🗺 Create your own CUSTOM route map from your favorite vacation, city, or road trip with our new site! https://atlas.co/

To efficiently navigate their surrounding environments and complete missions, unmanned aerial systems (UASs) should be able to detect multiple objects in their surroundings and track their movements over time. So far, however, enabling multi-object tracking in unmanned aerial vehicles has proved to be fairly challenging.

Researchers at Lockheed Martin AI Center have recently developed a new deep learning technique that could allow UASs to track multiple objects in their surroundings. Their technique, presented in a paper pre-published on arXiv, could aid the development of better performing and more responsive autonomous flying systems.

“We present a robust tracking architecture aimed to accommodate for the noise in real-time situations,” the researchers wrote in their paper. “We propose a kinematic prediction model, called deep extended Kalman filter (DeepEKF), in which a sequence-to-sequence architecture is used to predict entity trajectories in latent space.”
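The quoted design pairs a Kalman-filter-style tracker with a learned sequence-to-sequence prediction model. Since the paper itself isn’t reproduced here, the sketch below is only a rough, hypothetical illustration of that idea: a GRU encoder summarizes an entity’s observed track and a decoder rolls out predicted future positions, standing in for the prediction step of the DeepEKF. The actual model predicts in a learned latent space, and all sizes and names here are assumptions.

```python
import torch
import torch.nn as nn

class Seq2SeqPredictor(nn.Module):
    """Encode an observed track, then decode predicted future positions."""

    def __init__(self, state_dim: int = 2, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.GRU(state_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRUCell(state_dim, hidden_dim)
        self.project = nn.Linear(hidden_dim, state_dim)

    def forward(self, history: torch.Tensor, horizon: int) -> torch.Tensor:
        # history: (batch, time, state_dim) observed positions of one entity
        _, h = self.encoder(history)       # summarize the observed track
        h = h.squeeze(0)                   # (batch, hidden_dim)
        step = history[:, -1, :]           # start from the last observation
        preds = []
        for _ in range(horizon):           # roll the decoder forward in time
            h = self.decoder(step, h)
            step = self.project(h)         # predicted next position
            preds.append(step)
        return torch.stack(preds, dim=1)   # (batch, horizon, state_dim)

# Example: predict 5 future positions from 10 noisy 2-D observations.
model = Seq2SeqPredictor()
track = torch.randn(1, 10, 2)              # hypothetical noisy track
future = model(track, horizon=5)
print(future.shape)                        # torch.Size([1, 5, 2])
```

In a full tracker, predictions like these would serve as the motion prior that is then corrected against new detections, which is how a Kalman-style architecture accommodates noisy real-time observations.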

Founders tend to think responsible AI practices are challenging to implement and may slow the progress of their business. They often jump to mature examples like Salesforce’s Office of Ethical and Humane Use and conclude that the only way to avoid creating a harmful product is to build a big team. The truth is much simpler.

I set out to learn how founders were thinking about responsible AI practices on the ground, so I spoke with a handful of successful early-stage founders and found that many of them were implementing responsible AI practices.

Only they didn’t call it that. They just called it “good business.”