Very technical, and it’s from the usually secretive Calico.
At Lifespan.io’s 2021 EARD conference, Jacob Kimmel of Calico Labs discusses how cells can be reprogrammed to restore youthful expression through transient suppression of cell identity.
Jacob Kimmel is a Principal Investigator at Calico Life Sciences leading a research program focused on repurposing developmental programs to address aging and age-related disease. His recent work has revealed the influence of cell identity on aging trajectories, discovered mechanisms of age-related impairment in muscle stem cells, and developed machine learning methods for the analysis of single-cell genomics data. Prior to Calico, Jacob completed Ph.D. training with Wallace Marshall and Andrew Brack at the University of California, San Francisco, where he developed methods to measure cell state transitions with time-lapse microscopy techniques.
Calico (Calico Life Sciences LLC) is an Alphabet-founded research and development company whose mission is to harness advanced technologies and model systems to increase our understanding of the biology that controls human aging.
Following last week’s surprise update, Google is back again with the December Feature Drop for Pixel owners. This time around, Google is bringing some Pixel 6-exclusive features to the Pixel 4a 5G “or newer” Pixel devices.

https://youtu.be/FmznwhHlCP8

These include the all-new Quick Tap to Snap, making it possible for you to quickly access and send Snaps through Snapchat right from the lock screen. Going along with the Quick Tap to Snap functionality, Google has introduced an all-new Pixel-exclusive lens called “Pixel Face”. The company says that “more Pixel-exclusive Lenses” will arrive in future Feature Drops.

In addition to the Feature Drop, this update includes the December Android security patch and ships in the following versions for these devices:

Pixel 3a (XL): SQ1A.211205.008
Pixel 4 (XL): SQ1A.211205.008
Pixel 4a: SQ1A.211205.008
Pixel 4a (5G): SQ1A.211205.008
Pixel 5: SQ1A.211205.008
Pixel 5a (5G): SQ1A.211205.008

Owners of the Pixel 6 and 6 Pro will begin receiving their December updates “next week”.

With last week’s update, Google announced compatibility with the Pixel 6 and 6 Pro to enable a digital car key with select BMW models. After setup, you can simply place your Pixel on the key reader and press the engine start button. Now, Google is activating the ultra-wideband chip found in the Pixel 6 Pro to improve functionality. Not only will this make it easier to use your Pixel 6 Pro as a digital car key, but Google also says it will improve Nearby Share.

Another new feature arriving for Pixel devices comes via the Sound Amplifier app. With this Feature Drop, Google is adding a new “Conversation Mode” to Pixel devices. It uses on-device machine learning to “help anyone who has a hard time hearing in loud environments by tuning out competing noise.” It works by pointing your phone at the person you want to talk with and pinning them, so you can actually hear what they have to say. Google calls this a “sneak peek” version of the feature, so we could see it arrive for more devices in the future.

Now Playing is getting an updated experience: your Pixel can still identify the song, and there’s now a music-note icon next to the track information. Tapping the music note lets you save the song as a favorite, and you can view and search your history along with your list of favorites.

Keeping with the music theme, Google is bringing enhanced bass-level controls to the Pixel Buds A-Series. Once the update arrives, you’ll be able to adjust the slider between -1 and +4, which is “twice the bass range you currently have”.

As we’ve seen with previous Feature Drops, Google is adding a few new wallpapers to celebrate the International Day of Persons with Disabilities:

“In celebration of International Day of Persons with Disabilities, we collaborated with Dana Kearly, a disabled multidisciplinary artist from Vancouver B.C., to create three beautiful new wallpapers for the Curated Culture collection.”

Last, but certainly not least, Google is bringing car crash detection support to owners of the Pixel 3 or newer in Taiwan, Italy, and France. With this feature, your phone will check on you if you are in a car accident. If you do not respond within a predetermined amount of time, your Pixel will contact emergency responders and provide your location.
Get 20% off your first Mack Weldon order and try out the Daily Wear System when you go to http://www.mackweldon.com/joescott and enter promo code “JOESCOTT” at checkout. From the potential of orbital railguns, to space elevators on the moon and Mars, to the threat of AI taking over your job, to the latest on Neuralink, today’s lightning round video features questions from Patreon supporters. Thanks for the great questions guys!
Clearview AI, the company known for its facial recognition technology that fills its database with images it scrapes from the web, is a step closer to obtaining a US patent for its controversial tech. The company has received a “notice of allowance” from the US Patent and Trademark Office.
I think his purpose in doing this is to prioritize full self-driving over partial self-driving features.
“Humans drive with eyes and biological neural nets,” Musk said in October. “So [it] makes sense that cameras and silicon neural nets are [the] only way to achieve generalized solution to self-driving.”
Moreover, he’s reportedly implementing that philosophy at Tesla.
Musk has repeatedly instructed the company’s Autopilot team, which works on self-driving car tech, to ditch radar and use only cameras instead, the New York Times reported on Monday.
No multi-billion-dollar acquisitions occurred in the world of AI chips in 2021.
Instead, the leading AI chip startups all raised rounds at multi-billion-dollar valuations, making clear that they aspire not to get acquired but to become large standalone public companies.
In our predictions last December, we identified three startups in particular as likely acquisition targets. Of these: SambaNova raised a $670 million Series D at a $5 billion valuation in April; Cerebras raised a $250 million Series F at a $4 billion valuation last month; and Graphcore raised $220 million at a valuation close to $3 billion amid rumors of an upcoming IPO.
Other top AI chip startups like Groq and Untether AI also raised big funding rounds in 2021.
As of the beginning of this year, no autonomous vehicle company had ever gone public. 2021 is the year all that changed.
To handle this, researchers have trained neural networks on regions where we have more complete weather data. Once trained, the system can be fed partial data and infer what the rest is likely to be. For example, the trained system can create a likely weather radar map using inputs such as satellite cloud images and data on lightning strikes.
This is exactly the sort of thing that neural networks do well with: recognizing patterns and inferring correlations.
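As a rough illustration of the approach (not the actual model used in this work), the Python sketch below trains a small convolutional network on regions with full radar coverage, learning to map satellite cloud imagery and lightning-strike density to a radar-reflectivity map; the network layout, channel choices, and tensor shapes are assumptions made for the example.

```python
# Illustrative sketch (assumed shapes and architecture, not the actual model):
# learn to infer a radar map from satellite cloud imagery + lightning density,
# training only on regions where real radar observations exist.
import torch
import torch.nn as nn

class RadarInferenceNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Input: 2 channels (satellite cloud image, lightning-strike density)
        # Output: 1 channel (inferred radar reflectivity)
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = RadarInferenceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train on regions with complete data (synthetic stand-in tensors here).
inputs = torch.randn(8, 2, 64, 64)   # satellite + lightning channels
targets = torch.randn(8, 1, 64, 64)  # observed radar maps
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()

# At inference time, feed the same satellite/lightning inputs for a region
# with no radar coverage to obtain a likely radar map.
with torch.no_grad():
    inferred_radar = model(torch.randn(1, 2, 64, 64))
```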
What drew the Rigetti team’s attention is the fact that neural networks also map well onto quantum processors. In a typical neural network, a layer of “neurons” performs operations before forwarding its results to the next layer. The network “learns” by altering the strength of the connections among units in different layers. On a quantum processor, each qubit can perform the equivalent of an operation. The qubits also share connections among themselves, and the strength of the connection can be adjusted. So, it’s possible to implement and train a neural network on a quantum processor.
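To make that analogy concrete, here is a minimal sketch of a variational circuit trained on a simulator (PennyLane’s default simulator is an assumption made for illustration; this is not Rigetti’s hardware or their actual network): inputs are encoded as qubit rotations, trainable rotation angles play the role of connection weights, a two-qubit gate provides the “connection” between qubits, and a gradient-based optimizer adjusts the angles during training.

```python
# Minimal sketch of a trainable quantum circuit (illustrative only).
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(weights, x):
    # Encode a classical input into qubit rotations ("input layer").
    qml.RY(x[0], wires=0)
    qml.RY(x[1], wires=1)
    # Trainable rotations act like neuron weights; the CNOT is the
    # adjustable "connection" between the two qubits.
    qml.RY(weights[0], wires=0)
    qml.RY(weights[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(1))

weights = np.array([0.1, 0.2], requires_grad=True)
x = np.array([0.5, -0.3], requires_grad=False)
target = 1.0

# "Learning" = nudging the rotation angles so the circuit's output
# approaches a target value, analogous to tuning connection strengths.
opt = qml.GradientDescentOptimizer(stepsize=0.3)
for _ in range(50):
    weights = opt.step(lambda w: (circuit(w, x) - target) ** 2, weights)
```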
Robots are already in space. From landers on the moon to rovers on Mars and more, robots are the perfect candidates for space exploration: they can withstand extreme environments while repeating the same tasks in exactly the same way without tiring. Like robots on Earth, they can handle both dangerous and mundane jobs, from spacewalks to polishing a spacecraft’s surface. With space missions growing in number and scientific scope, and therefore requiring more equipment, there is a need for a lightweight robotic arm that can manipulate objects in environments that are difficult for humans.
However, the control schemes that move such arms on Earth, where the planes of operation are flat, do not translate to space, where the environment is unpredictable and changeable. To address this, researchers in Harbin Institute of Technology’s School of Mechanical Engineering and Automation have developed a robotic arm weighing 9.23 kilograms (roughly the weight of a one-year-old baby) that can carry almost a quarter of its own weight and adjust its position and speed in real time based on its environment.
They published their results on Sept. 28 in Space: Science & Technology.
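The paper’s actual controller isn’t detailed here, but the general idea of real-time adjustment can be sketched as a simple closed-loop update in which each joint’s commanded velocity is recomputed every control cycle from the currently sensed error and disturbance; the gain, speed limit, joint count, and disturbance model below are all illustrative assumptions, not the Harbin team’s design.

```python
# Illustrative closed-loop sketch: recompute each joint's velocity command
# every cycle from sensed error, so position and speed adapt as the
# environment changes. Not the actual controller from the paper.
import numpy as np

def control_step(q, q_target, disturbance, kp=2.0, dt=0.01, v_max=0.5):
    """One control cycle for a joint-position vector q (radians)."""
    error = q_target - q                   # position error from joint sensors
    v = kp * error - disturbance           # proportional command, compensating
                                           # a measured external disturbance
    v = np.clip(v, -v_max, v_max)          # respect joint speed limits
    return q + v * dt                      # integrate to the next position

q = np.zeros(6)                            # 6-joint arm, all joints at zero
q_target = np.array([0.4, -0.2, 0.6, 0.0, 0.3, -0.1])
for _ in range(500):                       # run at 100 Hz for 5 simulated seconds
    disturbance = 0.01 * np.random.randn(6)  # stand-in for environmental load
    q = control_step(q, q_target, disturbance)
```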