
Tesla’s (TSLA) stock is rising in pre-market trading on an optimistic new Morgan Stanley report about the automaker’s Dojo supercomputer.

The firm sharply raised its price target on Tesla’s stock as a result.

Dojo is Tesla’s own custom supercomputer platform, built from the ground up for AI machine learning and, more specifically, for training on the video data coming from its fleet of vehicles.

Similarly, allowing the MyoLegs to flail around for a while in a seemingly aimless fashion improved their performance on locomotion tasks, as the researchers described in another paper presented at the recent Robotics Science and Systems meeting. Vittorio Caggiano, a Meta researcher on the project who has a background in both AI and neuroscience, says that scientists in the fields of neuroscience and biomechanics are learning from the MyoSuite work. “This fundamental knowledge [of how motor control works] is very generalizable to other systems,” he says. “Once they understand the fundamental mechanics, then they can apply those principles to other areas.”

This year, MyoChallenge 2023 (which will also culminate at the NeurIPS meeting in December) requires teams to use the MyoArm to pick up, manipulate, and accurately place common household objects and to use the MyoLegs to either pursue or evade an opponent in a game of tag.

Emo Todorov, an associate professor of computer science and engineering at the University of Washington, has worked on similar biomechanical models as part of the popular Mujoco physics simulator. (Todorov was not involved with the current Meta research but did oversee Kumar’s doctoral work some years back.) He says that MyoSuite’s focus on learning general representations means that control strategies can be useful for “a whole family of tasks.” He notes that their generalized control strategies are analogous to the neuroscience principle of muscle synergies, in which the nervous system activates groups of muscles at once to build up to larger gestures, thus reducing the computational burden of movement. “MyoSuite is able to construct such representations from first principles,” Todorov says.
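As a toy illustration of the synergy idea Todorov describes, here is a short sketch in plain NumPy (the muscle count, synergy count, and random synergy weights are invented for illustration; this is not MyoSuite's actual control code) showing how a handful of synergy activations can drive a much larger set of muscles:

```python
import numpy as np

rng = np.random.default_rng(0)

N_MUSCLES = 80    # illustrative muscle count for a leg model
N_SYNERGIES = 6   # a handful of coordinated muscle groupings

# Each column is one synergy: a fixed pattern of relative muscle weights.
# The weights here are random and non-negative purely for illustration.
synergy_basis = np.abs(rng.normal(size=(N_MUSCLES, N_SYNERGIES)))
synergy_basis /= synergy_basis.sum(axis=0, keepdims=True)

def muscle_command(synergy_activations: np.ndarray) -> np.ndarray:
    """Expand a low-dimensional synergy command into per-muscle excitations in [0, 1]."""
    excitations = synergy_basis @ synergy_activations
    return np.clip(excitations, 0.0, 1.0)

# A controller only has to choose 6 numbers per time step...
low_dim_action = rng.uniform(0.0, 1.0, size=N_SYNERGIES)
# ...yet every one of the 80 muscles receives an excitation.
excitations = muscle_command(low_dim_action)
print(excitations.shape)   # (80,)
```

A policy that only has to choose six synergy activations per time step faces a far smaller search problem than one that must set all 80 excitations independently, which is the computational saving the synergy principle refers to.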

Scientists have successfully grown kidneys made mostly of human cells inside pig embryos — taking researchers yet another step down the long road toward generating viable human organs for transplant.

The results, reported September 7 in Cell Stem Cell, mark the first time a solid humanized organ, one with both human and animal cells, has been grown inside another species.


The work represents an important advance in the methods needed to grow humanized kidneys, hearts, and pancreases in animals.

Cells hidden in the skull may point to a way to detect, diagnose and treat inflamed brains.

A detailed look at the skull reveals that bone marrow cells there change and are recruited to the brain after injury, possibly traveling through tiny channels connecting the skull and the outer protective layer of the brain. Paired with the discovery that inflammation in the skull is disease-specific, these new findings collectively suggest the skull’s marrow could serve as a target to track and potentially treat neurological disorders involving brain inflammation, researchers report August 9 in Cell.


New observations of skull cell signals and skull tunnels suggest bone marrow there could be used to monitor neurological diseases.

The large language models that enable generative artificial intelligence (AI) are driving an increase in investment and an acceleration of competition in the field of silicon photonics, a technology that combines silicon-based integrated circuits (ICs) and optical components to process and transmit massive amounts of data more efficiently.

Leading designers and manufacturers of ICs, AI systems and telecommunications equipment have all joined the race, including NVIDIA, TSMC, Intel, IBM, Cisco Systems, Huawei, NTT and imec, the Interuniversity Microelectronics Centre headquartered in Belgium.

These and other organizations have been working on silicon photonics for many years, some of them (including Intel and NTT) for nearly two decades.

Recent advancements in gene editing technologies may lead to a cure for hemoglobinopathies, including sickle cell disease and β-thalassemia.

A collaborative study between researchers from St Jude Children’s Research Hospital (TN, USA) and the Broad Institute of MIT and Harvard (MA, USA) has shown that adenine base editing could be more effective than other gene editing approaches such as CRISPR/Cas9 for treating sickle cell disease and β-thalassemia. Comparing five different gene editing strategies utilizing either Cas9 nucleases or adenine base editors in hematopoietic stem and progenitor cells, the team found that base editing yielded more favorable results.

Sickle cell disease and β-thalassemia arise due to mutations in the β-globin subunit of hemoglobin, resulting in defective red blood cells. Previous studies have shown that restoring the function of γ-globin, a hemoglobin subunit expressed during fetal development, could hold therapeutic advantages for patients with sickle cell disease and β-thalassemia. During fetal development, γ-globin combines with α-globin to form fetal hemoglobin. Following birth, expression of γ-globin ceases as it is replaced by β-globin to form adult hemoglobin. The researchers sought to see whether fetal hemoglobin expression could be restored in post-natal red blood cells to counter the effects of these conditions, offering a potentially universal therapeutic approach.
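To make the comparison concrete, the toy sketch below shows the kind of change an adenine base editor makes: a single targeted A is rewritten as G (and the paired T on the opposite strand as C), with no double-strand break of the sort a Cas9 nuclease creates. The sequence and the edited position are invented for illustration and are not the actual target site used in the study.

```python
# Toy illustration of an adenine base edit: a single A -> G substitution at a
# targeted position (the paired T on the complementary strand becomes C).
# The sequence and position are invented, not the study's actual target site.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def adenine_base_edit(seq: str, pos: int) -> str:
    """Return seq with the adenine at `pos` rewritten as guanine.

    Unlike a Cas9 nuclease edit, nothing is cut or deleted; one base pair changes.
    """
    if seq[pos] != "A":
        raise ValueError(f"no adenine at position {pos}")
    return seq[:pos] + "G" + seq[pos + 1:]

target = "GCTAAGGAGACCAATAGAAA"        # invented stand-in for a promoter region
edited = adenine_base_edit(target, 3)

print(target)                          # GCTAAGGAGACCAATAGAAA
print(edited)                          # GCTGAGGAGACCAATAGAAA
print(edited.translate(COMPLEMENT))    # complementary strand: the paired base is now C
```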

Octopuses are like aliens living among us — they do a lot of things differently from land animals, or even other sea creatures. Their flexible tentacles taste what they touch and have minds of their own. Octopuses’ eyes are color-blind, but their skin can detect light on its own (SN: 6/27/15, p. 10). They are masters of disguise, changing color and skin textures to blend into their surroundings or scare off rivals. And to a greater extent than most creatures, octopuses squirt the molecular equivalent of red ink over their genetic instructions with astounding abandon, like a copy editor run amok.

These edits modify RNA, the molecule used to translate information from the genetic blueprint stored in DNA, while leaving the DNA unaltered.


Modifications to RNA could explain the intelligence and flexibility of shell-less cephalopods.
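As a rough sketch of what that RNA-level “red ink” amounts to, the snippet below applies a single A-to-I edit to an invented mRNA (inosine is read by the ribosome as guanosine, so the edit effectively turns an A into a G) and shows how one codon’s amino acid changes while the DNA blueprint stays untouched. The sequence and the abbreviated codon table are illustrative only.

```python
# A-to-I RNA editing in miniature: the DNA stays the same, but editing one
# adenosine in the mRNA (then read as G) can change the amino acid it encodes.
# The sequence is invented and only the codons used here appear in the table.

CODON_TABLE = {
    "AUA": "Ile",  # isoleucine
    "AUG": "Met",  # methionine
}

def edit_a_to_i(mrna: str, pos: int) -> str:
    """Recode the adenosine at `pos` as G, mimicking how inosine is read during translation."""
    assert mrna[pos] == "A", "A-to-I editing targets adenosines"
    return mrna[:pos] + "G" + mrna[pos + 1:]

dna_coding_strand = "ATAATA"      # the DNA blueprint; it is never modified
mrna = "AUAAUA"                   # transcript: Ile-Ile before editing
edited = edit_a_to_i(mrna, 2)     # edit the third base of the first codon

codons = [edited[i:i + 3] for i in range(0, len(edited), 3)]
print([CODON_TABLE[c] for c in codons])   # ['Met', 'Ile'] -- one amino acid changed
print(dna_coding_strand)                  # ATAATA -- unchanged
```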

We’re going to be hearing a lot about various plans and positions on AI regulation in the coming weeks.

The US Congress is heading back into session, and it is hitting the ground running on AI, kicking off with Senate Majority Leader Chuck Schumer’s first AI Insight Forum on Wednesday. This and planned future forums will bring together some of the top people in AI to discuss the risks and opportunities posed by advances in this technology and how Congress might write legislation to address them.

This newsletter will break down what exactly these forums are and aren’t, and what might come…

Meta is reportedly planning to train a new model that it hopes will be as powerful as OpenAI’s latest and greatest chatbot.

Meta has been snapping up AI training chips and building out data centers in order to create a more powerful new chatbot it hopes will be as sophisticated as OpenAI’s GPT-4, according to The Wall Street Journal.

The Journal writes that Meta has been buying more Nvidia H100 AI-training chips and is beefing up its infrastructure so that, this time around, it won’t need to rely on Microsoft’s Azure cloud platform to train the new chatbot. The company reportedly assembled a group earlier this year to build the model, with the goal of speeding up the creation of AI tools that can emulate human expressions. The company aims to release its new model next year.

Though artificial intelligence has been making inroads into the enterprise, the rise of generative AI is accelerating the pace of adoption. It’s time for enterprise CXOs to consider building systems of intelligence that complement systems of record and systems of engagement.

In the last two decades, enterprises have invested in building solid foundations for managing data and information. Relational databases such as Oracle and Microsoft SQL Server became the cornerstone of information systems. Built on this foundation were customer relationship management, human resources management, supply chain management and other line of business applications that quickly became the digital backbone of…


This context, when combined with advanced prompt engineering, helps enterprises build intelligent AI-based assistants along the lines of Microsoft Copilot or Google Duet AI.

The foundation models become the core of systems of intelligence. The contextual information generated via semantic search is fed to these generative AI models, which deliver rich insights and accurate information to users. The use cases aligned with systems of intelligence go beyond typical chatbots: different teams within an organization will use them to handle a range of scenarios, from marketing to sales forecasting.
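As a minimal sketch of the pattern described above (semantic search supplying context that is then fed to a generative model), the code below embeds a few records, retrieves the closest match for a question by cosine similarity, and assembles a context-grounded prompt. The `embed` and `generate` functions are hypothetical placeholders; a real system of intelligence would call an actual embedding model and an LLM such as those behind Microsoft Copilot or Google Duet AI.

```python
import numpy as np

# --- Hypothetical model calls (placeholders, not a real API) ---------------
def embed(text: str) -> np.ndarray:
    """Stand-in for an embedding model; here, a crude bag-of-characters vector."""
    vec = np.zeros(128)
    for ch in text.lower():
        vec[ord(ch) % 128] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

def generate(prompt: str) -> str:
    """Stand-in for a generative model call; a real system would invoke an LLM."""
    return f"[LLM response grounded in the prompt below]\n{prompt}"

# --- Semantic search over enterprise records --------------------------------
documents = [
    "Q2 sales in the EMEA region grew 12% quarter over quarter.",
    "The HR onboarding workflow requires manager approval within 5 days.",
    "Supplier lead times for component X increased to 9 weeks.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k records most similar to the question (unit vectors, so dot = cosine)."""
    scores = doc_vectors @ embed(question)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

question = "How did EMEA sales perform last quarter?"
context = "\n".join(retrieve(question))

prompt = (
    "Answer using only the context below.\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)
print(generate(prompt))
```

The retrieval step is what injects current records from systems of record into the model’s context window, which is what distinguishes a system of intelligence from a standalone chatbot.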