
Robot That Watched Surgical Videos Now Operates With Human-Level Skill

Researchers have developed a robot capable of performing surgical procedures as skillfully as human doctors, trained simply by watching videos of surgeries.

The team from Johns Hopkins and Stanford Universities harnessed imitation learning, a technique that allowed the robot to learn from a vast archive of surgical videos, eliminating the need for programming each move. This approach marks a significant step towards autonomous robotic surgeries, potentially reducing medical errors and increasing precision in operations.
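For readers unfamiliar with imitation learning, the core idea can be sketched in a few lines: a policy network is trained by supervised learning to reproduce an expert's actions from the observations that preceded them. The sketch below is a minimal, generic behavior-cloning loop in Python/PyTorch; the feature sizes, dataset, and network are illustrative assumptions, not the architecture used by the Johns Hopkins and Stanford team.

```python
# Minimal behavior-cloning sketch of imitation learning, the general technique
# described in the article. All shapes, data, and the network are illustrative
# assumptions, not the researchers' model.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical data: per-frame visual features paired with the expert's
# recorded instrument motions (e.g. 6-DoF tool deltas plus gripper state).
frames  = torch.randn(1024, 512)   # 1024 frames, 512-d visual features
actions = torch.randn(1024, 7)     # matching expert actions
loader  = DataLoader(TensorDataset(frames, actions), batch_size=64, shuffle=True)

# Policy network: map what the robot "sees" to what the surgeon did.
policy = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, 7),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    for obs, expert_action in loader:
        optimizer.zero_grad()
        loss = loss_fn(policy(obs), expert_action)  # imitate the demonstration
        loss.backward()
        optimizer.step()
```

A real system would replace the toy features with a visual encoder over the surgical video frames and the toy actions with the recorded instrument motions, but the training objective keeps this same supervised shape.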

Revolutionary Robot Training

Former COO at PayPal, David Sacks, says OpenAI's models will soon reach PhD-level reasoning

Former COO at PayPal, David Sacks says that OpenAI recently gave investors a product roadmap update, telling them that its AI models will soon be capable of PhD-level reasoning and will act as agents that use tools, meaning a model will be able to act in place of a human.

Jensen Huang says that a trillion dollars is being spent on data centers to enable the next wave of AI

Jensen Huang says that a trillion dollars is being spent on data centers to enable the next and biggest wave of AI, one that will revolutionize business productivity.

Avi Loeb’s Statement on UAPs to the House Oversight and Accountability Committee

Over the past few months, I was asked multiple times by staff of the House Committee on Oversight and Accountability whether I would be available to testify before the U.S. Congress on Unidentified Anomalous Phenomena (UAPs). As a result, I cleared my calendar for November 13, 2024, and prepared the following written statement. In the end, I was not called to testify before Congress, so I am posting my intended statement below. The Galileo Project, which I lead, is about to release this week unprecedented results from the commissioning data of its unique Observatory at Harvard University. Half a million objects in the sky were monitored, and their appearance was analyzed by state-of-the-art machine learning algorithms. Are any of them UAPs, and if so, what are their flight characteristics? Unfortunately, the chairs of the congressional hearing chose not to hear about these scientific results, nor about the scientific findings from our ocean expedition to the site of the first reported meteor from interstellar space.

Stay tuned for the first extensive paper on the commissioning data from the first Galileo Project Observatory, to be posted publicly in the coming days. Here is my public statement.

Metalenses harness AI for high-resolution, full-color imaging for compact optical systems

Modern imaging systems, such as those used in smartphones, virtual reality (VR), and augmented reality (AR) devices, are constantly evolving to become more compact, efficient, and high-performing. Traditional optical systems rely on bulky glass lenses, which have limitations like chromatic aberrations, low efficiency at multiple wavelengths, and large physical sizes. These drawbacks present challenges when designing smaller, lighter systems that still produce high-quality images.
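To make the chromatic-aberration problem concrete, the short Python sketch below uses the thin-lens relation and a Cauchy dispersion curve with assumed BK7-like coefficients to show how the focal length of a simple glass lens drifts across the visible spectrum. The numbers are illustrative and not taken from the article; this wavelength-dependent focus shift is one of the errors that metalens designs paired with AI-based image reconstruction aim to correct.

```python
# Illustrative only: chromatic focal shift of a simple thin refractive lens.
# The Cauchy coefficients below are assumed values roughly matching a
# BK7-like crown glass; they are not taken from the article.
A, B = 1.5046, 0.00420   # n(lam) = A + B / lam**2, lam in micrometres

def refractive_index(wavelength_um):
    """Cauchy approximation of the glass dispersion curve n(lambda)."""
    return A + B / wavelength_um ** 2

def focal_length_mm(wavelength_um, f_design_mm=50.0, design_wavelength_um=0.55):
    """Thin-lens focal length vs. wavelength: f is proportional to 1/(n - 1)."""
    n_design = refractive_index(design_wavelength_um)
    return f_design_mm * (n_design - 1.0) / (refractive_index(wavelength_um) - 1.0)

for name, wl in [("blue", 0.45), ("green", 0.55), ("red", 0.65)]:
    print(f"{name:>5} ({wl:.2f} um): f = {focal_length_mm(wl):.2f} mm")
# Blue focuses about a millimetre short of red for a 50 mm lens: exactly the
# kind of colour-dependent blur that computational metalens imaging corrects.
```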
