Archive for the ‘robotics/AI’ category: Page 14

Mar 20, 2024

Medics design AI tool to predict side-effects in breast cancer patients

Posted by in categories: biotech/medical, robotics/AI

Trials in the UK, France and the Netherlands indicate the tool can predict whether a patient will experience problems from surgery and radiotherapy.

Mar 20, 2024

Scientists Gave AI an “Inner Monologue” and Something Fascinating Happened

Posted by in category: robotics/AI

Researchers taught an AI to think before it speaks like a human’s inner monologue — and found that it made the model smarter.
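The core idea can be sketched in a few lines: have the model generate private reasoning tokens first, then strip them before showing the reply. This is a minimal illustration only, not the researchers' actual method; `toy_model` and the `<thought>` tags are hypothetical stand-ins for a real language-model API.

```python
# Minimal sketch of the "inner monologue" idea: the model reasons inside
# <thought>...</thought> tags, and that span is removed from the user-facing
# answer. model_fn is a hypothetical stand-in for a real LLM call.

REASONING_TAG = "<thought>"
END_TAG = "</thought>"

def answer_with_monologue(question, model_fn):
    """Ask the model to reason privately, then return only the final answer."""
    prompt = (
        f"Question: {question}\n"
        f"First reason inside {REASONING_TAG}...{END_TAG}, "
        "then state the final answer."
    )
    raw = model_fn(prompt)
    # Strip the hidden reasoning span before showing the reply.
    if REASONING_TAG in raw and END_TAG in raw:
        start = raw.index(REASONING_TAG)
        end = raw.index(END_TAG) + len(END_TAG)
        raw = (raw[:start] + raw[end:]).strip()
    return raw

def toy_model(prompt):
    # Toy model that "thinks" before answering, for demonstration only.
    return "<thought>2 apples plus 3 apples is 5 apples.</thought> 5 apples"

print(answer_with_monologue("I have 2 apples and buy 3 more. How many?", toy_model))
# -> 5 apples
```

The research result is that letting the model spend tokens on the hidden span before committing to an answer improves accuracy; the parsing shown here is just the plumbing around that idea.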

Mar 20, 2024

Stability AI Is Falling Apart

Posted by in category: robotics/AI

AI image generation company Stability AI is in big trouble.

Several key AI developers who worked on Stable Diffusion, the company’s popular text-to-image generator, have resigned, Forbes reports.

Stability AI CEO Emad Mostaque announced the news during an all-hands meeting last week, per Forbes, revealing that three of the five researchers who originally created the foundational tech behind Stable Diffusion at two German universities had left.

Mar 20, 2024

Behavioral and neural correlates to multisensory detection of sick humans

Posted by in categories: biotech/medical, robotics/AI

Importantly, the superior temporal sulcus (STS) and superior temporal gyrus (STG) are considered core areas for multisensory integration, including olfactory–visual integration. The STS was significantly connected to the IPS during multisensory integration, as indicated by the PPI analysis (Fig. 3) focusing on functional connectivity of the IPS and whole-brain activation. Likewise, the anterior and middle cingulate cortex, precuneus, and hippocampus/amygdala were activated when testing sickness-cue integration-related whole-brain functional connectivity with the IPS but were not activated when previously testing for unisensory odor or face sickness perception. In this context, hippocampus/amygdala activation may represent the involvement of an associative neural network responding to threat represented by a multisensory sickness signal. This notion supports the earlier assumption of olfactory-sickness–driven OFC and MDT activation, suggested to be part of a neural circuitry serving disease avoidance. Last, the middle cingulate cortex has recently been found to exhibit enhanced connectivity with the anterior insula during a first-hand experience of LPS-induced inflammation, and this enhancement has been interpreted as a potential neurophysiological mechanism involved in the brain’s sickness response. Applied to the current data, the middle cingulate cortex, in the context of multisensory-sickness–driven associations between IPS and whole-brain activations, may indicate a shared representation of an inflammatory state and associated discomfort.

In conclusion, the present study shows how subtle and early olfactory and visual sickness cues interact through cortical activation and may influence humans’ approach–avoidance tendencies. The study provides support for sensory integration of visual and olfactory sickness cues in cortical multisensory convergence zones as being essential for the detection and evaluation of sick individuals. Both olfaction and vision, separately and following sensory integration, may thus be important parts of circuits handling imminent threats of contagion, motivating the avoidance of sick conspecifics.

Mar 20, 2024

Embodied Generalist LEO

Posted by in category: robotics/AI

From Beijing Institute for General Artificial Intelligence (BIGAI), Peking University, Carnegie Mellon University, and Tsinghua University.

🦁 Introducing LEO: an embodied generalist agent in 3D World🌎


Mar 20, 2024

Elon Musk Predicts A ‘Universal High Income’ As Jobs Are Phased Out And Employment Becomes Obsolete — It’ll Be ‘Somewhat Of An Equalizer’

Posted by in categories: Elon Musk, employment, robotics/AI

Elon Musk made some striking predictions about the impact of artificial intelligence (AI) on jobs and income at the inaugural AI Safety Summit in the U.K. in November.

The serial entrepreneur and CEO painted a utopian vision where AI renders traditional employment obsolete but provides an “age of abundance” through a system of “universal high income.”

“It’s hard to say exactly what that moment is, but there will come a point where no job is needed,” Musk told U.K. Prime Minister Rishi Sunak. “You can have a job if you want to have a job or sort of personal satisfaction, but the AI will be able to do everything.”

Mar 20, 2024

Nvidia’s Jensen Huang says AI hallucinations are solvable, artificial general intelligence is 5 years away

Posted by in categories: futurism, robotics/AI

Artificial general intelligence (AGI) — often referred to as “strong AI,” “full AI,” “human-level AI” or “general intelligent action” — represents a significant future leap in the field of artificial intelligence. Unlike narrow AI, which is tailored for specific tasks, such as detecting product flaws, summarizing the news, or building you a website, AGI will be able to perform a broad spectrum of cognitive tasks at or above human levels. Addressing the press this week at Nvidia’s annual GTC developer conference, CEO Jensen Huang appeared to be getting really bored of discussing the subject — not least because he finds himself misquoted a lot, he says.

The frequency of the question makes sense: The concept raises existential questions about humanity’s role in and control of a future where machines can outthink, outlearn and outperform humans in virtually every domain. The core of this concern lies in the unpredictability of AGI’s decision-making processes and objectives, which might not align with human values or priorities (a concept explored in-depth in science fiction since at least the 1940s). There’s concern that once AGI reaches a certain level of autonomy and capability, it might become impossible to contain or control, leading to scenarios where its actions cannot be predicted or reversed.

When sensationalist press asks for a timeframe, it is often baiting AI professionals into putting a timeline on the end of humanity — or at least the current status quo. Needless to say, AI CEOs aren’t always eager to tackle the subject.

Mar 20, 2024

Move over, Elon Musk: Nvidia says 2024 is the year of the ‘humanoid’ robot

Posted by in categories: Elon Musk, physics, robotics/AI

Physicists just can’t leave an incomplete theory alone; they try to repair it. When nature is kind, it can lead to a major breakthrough.

Mar 20, 2024

Apple researchers reveal new AI breakthrough for training LLMs on images and text

Posted by in categories: innovation, robotics/AI

In a new paper published this month, Apple researchers reveal that they have developed new methods for training large language models using both text and visual information. According to Apple’s researchers, this represents a way to obtain state-of-the-art results.
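The general recipe behind such multimodal training, independent of Apple's specific method, is to project image features from a vision encoder into the same embedding space as text tokens, then feed the interleaved sequence to one model. Below is a toy sketch of that projection step; the dimensions, the random "embedding table," and `embed_sequence` are illustrative assumptions, not anything from the paper.

```python
# Toy sketch (not Apple's method): project image features into the text
# embedding space and interleave them with token embeddings, so a single
# language model can attend over both modalities.
import numpy as np

D_MODEL = 8      # shared embedding width (toy size)
IMG_FEAT = 16    # raw feature width from a hypothetical vision encoder

rng = np.random.default_rng(0)
text_embedding = rng.normal(size=(100, D_MODEL))       # toy token-embedding table
img_projection = rng.normal(size=(IMG_FEAT, D_MODEL))  # learned "connector" matrix

def embed_sequence(token_ids, image_feats):
    """Prepend projected image features to text-token embeddings."""
    img_tokens = image_feats @ img_projection   # (n_img, D_MODEL)
    txt_tokens = text_embedding[token_ids]      # (n_txt, D_MODEL)
    return np.concatenate([img_tokens, txt_tokens], axis=0)

seq = embed_sequence([1, 5, 9], rng.normal(size=(4, IMG_FEAT)))
print(seq.shape)  # (7, 8): 4 image tokens + 3 text tokens, one shared width
```

In real systems the projection is trained jointly with (or ahead of) the language model, which is where the interesting design choices, and results like Apple's, come in.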

Mar 20, 2024

Surgical Robot Outperforms Human Surgeons in Precise Removal of Cancerous Tumors

Posted by in categories: biotech/medical, robotics/AI

Surgically removing tumors from sensitive areas, such as the head and neck, poses significant challenges. The goal during surgery is to take out the cancerous tissue while saving as much healthy tissue as possible. This balance is crucial because leaving behind too much cancerous tissue can lead to the cancer’s return or spread. Doing a resection that has precise margins—specifically, a 5mm margin of healthy tissue—is essential but difficult. This margin, roughly the size of a pencil eraser, ensures that all cancerous cells are removed while minimizing damage.

Tumors often have clear horizontal edges but unclear vertical boundaries, making depth assessment challenging despite careful pre-surgical planning. Surgeons can mark the horizontal borders but have limited ability to determine the appropriate depth for removal due to the inability to see beyond the surface. Additionally, surgeons face obstacles like fatigue and visual limitations, which can affect their performance.

Now, a new robotic system has been designed to perform tumor removal from the tongue with precision levels that could match or surpass those of human surgeons.

The Autonomous System for Tumor Resection (ASTR) designed by researchers at Johns Hopkins (Baltimore, MD, USA) translates human guidance into robotic precision. This system builds upon the technology from their Smart Tissue Autonomous Robot (STAR), which previously conducted the first fully autonomous laparoscopic surgery to connect the two intestinal ends. ASTR, an advanced dual-arm, vision-guided robotic system, is specifically designed for tissue removal in contrast to STAR’s focus on tissue connection. In tests using pig tongue tissue, the team demonstrated ASTR’s ability to accurately remove a tumor and the required 5mm of surrounding healthy tissue. After focusing on tongue tumors due to their accessibility and relevance to experimental surgery, the team now plans to extend ASTR’s application to internal organs like the kidney, which are more challenging to access.

Page 14 of 2,165