Luminar Surges On Plan To Supply Laser Sensors For Nvidia’s Self-Driving Car Platform

Laser lidar startup Luminar, founded and led by the youngest self-made billionaire tracked by Forbes, will supply its sensors to Nvidia for a new autonomous vehicle technology platform that the chip and computing powerhouse is developing for automakers to install in consumer cars and trucks. The news pushed Luminar’s shares up more than 20%.

Nvidia aims to supply the DRIVE Hyperion system, powered by its Orin “system-on-a-chip” computing hardware, AI-enabled software and Luminar’s long-range Iris lidar, to automakers starting in 2024, Luminar said at Santa Clara, California-based Nvidia’s annual GTC conference. The platform, which also integrates cameras and radar for additional sensing capability, includes everything needed for mass-production vehicles to operate autonomously in highway driving, Nvidia said earlier this year.

How Enterprise AI Architecture Is Transforming Every Industry From Commerce To Wealth Management, And Beyond

Artificial intelligence (AI) holds great promise for solving problems in almost every industry. AI-powered technologies are now capable of automating tasks in retail and wealth management, to name a couple. These automations reduce error, manage increasingly vast datasets, and free humans up for intelligent, strategic work. At the enterprise level, AI architecture is transforming capacity and steadily shaping how the businesses of the future will operate.

Connecting to Core Systems of Commerce Businesses

Operationalizing machine learning and AI at scale is a key priority for the world of retail and commerce. Enterprise tech stacks use AI-driven predictions in high-frequency, ambiguous situations, and active learning and continuous improvement of AI are embedded in business applications and workflows. Making use of these capabilities requires contextual stitching of signals to create a single, unified view of the truth, which empowers teams to make contextual decisions in the moment. While the technological frameworks have existed for the better part of a decade, most businesses have been unable to overcome the barrier of applying the technology in real-world contexts, or at scale.
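To make that “stitching” concrete, here is a deliberately simplified sketch of merging event signals from separate commerce systems into a single per-customer view keyed on a shared identity. All field names and sources below are invented for the example; real pipelines would add identity resolution, deduplication and streaming infrastructure.

```python
# Hypothetical illustration of "contextual stitching": signals from
# separate commerce systems merged into one unified view per customer.
from collections import defaultdict

signals = [
    {"customer_id": "c42", "source": "web",   "event": "viewed_item", "sku": "A1"},
    {"customer_id": "c42", "source": "store", "event": "purchase",    "sku": "A1"},
    {"customer_id": "c42", "source": "crm",   "event": "support_ticket"},
]

unified = defaultdict(lambda: {"events": []})
for s in signals:
    view = unified[s["customer_id"]]      # stitch on a shared identity key
    view["events"].append({k: v for k, v in s.items() if k != "customer_id"})

print(unified["c42"])  # a single view of the customer across systems
```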

Mantium raises $12.75M in seed funding, launches cloud-based AI platform

Ohio-based startup Mantium today announced that it has closed $12.75 million in seed funding and launched a cloud-based AI platform that allows users to build with large language models.

The seed round, co-led by venture funds Drive Capital and Top Harvest, will be used to hire more talent, add features to Mantium’s AI platform and drive awareness of what is achievable with large language models, especially across Africa, the firm’s CEO and co-founder Ryan Sevey told TechCrunch.

The company is looking to expand its team of 33, which is currently spread across nine countries, including Ghana, Nigeria and Kenya. Having a globally distributed team, Sevey said, helps generate unique insights and varied problem-solving approaches around AI.

Nvidia’s new tech uses AI to automatically match voice lines to animated faces

Nvidia may be best known for graphics cards you can’t find in stores, but the company also makes some interesting software tools. One example is the noise-removal feature known as RTX Voice, which was upgraded to work with all GeForce cards earlier this year and does an excellent job of cleaning up background noise.

Now Nvidia (thanks, 80.lv) has been showing off another sound-related tool that has been in beta this year. Audio2Face is an impressive-looking auto-rigging process that runs within Nvidia’s open real-time simulation platform, Omniverse. It can take an audio file and apply surprisingly well-matched facial animations to the included Digital Mark 3D character model.
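Nvidia hasn’t published Audio2Face’s internals, but audio-driven facial animation is commonly framed as a network that maps audio features to per-frame animation parameters such as blendshape weights. The sketch below is a generic illustration of that framing, not Nvidia’s implementation; the feature size, model shape and blendshape count are all assumptions.

```python
# Generic sketch of audio-driven facial animation (NOT Nvidia's code):
# a recurrent network maps mel-spectrogram frames to blendshape weights.
import torch
import torch.nn as nn

class AudioToBlendshapes(nn.Module):
    """Maps a sequence of audio features to per-frame blendshape weights."""
    def __init__(self, n_mels=80, hidden=256, n_blendshapes=52):
        super().__init__()
        self.rnn = nn.GRU(n_mels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_blendshapes)

    def forward(self, mel):                  # mel: (batch, frames, n_mels)
        h, _ = self.rnn(mel)
        return torch.sigmoid(self.head(h))   # weights in [0, 1] per frame

model = AudioToBlendshapes()
mel = torch.randn(1, 120, 80)                # ~2 s of audio at 60 fps (fake input)
weights = model(mel)                         # (1, 120, 52) animation curves
print(weights.shape)
```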

Autonomous robot performs its first intramuscular injection without needles

One of the things many people hate most about getting vaccinations and taking certain types of medication is needles. Any medication that has to be delivered intramuscularly typically requires a needle and a skilled medical professional to administer it. However, that may change in the future thanks to a new autonomous robot created by Cobionix, a company founded at the University of Waterloo.

The autonomous robot uses the company’s Cobi platform to perform injections without needles. Cobi is described as a versatile robotic platform that can be deployed rapidly and carry out tasks fully autonomously. Fitted with a needle-free injection system, the robot demonstrated the ability to deliver intramuscular injections to patients without needles and without supervision by a healthcare professional.

The robot developers believe that Cobi and solutions like it could help protect healthcare workers, reduce the cost of healthcare, and help improve patient outcomes. Researchers believe the autonomous design of the robot will dramatically reduce the requirements for vaccine clinics and could help deliver vaccines and other medications to remote populations with limited access to healthcare.

Brain Implant Translates Paralyzed Man’s Thoughts Into Text With 94% Accuracy

A man paralyzed from the neck down due to a spinal cord injury he sustained in 2007 has shown he can communicate his thoughts, thanks to a brain implant system that translates his imagined handwriting into actual text.

The device – part of a longstanding research collaboration called BrainGate – is a brain-computer interface (BCI) that uses artificial intelligence (AI) to interpret signals of neural activity generated during imagined handwriting.

In this case, the man – called T5 in the study, who was 65 years of age at the time of the research – wasn’t doing any actual writing, as his hand, along with all his limbs, had been paralyzed for several years.
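The decoding step in systems like this is typically handled by a recurrent neural network that converts recorded neural activity into character probabilities. The sketch below shows the general shape of such a decoder; the channel count, architecture details and character set are illustrative assumptions, not the published BrainGate model.

```python
# Hypothetical sketch of the decoding step in a handwriting BCI:
# binned neural firing rates in, per-time-bin character logits out.
import torch
import torch.nn as nn

N_CHANNELS = 192     # assumed electrode/feature count
N_CHARS = 31         # 26 letters plus assumed punctuation tokens

class HandwritingDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(N_CHANNELS, 512, num_layers=2, batch_first=True)
        self.head = nn.Linear(512, N_CHARS)

    def forward(self, firing_rates):          # (batch, time_bins, channels)
        h, _ = self.rnn(firing_rates)
        return self.head(h)                   # per-bin character logits

decoder = HandwritingDecoder()
neural = torch.randn(1, 100, N_CHANNELS)      # 100 time bins of fake activity
logits = decoder(neural)
chars = logits.argmax(dim=-1)                 # greedy per-bin character guesses
print(chars.shape)
```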

MERLIN: A self-supervised strategy to train deep despeckling networks

When highly coherent radiation, such as that emitted by radars or lasers, is diffusely reflected off a surface with a rough structure (e.g., a piece of paper, white paint or a metallic surface), it produces a random granular effect known as the ‘speckle’ pattern. This effect results in strong fluctuations that can reduce the quality and interpretability of images collected by synthetic aperture radar (SAR) techniques.

SAR is an imaging method that can produce fine-resolution 2D or 3D images using a resolution-limited radar system. It is often employed to collect images of landscapes or object reconstructions, which can be used to create millimeter-to-centimeter scale models of the surface of Earth or other planets.

To improve the quality and reliability of SAR data, researchers worldwide have been trying to develop techniques based on deep neural networks that could reduce the speckle effect. While some of these techniques have achieved promising results, their performance is still not optimal.
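MERLIN’s self-supervised strategy exploits the fact that the real and imaginary parts of a single-look complex SAR image share the same underlying reflectivity but carry statistically independent speckle, so a network can be trained to estimate the reflectivity from one part and be scored against the other. Below is a minimal sketch of that idea under a standard Goodman speckle model; the tiny network and training step are illustrations, not the authors’ released implementation.

```python
# Minimal sketch of a MERLIN-style self-supervised despeckling step.
import torch
import torch.nn as nn

net = nn.Sequential(                       # stand-in for a deep despeckler
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),        # predicts log-reflectivity
)

def merlin_like_loss(log_R, imag):
    # Under a Goodman speckle model, the imaginary part is ~ N(0, R/2),
    # so its negative log-likelihood given predicted reflectivity R is
    # (up to constants) 0.5*log R + imag^2 / R.
    return (0.5 * log_R + imag ** 2 / torch.exp(log_R)).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-4)
slc = torch.randn(4, 2, 64, 64)            # fake [real, imag] SAR patches
real, imag = slc[:, :1], slc[:, 1:]
opt.zero_grad()
loss = merlin_like_loss(net(real), imag)   # despeckle real, score on imag
loss.backward()
opt.step()
print(loss.item())
```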

Landing AI brings in $57M for its machine learning operations tools

Just over a year after launching its flagship product, Landing AI secured a $57 million round of Series A funding to continue building tools that enable manufacturers to more easily and quickly build and deploy artificial intelligence systems.

The company, started by former Google and Baidu AI guru Andrew Ng, developed LandingLens, a visual inspection tool that applies AI and deep learning to find product defects faster and more accurately.

Ng says industries should adopt a data-centric approach to building AI, which gives manufacturers a more efficient way to teach an AI model what to do. Using no-code/low-code capabilities, advanced AI models can be built in less than a day with just a few mouse clicks.
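One common reading of “data-centric” is to hold the model fixed and iterate on the training data instead, hunting down mislabeled examples rather than tuning architectures. The toy loop below illustrates that reading only; the dataset, confidence threshold and review step are invented for the example and are not Landing AI’s tooling.

```python
# Toy illustration of a data-centric iteration loop: keep the model
# simple and fixed, and surface suspect labels for human review.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
import numpy as np

X = np.random.rand(200, 8)                 # stand-in defect features
y = (X[:, 0] > 0.5).astype(int)
y[:10] = 1 - y[:10]                        # simulate 10 mislabeled items

model = LogisticRegression()
pred = cross_val_predict(model, X, y, method="predict_proba")[:, 1]

# Flag items where the model confidently disagrees with the label:
# these are the candidates a human would re-review and relabel.
suspect = np.where(np.abs(pred - y) > 0.9)[0]
print(f"{len(suspect)} examples flagged for label review")
```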

Neuroscience Behind an Artificial General Intelligence

https://youtube.com/watch?v=1_Mcp-YjPmQ&feature=share

This video gives an overview of human neuroscience and applies it to the design of an artificial general intelligence named Eta.

Go to www.startengine.com/orbai to own shares in the future of AI.
Check out https://www.orbai.ai/about-us.htm for details on the company, tech, patents, products and more.

What we usually think of as Artificial Intelligence today, the human-like robots and holograms of fiction that talk and act like real people with human-level or even superhuman intelligence and capabilities, is actually called Artificial General Intelligence (AGI), and it does NOT exist anywhere on Earth yet. What we do have is called deep learning, which has fundamental limitations that will not allow it to become AGI.

For an AI to pass the threshold of human intelligence and become an artificial general intelligence, it needs the ability to see, hear, and experience its environment. It needs to be able to learn that environment, to organize its memory non-locally and to store abstract concepts in a distributed architecture, so it can model its environment and the events and people in it.

It needs to be able to speak conversationally and interact verbally like a human, and to understand the experiences, events, and concepts behind the words and sentences of language so it can compose language at a human level.

It needs to be able to solve all the problems that a human can, using flexible memory recall, analogy, metaphor, imagination, intuition, logic and deduction from sparse information.

Walmart is using driverless trucks to complete a seven-mile delivery loop

As promised, Walmart has started making fully driverless box-truck deliveries between its own locations on a fixed seven-mile loop, in partnership with the startup Gatik, the companies announced. Despite those limitations, the route in Bentonville, Arkansas involves “intersections, traffic lights and merging on dense urban roads,” the companies said. It’s another shot of good news for the progress of self-driving vehicles after GM’s Cruise launched its self-driving taxis into testing last week.

The Gatik trucks bring grocery orders from a Walmart fulfillment center (a “dark store”) to a nearby Walmart Neighborhood Market grocery store in Bentonville, the home of the company’s headquarters. The route covers the “middle mile” transportation of goods between warehouses and stores. The program effectively launched following the December 2020 approval by the Arkansas State Highway Commission, and has been driverless since this summer.