Archive for the ‘robotics/AI’ category: Page 1662

Aug 22, 2019

Sea Machines Robotics to demonstrate autonomous spill response

Posted in categories: robotics/AI, transportation

BOSTON — Sea Machines Robotics Inc. this week said it has entered into a cooperative agreement with the U.S. Department of Transportation’s Maritime Administration to demonstrate the ability of its autonomous technology in increasing the safety, response time and productivity of marine oil-spill response operations.

Sea Machines was founded in 2015 and describes itself as “the leader in pioneering autonomous control and advanced perception systems for the marine industries.” The company builds software and systems to increase the safety, efficiency, and performance of ships, workboats, and commercial vessels worldwide.

Aug 22, 2019

Will China lead the world in AI by 2030?

Posted in categories: ethics, robotics/AI

The country’s artificial-intelligence research is growing in quality, but the field is still playing catch-up with the United States in terms of high-impact papers, people and ethics.


But observers warn that several factors could stymie the nation’s plans, including a lack of contribution to the theories used to develop the tools underpinning the field, and a reticence by Chinese companies to invest in the research needed to make fundamental breakthroughs.

Aug 22, 2019

New Tech Puts NASA One Step Closer to Fueling Spacecraft in Space

Posted in categories: robotics/AI, space

NASA just successfully demonstrated the first of three tools designed to refuel spacecraft in space, right outside of the International Space Station.

The space agency’s Robotic Refueling Mission 3 was able to unstow a special adapter that can hold super-cold methane, oxygen or hydrogen, and insert it into a coupler on a different fuel tank.

Future iterations of the system could one day allow us to gas up spacecraft with resources from distant worlds, such as liquid methane as fuel. And that’s a big deal, since future missions to faraway destinations such as the Moon and Mars will rely on our ability to refuel after leaving Earth’s gravity.

Aug 22, 2019

Preliminary Results and Analysis Independent Core Observer Model (ICOM) Cognitive Architecture in a Mediated Artificial Super Intelligence (mASI) System

Posted in category: robotics/AI

(BICA for AI, Post Conference Journal Paper, see Springer)

Abstract:

This paper is focused on preliminary cognitive and consciousness test results from using an Independent Core Observer Model Cognitive Architecture (ICOM) in a Mediated Artificial Super Intelligence (mASI) System. These results, including objective and subjective analyses, are designed to determine if further research is warranted along these lines. The comparative analysis includes humans and human groups measured under the same conditions for direct comparison. The overall study includes optimization of a mediation client application to help perform tests, AI context-based input (building context tree or graph data models), comparative intelligence testing (such as an IQ test), and other tests (i.e., Turing, Qualia, and Porter method tests) designed to look for early signs of consciousness, or the lack thereof, in the mASI system. Together, they are designed to determine whether this modified version of ICOM is a) in fact a form of AGI and/or ASI, b) conscious, and c) at least sufficiently interesting that further research is called for. This study is not conclusive but offers evidence to justify further research along these lines.

Aug 21, 2019

YouTube is deleting videos of robots fighting because of ‘animal cruelty’

Posted in categories: ethics, robotics/AI

We need higher ethical standards for robotic beings, because if digital superintelligence becomes reality we will need better ethics around robot rights. We could literally end up with a Terminator situation, or we could realize a vision in which robots are not slaves for us to use but rightful citizens.


Each notice cited the same section of these guidelines, which states: “Content that displays the deliberate infliction of animal suffering or the forcing of animals to fight is not allowed on YouTube.”

It goes on to state: “Examples include, but are not limited to, dog fighting and cock fighting.”

Aug 21, 2019

New models for handwriting recognition in online Latin and Arabic scripts

Posted in category: robotics/AI

Researchers at the University of Sfax, in Tunisia, have recently developed a new method to recognize handwritten characters and symbols in online scripts. Their technique, presented in a paper pre-published on arXiv, has already achieved remarkable performance on texts written in both the Latin and Arabic alphabets.

In recent years, researchers have created deep learning-based architectures that can tackle a variety of tasks, including image classification, natural language processing (NLP), and many more. Handwriting recognition systems are computer tools that are specifically designed to recognize characters and other hand-written symbols in a similar way to humans.

In their early years of life, in fact, human beings innately develop the ability to understand different types of handwriting by identifying specific characters both individually and when grouped together. Over the past decade or so, many studies have tried to replicate this ability in machines, as this would ultimately enable more advanced and automatic analyses of handwritten texts.
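The general idea behind online handwriting recognition can be sketched with a small recurrent model that classifies a sample from its sequence of pen-trajectory points. The sketch below is a generic PyTorch illustration under our own assumptions (the input features, class count, and architecture are placeholders), not the Sfax team’s actual system.

# Minimal sketch (not the Sfax authors' model): a bidirectional LSTM that
# classifies an online handwriting sample from its pen-trajectory points.
import torch
import torch.nn as nn

class StrokeClassifier(nn.Module):
    def __init__(self, num_classes, input_dim=3, hidden_dim=128):
        super().__init__()
        # Each time step is one pen sample, e.g. (x, y, pen-down flag).
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.head = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, strokes):                 # strokes: (batch, time, 3)
        outputs, _ = self.rnn(strokes)
        return self.head(outputs[:, -1, :])     # classify from the final state

model = StrokeClassifier(num_classes=28)        # e.g. 28 Arabic letter classes
logits = model(torch.randn(4, 120, 3))          # 4 samples, 120 pen points each
print(logits.shape)                             # torch.Size([4, 28])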

Aug 21, 2019

Intel Details Its Nervana Inference and Training AI Cards

Posted in categories: futurism, robotics/AI

Hot Chips 31 is underway this week, with presentations from a number of companies. Intel has decided to use the highly technical conference to discuss a variety of products, including major sessions focused on the company’s AI division. AI and machine learning are viewed as critical areas for the future of computing, and while Intel has tackled these fields with features like DL Boost on Xeon, it’s also building dedicated accelerators for the market.

The NNP-I 1000 (Spring Hill) and the NNP-T (Spring Crest) are intended for two different markets, inference and training. “Training” is the work of creating and teaching a neural network how to process data in the first place. Inference refers to the task of actually running the now-trained neural network model. It requires far more computational horsepower to train a neural network than it does to apply the results of that training to real-world categorization or classification tasks.
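To make that distinction concrete, here is a minimal, generic PyTorch sketch (our own illustration, not Nervana-specific code): training loops over many batches and updates weights, while inference is a single gradient-free forward pass.

# Toy illustration of the training/inference split described above.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: repeated forward + backward passes over many batches (the NNP-T's job).
for _ in range(100):
    x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))
    optimizer.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    optimizer.step()

# Inference: one cheap forward pass per request, no gradients (the NNP-I's job).
with torch.no_grad():
    prediction = net(torch.randn(1, 16)).argmax(dim=1)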

Intel’s Spring Crest NNP-T is designed to scale out to an unprecedented degree, with a balance between tensor processing capability, on-package HBM, networking capability, and on-die SRAM to boost processing performance. The underlying chip is built by TSMC — yes, TSMC — on 16nm, with a 680mm² die and a 1,200mm² interposer. The entire assembly packs 27 billion transistors, four 8GB stacks of HBM2-2400 memory, and 24 Tensor Processing Clusters (TPCs) with a core frequency of up to 1.1GHz. Sixty-four lanes of SerDes HSIO provide 3.58Tbps of aggregate bandwidth, and the card supports an x16 PCIe 4.0 connection. Power consumption is expected to be between 150W and 250W. The chip is built using TSMC’s advanced CoWoS (Chip-on-Wafer-on-Substrate) packaging and carries 60MB of cache distributed across its various cores. CoWoS competes with Intel’s EMIB, but Intel has decided to build this hardware at TSMC rather than using its own foundries. Performance is estimated at up to 119 TOPS.
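As a back-of-the-envelope check on the memory figures quoted above, the short sketch below computes total HBM capacity and peak bandwidth, assuming standard HBM2 stack parameters (a 1024-bit interface per stack); the per-stack width is our assumption, not a number Intel disclosed.

# Rough HBM2 capacity/bandwidth arithmetic for four HBM2-2400 stacks,
# assuming the standard 1024-bit interface per stack (our assumption).
stacks = 4
capacity_per_stack_gb = 8
pins_per_stack = 1024          # standard HBM2 interface width
data_rate_gbps = 2.4           # HBM2-2400, per pin

total_capacity_gb = stacks * capacity_per_stack_gb                # 32 GB
bandwidth_gbs = stacks * pins_per_stack * data_rate_gbps / 8      # ~1228.8 GB/s

print(f"HBM2 capacity: {total_capacity_gb} GB, peak bandwidth: {bandwidth_gbs:.0f} GB/s")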

Aug 20, 2019

Flying Motorcycle Available for Preorder in Japan from Oct

Posted in categories: robotics/AI, transportation

Tokyo, Aug. 1 (Jiji Press)—Tokyo-based startup A.L.I. Technologies Inc. plans to start accepting reservations for its “hover bike” flying motorcycle from October, company officials said Thursday.

The startup company, which mainly develops small unmanned aerial vehicles, will unveil the product at the Tokyo Motor Show in autumn.

It aims to sell the product mainly to wealthy foreigners by touting its cutting-edge technologies.

Aug 20, 2019

Immortality through mind uploading

Posted in categories: biotech/medical, finance, law, life extension, robotics/AI, sustainability

In the 2015 movie “Chappie”, which is set in the near future, automated robots make up a mechanised police force. An encounter between two rival criminal gangs severely damages one of the law-enforcing robots (Agent 22). Its creator, Deon, recommends dismantling and recycling the damaged police droid. However, criminals kidnap Deon and force him to upload consciousness into the damaged robot so they can train it to rob banks. Chappie becomes the first robot with a mind of its own, able to think and feel like a human. Later in the movie, when his creator Deon is dying, it is Chappie’s turn to upload Deon’s consciousness into a spare robot through a neural helmet. Similarly, in “Avatar”, a 2009 Hollywood science-fiction film, a character named Grace connects with Eywa, the collective consciousness of the planet, and transfers her mind to her Avatar body, while another character, Jake, transfers his mind to his Avatar body, rendering his human body lifeless.

Mind uploading is a process by which we relocate the mind, an assemblage of the memories, personality, and attributes of a specific individual, from its original biological brain to an artificial computational substrate. Mind uploading is a central conceptual feature of many science fiction novels and films. For instance, Robin Hanson’s 2016 nonfiction book “The Age of Em: Work, Love and Life when Robots Rule the Earth” explores the implications of a future world in which researchers have learned to copy humans onto computers, creating “ems,” or emulated people, who quickly come to outnumber the real ones.

Aug 20, 2019

Nvidia’s Jetson Nano Puts AI In The Palm Of Your Hand

Posted in category: robotics/AI

We (TIRIAS Research) recently had an opportunity to evaluate the latest Jetson platform from Nvidia. At just 45mm x 70mm, the Jetson Nano is the smallest Artificial Intelligence (AI) platform form factor Nvidia has produced to date. The Jetson Nano is powered by the Tegra X1 SoC, which features four 1.43GHz Cortex-A57 CPU cores and a 128-core Maxwell GPU. The Jetson Nano also uses the same Jetpack Software Development Kit (SDK) as the other Jetson platforms, the TX2 and AGX Xavier, allowing for cross-platform development. For only $99, plus a little extra for accessories, the Jetson Nano is an amazing platform.

In addition to the Tegra X1 SoC, the Nano developer kit comes configured with 4GB of LPDDR4 memory and plenty of I/O options, including a MIPI CSI connector, four USB 3.0 Type-A ports, one USB 2.0 Micro-B port, one gigabit ethernet port, and 40 GPIO pins. The Nano can drive dual displays through its single DisplayPort and HDMI ports, has a microSD card slot for storage, and includes a somewhat hidden M.2 Key E connector for expansion modules/daughter cards that add optional functions like wireless connectivity. The Jetson Nano developer kit comes with a sizable heatsink for passive cooling, but has holes drilled for add-on fans. For our evaluation, we used a Noctua NF-A4x20 5V PWM fan and a Raspberry Pi MIPI Camera Module v2 from RS Components and Allied Electronics.
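For a feel of how the 40-pin header is typically driven, here is a hypothetical blink test using NVIDIA’s Jetson.GPIO Python library (its API mirrors RPi.GPIO); the choice of physical pin 12 is our arbitrary example, so check your own wiring before running anything like this.

# Hypothetical blink test on the Nano's 40-pin header using Jetson.GPIO;
# pin 12 is an arbitrary example choice, not a requirement of the board.
import time
import Jetson.GPIO as GPIO

OUTPUT_PIN = 12                      # physical pin number on the 40-pin header

GPIO.setmode(GPIO.BOARD)             # address pins by board position
GPIO.setup(OUTPUT_PIN, GPIO.OUT, initial=GPIO.LOW)
try:
    for _ in range(10):              # toggle the pin ten times
        GPIO.output(OUTPUT_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(OUTPUT_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                   # release the pin on exit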

For development software, the Nano runs an Ubuntu Linux OS and uses the Jetpack SDK, which supports Nvidia’s CUDA developer environment, as well as other common AI frameworks, such as TensorRT, VisionWorks, and OpenCV.
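As a closing illustration, this is roughly how one might grab a frame from the CSI-attached Raspberry Pi camera in Python using OpenCV’s GStreamer backend; the pipeline elements and parameters shown are typical for JetPack builds but may need adjusting for a specific release, so treat it as a sketch rather than a verified recipe.

# Sketch: capture one 720p frame from the CSI camera via GStreamer + OpenCV.
import cv2

PIPELINE = (
    "nvarguscamerasrc ! "
    "video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! "
    "nvvidconv ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! appsink"
)

cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
ok, frame = cap.read()               # grab a single frame from the camera
if ok:
    cv2.imwrite("nano_test_frame.jpg", frame)
cap.release()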