
Flying robots that deliver packages to people’s doorsteps are no longer science fiction. Companies including Amazon.com Inc., Alphabet Inc.’s Wing and Uber Technologies Inc. are starting the most advanced trials of drone delivery in U.S. history.

While commercial drone delivery faces many hurdles, government-approved tests by the tech giants will mark the first time consumers in parts of the country experience the technology. Wing this month started tests in Christiansburg, Va., while Uber says it will experiment in San Diego…


Facebook remains embroiled in a multibillion-dollar lawsuit over its facial recognition practices, but that hasn’t stopped its artificial intelligence research division from developing technology to combat the very misdeeds of which the company is accused. According to VentureBeat, Facebook AI Research (FAIR) has developed a state-of-the-art “de-identification” system that works on video, even live video. It alters key facial features of a video subject in real time using machine learning, tricking a facial recognition system into misidentifying the subject.

De-identification technology is not new, and entire companies, like Israeli AI and privacy firm D-ID, are dedicated to providing it for still images. There’s also a whole category of facial-recognition-fooling imagery you can wear yourself, called adversarial examples, which works by exploiting weaknesses in how computer vision software has been trained to identify certain characteristics. Take for instance a pair of sunglasses with an adversarial pattern printed onto them that can make a facial recognition system think you’re actress Milla Jovovich.
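To make the idea concrete, here is a minimal sketch of how an adversarial example works, using the fast gradient sign method on a toy logistic-regression “recognizer” rather than a real face-recognition network. The data, model, and step size are all illustrative assumptions; the point is only that a small, structured input change, chosen using the model’s own gradient, flips the model’s decision:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "recognizer": logistic regression separating two identity clusters.
X = np.vstack([rng.normal(-1, 0.3, (50, 2)), rng.normal(1, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)
b = 0.0
for _ in range(500):                      # plain gradient descent
    p = 1 / (1 + np.exp(-(X @ w + b)))    # predicted P(identity = 1)
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

def predict(x):
    return int((x @ w + b) > 0)

x = np.array([0.5, 0.5])                  # a point the model labels identity 1
# Fast gradient sign method: step against the true label along the sign of
# the input gradient (for this linear model, the gradient of the logit is w).
eps = 0.8
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))         # the structured perturbation flips the label
```

The adversarial glasses work on the same principle, except the perturbation is optimized to fit a wearable pattern and to survive being photographed by a camera.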

By contemplating the full spectrum of scenarios for the coming technological singularity, many can place their bets in favor of the Cybernetic Singularity, a sure path to digital immortality and godhood, as opposed to the AI Singularity, in which Homo sapiens is retired as a senescent parent. This meta-system transition from the networked Global Brain to the Gaian Mind is all about the evolution of our own individual minds; it’s all about our own Self-Transcendence. https://www.ecstadelic.net/top-stories/the-ouroboros-code-br…etaphysics #OuroborosCode



With its whirring rotary blades and extendable cutting arm it would not look out of place stalking the streets of a futuristic urban dystopia.

But Edinburgh University’s new robot has actually been developed to pootle sedately around the garden, pruning rose bushes and trimming topiary.

The semi-autonomous machine — dubbed Trimbot — is programmed to recognise leaves, stalks and flowers, so it does not inadvertently dead-head the living blooms.

Many advanced artificial intelligence projects say they are working toward building a conscious machine, based on the idea that brain functions merely encode and process multisensory information. The assumption goes, then, that once brain functions are properly understood, it should be possible to program them into a computer. Microsoft recently announced that it would spend US$1 billion on a project to do just that.

So far, though, attempts to build supercomputer brains have not even come close. A multi-billion-dollar European project that began in 2013 is now largely understood to have failed. That effort has shifted to look more like a similar but less ambitious project in the U.S., developing new software tools for researchers to study brain data, rather than simulating a brain.

Some researchers continue to insist that simulating neuroscience with computers is the way to go. Others, like me, view these efforts as doomed to failure because we do not believe consciousness is computable. Our basic argument is that brains integrate and compress multiple components of an experience, including sight and smell – which simply can’t be handled in the way today’s computers sense, process and store data.

To determine whether robots have a felt quality of experience, we’ll have to ask them, and ourselves, several probing questions — e.g., “Can the mind exist separately from the body?” “Can the system exist without the computer?”

They’ll also need to possess the right “architectural features.” In this video, NASA’s Dr. Susan Schneider explains more.

Space, commonly known as the final frontier, has left us in a state of awe since we first laid eyes on it. Inspired by numerous works of science fiction, we’ve made it our mission not only to explore space but to colonize its planets as we continue searching for a secondary home.

And while our efforts have been mildly successful thus far, a group of non-biological “creatures” have already achieved the difficult task of conquering space. They’re known as robots.

Whether on the International Space Station (ISS) or on another planet, these automated machines have extended our reach into the cosmos far better than any actual human hand. It all started in 1969, when the Soviets made the first attempt to land a robotic rover, known as Lunokhod 0, on the surface of the Moon. Unfortunately for the Soviets, the rover never arrived: its launch vehicle failed shortly after liftoff and the mission was lost.

Intel has unveiled two new processors as part of its Nervana Neural Network Processor (NNP) lineup with an aim to accelerate training and inferences drawn from artificial intelligence (AI) models.

The company showcased the AI-focused chips, dubbed Spring Crest and Spring Hill, for the first time on Tuesday at the Hot Chips Conference in Palo Alto, California, an annual tech symposium held each August.

Intel RealSense technologies offer a variety of vision‑based solutions that give your products the ability to understand and perceive the world in 3D. Combined with the Intel Neural Compute Stick 2, which redefined the AI-at-the-edge development kit, you get low‑power, high‑performance intelligent computer vision for your prototype at low cost.


Depth sensing meets plug-and-play AI inferencing at the edge with Intel® RealSense™ stereo depth cameras bundled with the Intel® Neural Compute Stick 2.