
This Startup’s Computer Chips Are Powered by Human Neurons

As of right now, Cortical’s mini-brains have less processing power than a dragonfly brain. The company aims to make its mouse-neuron-powered chips capable of playing a game of “Pong,” CEO Hon Weng Chong told Fortune, following in the footsteps of AI company DeepMind, which used the game to test the power of its AI algorithms back in 2013.

“What we are trying to do is show we can shape the behavior of these neurons,” Chong told Fortune.

READ MORE: A startup is building computer chips using human neurons [Fortune].

Europe Gets One Step Closer To AI-Piloted Drones & eVTOL Aircraft

The biggest change worldwide in the last decade was probably the smartphone revolution, but overall, cities themselves still look pretty much the same. In the decade ahead, cities will change a lot more. Most of our regular readers probably think I am referring to how autonomous vehicle networks will start taking over, and how owning a car will become more like owning a horse. However, the real answer isn’t just autonomous vehicles on the roads; they will likely also compete with autonomous eVTOL aircraft carrying people between hubs.

Today, the European Union is moving one step closer to making this second part a reality. Together with Daedalean, an autonomous flight company we have covered in the past, EASA has published a new joint report, “The Learning Assurance for Neural Networks.”


Self-supervised learning is the future of AI

Despite the huge contributions of deep learning to the field of artificial intelligence, there’s something very wrong with it: It requires huge amounts of data. This is one thing that both the pioneers and critics of deep learning agree on. In fact, deep learning didn’t emerge as the leading AI technique until a few years ago because of the limited availability of useful data and the shortage of computing power to process that data.

Reducing the data-dependency of deep learning is currently among the top priorities of AI researchers.

In his keynote speech at the AAAI conference, computer scientist Yann LeCun discussed the limits of current deep learning techniques and presented the blueprint for “self-supervised learning,” his roadmap to solve deep learning’s data problem. LeCun is one of the godfathers of deep learning and the inventor of convolutional neural networks (CNN), one of the key elements that have spurred a revolution in artificial intelligence in the past decade.
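The core idea behind self-supervised learning can be sketched with a toy pretext task: treat the next sample of a signal as the training label, so the data supervises itself and no human annotation is needed. The sketch below uses a one-parameter linear predictor as a stand-in for a neural network; all names and values are illustrative and not taken from LeCun’s talk.

```python
import math
import random

# Self-supervised pretext task: predict the next sample of a signal
# from the current one. The training "label" is simply the next data
# point, so the data supervises itself -- no human annotation needed.
random.seed(0)
series = [math.sin(0.1 * t) + 0.01 * random.gauss(0, 1) for t in range(500)]
pairs = list(zip(series, series[1:]))  # (input, next-sample target)

# One-parameter linear predictor y_hat = w * x, fit by gradient descent
# on mean squared error (standing in for a real neural network).
w, lr = 0.0, 0.1
for _ in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in pairs) / len(pairs)
    w -= lr * grad

# w converges near cos(0.1), the optimal one-step linear predictor
# for a sinusoid, with a small residual error from the unmodeled term.
mse = sum((w * x - y) ** 2 for x, y in pairs) / len(pairs)
```

Real self-supervised systems apply the same principle at scale, for instance by masking words in text (as BERT does) or patches in images, then training large networks to fill in the blanks.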

How a Bouncy Ball Changed the Way I See the World

In the stillness and noise of the M.R.I., I picture what the magnet is doing to my brain. I imagine hydrogen protons aligning along and against the direction of its field. Bursts of radio waves challenge their orientation, generating signals that are rendered into images. Other than the sting of the contrast agent, the momentary changes in nuclear spin feel like nothing. “Twenty-five more minutes,” the radiologist says through the plastic headphones. Usually, I fall asleep.

I’ve had more than 50 scans since 2005, when I received a diagnosis of multiple sclerosis, and I now possess thousands of images of my brain and spine. Sometimes I open the files to count the spinal-cord lesions that are slowly but aggressively taking away my ability to walk. On days my right leg can clear the ground, it feels as if a corkscrew is twisting into my femur. I take halting steps, like a hapless robot, until it’s impossible to move forward. “Maybe in 10 years there will be a pill, or a treatment,” a doctor told me.

For now, even a sustained low fever could cause permanent disability, and medications that treat the disease have left me immunosuppressed, making fevers more likely. I quarantined before it was indicated, and what I miss most now, sheltering in place, are walks through my neighborhood park in Los Angeles with my dog, who gleefully chases the latest bouncy ball I’m hurling against the concrete. Her current favorite is the Waboba Moon Ball, which comes in highlighter fluorescent yellow and Smurf blue, among other colors. Technically Moon Balls are spherical polyhedrons. They sport radically dimpled surfaces, as if Buckminster Fuller had storyboarded an early pitch for “Space Jam.” Moon Balls are goofy, but they bounce 100 feet.

Quantum Computers: Should We Be Prepared?

Some foresee quantum computers solving some of the world’s most serious problems. Others, however, believe the advantages will be outweighed by the downsides: cost, for example, or the possibility that quantum computers simply can’t work, unable to perform the complex tasks demanded of them in the way we envision. The deciding factor will be whether manufacturers can achieve ‘quantum supremacy’ by reaching low error rates and outperforming current computers.

Hollywood has made many predictions about the future of artificial intelligence, some disturbing, others empowering. One of the fastest-growing research areas examines the use of quantum computers in shaping artificial intelligence. Indeed, some consider machine learning the yardstick by which the field is measured.

The idea of machine learning, ‘learning’ from new data without explicit instruction or programming, has existed since 1959, although we still haven’t quite arrived at the vision laid out by the likes of Isaac Asimov and Arthur C. Clarke. Even so, the belief is that quantum computing will help accelerate our progress. What was once a fringe idea shunned by the wider scientific community has grown into a popular and practical field worthy of serious investment.
