
Nanophotonic system allows optical ‘deep learning’

“Deep Learning” computer systems, based on artificial neural networks that mimic the way the brain learns from an accumulation of examples, have become a hot topic in computer science. In addition to enabling technologies such as face- and voice-recognition software, these systems could scour vast amounts of medical data to find patterns that could be useful diagnostically, or scan chemical formulas for possible new pharmaceuticals.

But the computations these systems must carry out are highly complex and demanding, even for the most powerful computers.

Now, a team of researchers at MIT and elsewhere has developed a new approach to such computations, using light instead of electricity, which they say could vastly improve the speed and efficiency of certain deep learning computations. Their results appear today in the journal Nature Photonics (“Deep learning with coherent nanophotonic circuits”) in a paper by MIT postdoc Yichen Shen, graduate student Nicholas Harris, professors Marin Soljacic and Dirk Englund, and eight others.
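The heavy lifting in deep-learning inference is repeated matrix multiplication, and that is the operation the researchers propose to carry out with light. As a rough illustration only (this is not the authors' code, and the layer sizes are invented), a dense layer's forward pass in NumPy makes the bottleneck explicit; the matrix-vector product is the step a programmable photonic circuit would perform optically:

```python
import numpy as np

# Minimal sketch (not the authors' implementation): a fully connected layer's
# forward pass is dominated by a matrix-vector product, which is the operation
# a programmable photonic circuit would carry out with light instead of electronics.

def layer_forward(W, x, b):
    """One dense layer: matrix-vector product plus a nonlinearity."""
    z = W @ x + b                 # the step proposed to run optically
    return np.maximum(z, 0.0)     # ReLU nonlinearity

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 128))   # hypothetical layer weights
x = rng.standard_normal(128)         # hypothetical input activations
b = np.zeros(64)
print(layer_forward(W, x, b).shape)  # (64,)
```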

Think you can develop machines that keep on learning?

You have until 6/30 to submit a proposal to DARPA’s Lifelong Learning Machines (L2M) program.

Here’s the vision: muster all the creativity you can toward developing fundamentally new machine-learning approaches that enable systems to learn continually as they operate and to apply prior knowledge to novel situations. Current AI systems compute only with what they have been programmed or trained for in advance; they cannot learn from data encountered at execution time or adapt online to changes in real environments. The goal of L2M is substantially more capable systems that continually improve and update from experience.

Consult the Broad Agency Announcement for more information: https://www.fbo.gov/…/…/DARPA/CMO/HR001117S0016/listing.html
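To make the distinction concrete, here is a toy sketch (my own illustration, not a DARPA reference design or part of the BAA): a model frozen after training versus one that keeps making small online updates as the data stream drifts, which is the kind of behavior L2M is after. All numbers and the drift scenario are invented for the example:

```python
import numpy as np

# Toy sketch (not a DARPA reference design): a frozen model vs. one that keeps
# learning online as the environment drifts, in the spirit of the L2M goal.

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0])      # hypothetical "environment"
w_frozen = rng.standard_normal(2)   # fixed at execution time
w_online = w_frozen.copy()          # same start, but keeps updating
lr = 0.05

for t in range(2000):
    if t == 1000:
        w_true = np.array([-3.0, 0.5])   # the environment changes mid-stream
    x = rng.standard_normal(2)
    y = w_true @ x
    err = w_online @ x - y
    w_online -= lr * err * x             # one small gradient step per observation
    # the frozen model makes no update at execution time

print("frozen error:", abs(w_frozen @ x - y))
print("online error:", abs(w_online @ x - y))
```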

Zoltan Istvan’s Schedule for FreedomFest 2017

I’ll be on a panel and also doing an author’s roundtable (The Transhumanist Wager) at FreedomFest in Las Vegas on July 21. It’s one of the largest gatherings of free minds in the world and this year is the 10th anniversary. If you’re there, please say hello! Others are speaking on life extension and AI. Here’s my speaker’s page:


Check out what Zoltan Istvan will be attending at FreedomFest 2017.

What if we built spacecraft… IN SPACE?

We are incredibly excited to announce that Firmamentum, a division of Tethers Unlimited, Inc. (TUI), has signed a contract with the Defense Advanced Research Projects Agency (DARPA) to develop a system that will use in-space manufacturing and robotic assembly technologies to construct, on orbit, a small satellite capable of providing high-bandwidth satellite communications (SATCOM) services to mobile receivers on the ground.

Under the OrbWeaver Direct-to-Phase-II Small Business Innovation Research (SBIR) effort, Firmamentum aims to combine its technologies for in-space recycling, in-space manufacturing, and robotic assembly to create a system that could launch as a secondary payload on an Evolved Expendable Launch Vehicle (EELV). This system would recycle a structural element of that rocket, known as an EELV Secondary Payload Adapter (ESPA) ring, by converting the ring’s aluminum material into a very large, high-precision antenna reflector. The OrbWeaver™ payload would then attach this large antenna to an array of TUI’s SWIFT® software defined radios launched with the OrbWeaver payload to create a small satellite capable of delivering up to 12 gigabits per second of data to K-band very small aperture terminals (VSAT) on the ground.
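For a back-of-the-envelope sense of what a 12 gigabit-per-second link implies, the Shannon capacity formula C = B·log2(1 + SNR) relates data rate, bandwidth, and signal-to-noise ratio. The bandwidth values in the sketch below are hypothetical assumptions for illustration, not figures from TUI or DARPA:

```python
import math

# Back-of-the-envelope sketch: which bandwidth/SNR combinations could support
# the quoted 12 Gbps downlink, using Shannon capacity C = B * log2(1 + SNR).
# The bandwidths below are hypothetical assumptions, not figures from TUI or DARPA.

target_rate = 12e9  # 12 Gbps, from the announcement

for bandwidth_hz in (1e9, 2e9, 4e9):
    required_snr = 2 ** (target_rate / bandwidth_hz) - 1
    print(f"bandwidth {bandwidth_hz / 1e9:.0f} GHz -> "
          f"minimum SNR {10 * math.log10(required_snr):.1f} dB")
```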

In the Ruth Porat era at Alphabet, even robot video stars have to find some paying customers

Nice jibe at Boston Dynamics; they are only, uhh, the best legged-robot lab in the world. Google didn’t have a clue what they were doing when they bought Boston Dynamics, and thankfully it’s being sold now before they did any more damage to it.

I have a brilliant idea: let’s force them to work on wheeled robots. LOL 😛


Alphabet’s sale of a robotics business to Japan’s Softbank shows that CFO Ruth Porat is taking aim even at the company’s most advanced technologies.

Nike-backed Grabit has quietly raised $25 million for robots that handle what others can’t grasp

Robot arms have come a long way since the 1960s, when George C. Devol and Joseph Engelberger created the earliest industrial models. Those had two-finger grippers that, in retrospect, look fit to pluck a rubber ducky out of a bin in a carnival game, but nothing too sophisticated.

By now, robots in factories and warehouses can adjust their grip like human hands, or use suction and pliable materials to move objects wherever they need to go. Problems arise, however, when objects are porous, tiny, or need to be placed with great precision, as with materials handling in textiles, food, automotive and electronics manufacturing.

A startup called Grabit Inc., based in Sunnyvale, Calif., gets around problems with robot dexterity and grip by employing “electroadhesion” to move different materials. Yes, that’s the force that lifts strands of your hair away from your scalp when you rub a balloon on your head.
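Electroadhesion works by energizing electrodes in the gripper pad so that induced charges in the target object pull it against the pad. A very rough parallel-plate estimate of the clamping pressure is P ≈ ½·ε0·εr·E²; every parameter value in the sketch below is a hypothetical assumption for illustration, not a Grabit specification:

```python
# Rough parallel-plate estimate of electroadhesive clamping pressure,
# P = 0.5 * eps0 * eps_r * E^2. All values are hypothetical assumptions,
# not Grabit specifications.

EPS0 = 8.854e-12               # vacuum permittivity, F/m

voltage = 2000.0               # applied voltage across the pad, V (assumed)
dielectric_thickness = 50e-6   # insulating layer thickness, m (assumed)
eps_r = 3.0                    # relative permittivity of the dielectric (assumed)
pad_area = 0.01                # gripper pad area, m^2 (assumed, 10 cm x 10 cm)

e_field = voltage / dielectric_thickness        # electric field in the dielectric, V/m
pressure = 0.5 * EPS0 * eps_r * e_field ** 2    # clamping pressure, Pa
force = pressure * pad_area                     # holding force, N

print(f"clamping pressure ~ {pressure:.0f} Pa, holding force ~ {force:.1f} N")
```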

Are We Building Artificial Brains And Uploading Minds To The Cloud Right Now?

When people post their emotional responses to social media or send them through their free email accounts, they are loading their personal emotional responses, judgments, and biases into large computers and cloud databases. Everything we post and respond to becomes data somewhere. The truth is, hundreds of millions of people around the planet do this every day, 24 hours a day, seven days a week.


Are we uploading our brains to a cloud on a supercomputer and evolving into an artificially intelligent machine? This question and more…