
Synthetic biology involves creating or re-engineering microbes or other organisms to perform specific tasks, like fighting obesity, monitoring chemical threats or creating biofuels. Essentially, biologists program single-celled organisms like bacteria and yeast much the same way one would program and control a robot.

But 10 years ago, it was extremely challenging to take a DNA sequence designed on a computer and turn it into a physical polymer that could carry out its task in a specific host, say a mouse or human cell. Now, thanks to a multitude of innovations across computing, engineering, biology and other fields, researchers can type out any DNA sequence they want, email it to a synthesis company, and receive the completed DNA construct in a week. Entire chromosomes, and even entire bacterial genomes, can be built this way.
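As a rough illustration of that first design step, here is a minimal sketch of a pre-order sanity check on a designed sequence. The thresholds, example sequence, and function names are all hypothetical, not any synthesis vendor's actual acceptance criteria:

```python
# Minimal sketch of a pre-order sanity check on a designed DNA sequence.
# The GC-content bounds and the example sequence are illustrative only.

VALID_BASES = set("ACGT")

def gc_content(seq: str) -> float:
    """Fraction of bases that are G or C."""
    return sum(base in "GC" for base in seq) / len(seq)

def check_construct(seq: str, gc_min: float = 0.4, gc_max: float = 0.6) -> list[str]:
    """Return a list of problems; an empty list means the design passes."""
    problems = []
    seq = seq.upper()
    if set(seq) - VALID_BASES:
        problems.append("sequence contains non-ACGT characters")
    gc = gc_content(seq)
    if not gc_min <= gc <= gc_max:
        problems.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
    return problems

print(check_construct("ATGGCGTACGTTAGC"))  # -> [] (this toy design passes)
```

Real vendors screen for much more (repeats, hairpins, restriction sites, biosecurity), but the point stands: the design side is now ordinary software.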

“Biology is the most powerful substrate for engineering that we know of,” said Christopher Voigt, Professor of Biological Engineering at MIT. “It’s more powerful than electrical engineering, mechanical engineering, materials science and others. Unlike all the other fields, we can look at what biology is already able to do. When we look at the natural world, we see things like the brain. That’s a complex place computing, electrical engineering and computer science can’t reach. The brain even constructs nanostructures very deliberately, something materials science has not accomplished.”

Read more

Today, Lawrence Livermore National Lab (LLNL) and IBM announced the development of a new Scale-up Synaptic Supercomputer (NS16e) that integrates 16 TrueNorth chips in a 4×4 array to deliver 16 million neurons and 4 billion synapses. LLNL will also receive an end-to-end software ecosystem that consists of a simulator; a programming language; an integrated programming environment; a library of algorithms as well as applications; firmware; tools for composing neural networks for deep learning; a teaching curriculum; and cloud enablement.

The $1 million computer has 16 IBM microprocessors designed to mimic the way the brain works.

IBM says it will be five to seven years before TrueNorth sees widespread commercial use, but the Lawrence Livermore test is a big step in that direction.
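TrueNorth's internals aren't something you can run at home, but the basic unit it emulates, a spiking neuron that integrates input and fires past a threshold, can be sketched in a few lines. This is a toy leaky integrate-and-fire model with made-up parameters, not IBM's actual neuron circuit:

```python
# Toy leaky integrate-and-fire neuron: the kind of spiking unit a
# neuromorphic chip implements in hardware. Parameters are illustrative.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Integrate weighted input each tick; emit a spike (1) when the
    membrane potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                      # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.4, 0.5, 0.1, 0.9]))  # -> [0, 0, 1, 0, 0]
```

The appeal of hardware like TrueNorth is that millions of such units run in parallel at very low power, rather than being simulated one at a time as above.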

Read more

DARPA’s new “Spectrum Collaboration Challenge” offers a $2 million top prize for teams that can apply machine learning to dynamically sharing the RF spectrum.


WASHINGTON, March 28, 2016 /PRNewswire-iReach/ — On March 23rd, 2016 DARPA announced its next Grand Challenge at the International Wireless Conference Expo in Las Vegas, Nevada. Program Manager Paul Tilghman of DARPA’s Microsystems Technology Office (MTO) made the announcement to industry leaders following the conference’s Dynamic Spectrum Sharing Summit. The challenge, named the “Spectrum Collaboration Challenge,” will motivate a machine learning approach to dynamically sharing the RF spectrum. A top prize of $2 million has been announced.

While mostly transparent to the typical cell phone or Wi-Fi user, spectrum congestion has been a long-standing problem for both the commercial sector and the Department of Defense. The appetite for wireless connectivity has grown so quickly over the last 30 years that the RF community has coined the term “spectrum scarcity.” RF bandwidth, the range of frequencies available for communicating information, is a relatively fixed resource, and advanced communication systems like LTE and military communications systems consume a lot of it. As spectrum planners prepare for the next big wave of connected devices, dubbed the Internet of Things, they wonder where they will find the bandwidth to support billions of new devices. Equally challenging is the military’s desire to connect every soldier on the battlefield while using these very same frequencies.

DARPA has chosen Barone Consulting to help develop the Spectrum Collaboration Challenge to address these critical infrastructure and military operation needs. In the tradition of other DARPA Grand Challenges, the Spectrum Collaboration Challenge gives experts across a wide variety of disciplines the opportunity to devise groundbreaking strategies and systems, compete openly for prizes, and in the process advance the state of the art and seed new technology communities. The tasks are to combine distributed sensing techniques, innovative RF transmit and receive technologies, and cutting-edge machine learning algorithms to create radio networks capable of learning to collaborate with other unknown radio networks in real time.
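The competition framework hasn't been detailed yet, but one toy way to picture a radio “learning to share” is to treat channel selection as a multi-armed bandit, where the radio learns from experience which channels are least contested. Everything below (channel count, reward probabilities, the epsilon-greedy rule) is an illustrative assumption, not the challenge's actual setup:

```python
import random

# Toy epsilon-greedy radio: learns which of N channels gives the best
# throughput (reward) by balancing exploration against exploitation.
# The clear-channel probabilities stand in for unknown interference
# from other networks; all numbers here are illustrative.

N_CHANNELS = 5
TRUE_CLEAR_PROB = [0.2, 0.5, 0.9, 0.3, 0.6]  # unknown to the radio

def run(steps=10_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    counts = [0] * N_CHANNELS
    values = [0.0] * N_CHANNELS        # running mean reward per channel
    for _ in range(steps):
        if rng.random() < epsilon:     # explore a random channel
            ch = rng.randrange(N_CHANNELS)
        else:                          # exploit the best estimate so far
            ch = max(range(N_CHANNELS), key=lambda c: values[c])
        reward = 1.0 if rng.random() < TRUE_CLEAR_PROB[ch] else 0.0
        counts[ch] += 1
        values[ch] += (reward - values[ch]) / counts[ch]
    return max(range(N_CHANNELS), key=lambda c: values[c])

print("learned best channel:", run())  # usually channel 2 (0.9 clear prob)
```

The actual challenge is far harder: multiple learning networks sharing the same band must cooperate with each other rather than each grabbing the quietest channel for itself.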

Looking for partners.


[Via Satellite 03-28-2016] The U.S. Defense Advanced Research Projects Agency (DARPA) is reviving its in-orbit servicing efforts through a new public-private partnership program called Robotic Servicing of Geosynchronous Satellites (RSGS). Under the RSGS vision, the partners would attach a DARPA-developed modular toolkit, including hardware and software, to a privately developed spacecraft to create a commercially owned and operated Robotic Servicing Vehicle (RSV). DARPA would contribute the robotics technology, such as the previously developed Front End Robotic Enabling Near-Term Demonstration (FREND) robotic arm, along with expertise and a government-provided launch. The commercial partner would contribute the satellite that carries the robotic payload, the integration of that payload, and the mission operations center and staff.

DARPA seeks to develop and demonstrate the RSV on orbit within the next five years. The agency’s goals include demonstrating safe, reliable, useful and efficient operations in or near Geostationary Earth Orbit (GEO), demonstrating on live GEO satellites in collaboration with commercial and U.S. government spacecraft operators, and supporting the development of a servicer spacecraft with sufficient propellant and payload robustness to enable dozens of missions over several years.

Deep neural networks (DNNs) can be taught nearly anything, including how to beat us at our own games. The problem is that training AI systems ties up big-ticket supercomputers or data centers for days at a time. Scientists from IBM’s T.J. Watson Research Center think they can cut the horsepower and learning times drastically using “resistive processing units,” theoretical chips that combine CPU and non-volatile memory. Those could speed up data movement by orders of magnitude, resulting in systems that can do tasks like “natural speech recognition and translation between all world languages,” according to the team.

So why does it take so much computing power and time to teach AI? The problem is that modern neural networks like Google’s DeepMind or IBM Watson must perform billions of tasks in parallel. That requires numerous CPU-to-memory calls, which quickly add up over billions of cycles. The researchers considered new storage tech like resistive RAM, which can permanently store data at DRAM-like speeds. However, they eventually arrived at the idea for a new type of chip called a resistive processing unit (RPU) that puts large amounts of resistive RAM directly onto a CPU.
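To see why memory traffic dominates: the workhorse operation in DNN training is a weighted sum over millions of parameters, and on a conventional chip every weight must be fetched from memory each time it is used. Here is a plain-Python sketch of that core operation (illustrative only; an RPU-style array would perform each multiply-accumulate at the memory cell instead of fetching weights to the processor):

```python
# The workhorse of DNN training is the weighted sum y = W.x (plus its
# transpose during backpropagation). In this software version, every
# weight W[i][j] is read from memory each time it is used -- exactly
# the traffic an in-memory resistive array would avoid.

def matvec(W, x):
    """y[i] = sum_j W[i][j] * x[j], one memory read per weight."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[0.1, -0.2, 0.4],
     [0.3,  0.0, 0.5]]
x = [1.0, 2.0, 3.0]
print(matvec(W, x))  # -> approximately [0.9, 1.8]
```

A production network repeats this over millions of weights, billions of times, which is why moving the arithmetic into the memory itself is such an attractive shortcut.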

Read more

Annual sales of drones in the U.S. will hit 2.5 million this year and swell to 7 million by 2020, according to a projection from the Federal Aviation Administration.
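For scale, that forecast implies roughly 29% compound annual growth, assuming “this year” means 2016:

```python
# Implied compound annual growth rate of the FAA forecast:
# 2.5 million units in 2016 growing to 7 million by 2020 (four years).
cagr = (7.0 / 2.5) ** (1 / 4) - 1
print(f"{cagr:.1%}")  # -> 29.4%
```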

Unmanned aircraft purchases are growing both for hobbyists and for commercial ventures that perform inspections, assist farmers, and survey construction sites, according to the agency’s annual forecast of aviation activity, released on Thursday.

Read more

Companies that sell personal data should pay a percentage of the resulting revenue into a Data Mining Royalty Fund that would provide annual payments to U.S. citizens, much as the Alaska Permanent Fund distributes oil revenues to Alaskans.


A viral video released in February showed Boston Dynamics’ new bipedal robot, Atlas, performing human-like tasks: opening doors, tromping about in the snow, lifting and stacking boxes. Tech geeks cheered and Silicon Valley investors salivated at the potential end to human manual labor.

Shortly thereafter, White House economists released a forecast that calculated more precisely whom Atlas and other forms of automation are going to put out of work. Most occupations that pay less than $20 an hour are likely to be, in the words of the report, “automated into obsolescence.”

Read more

They are pushing for fully automated robotic cargo ships. We already have robots to load and unload cargo ships; in a few years there probably won’t be a single person left working on a dock.


TraPac LLC’s Los Angeles shipping terminal offers a window into how global trade will soon move: highly automated systems and machinery handling a flood of goods amid new free-trade accords.

Read more