Archive for the ‘information science’ category: Page 271

Oct 11, 2016

Caverlee, Hu receive DARPA grant to fill in the gaps of spatial-temporal datasets

Posted in category: information science

The Defense Sciences Office at the Defense Advanced Research Projects Agency (DARPA) has awarded Dr. James Caverlee and Dr. Xia “Ben” Hu a Next Generation Social Science (NGS2) grant to complete their collaborative research project, HELIOS, named after the Greek god with the ability to see the invisible.

Along with being a part of the Texas A&M Engineering Experiment Station’s (TEES) Center for Digital Libraries, Caverlee is an associate professor and Hu is an assistant professor in the Department of Computer Science and Engineering at Texas A&M University.

The HELIOS project aims to create new computational methods and algorithms to fill in the gaps in rapidly evolving spatial-temporal datasets, which are datasets that measure both space and time. These datasets are often missing information, which prevents accurate assessments of time and location.
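
HELIOS’s own methods are not described here, but the underlying problem can be illustrated with a simple baseline: treat the data as a location-by-time matrix with missing entries and interpolate the gaps from each location’s observed time steps. The sketch below is a hypothetical baseline, not the team’s algorithm; the function name and data are made up.

```python
import numpy as np

def fill_gaps(data):
    """Fill missing entries (NaN) in a location-by-time matrix by
    interpolating each location's series along the time axis."""
    filled = data.copy()
    times = np.arange(data.shape[1])
    for row in filled:                          # one spatial location at a time
        observed = ~np.isnan(row)
        if observed.any():
            # np.interp holds the edge values flat where extrapolation is needed
            row[~observed] = np.interp(times[~observed],
                                       times[observed], row[observed])
    return filled

# Hypothetical example: 3 sensor locations, 6 time steps, some readings missing.
readings = np.array([
    [1.0, np.nan, 3.0, 4.0, np.nan, 6.0],
    [2.0, 2.5, np.nan, np.nan, 4.5, 5.0],
    [np.nan, 1.0, 1.5, 2.0, 2.5, np.nan],
])
print(fill_gaps(readings))
```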

Read more

Oct 10, 2016

Sterling’s Flash Crash was long overdue—and there will be many more

Posted in categories: computing, information science, robotics/AI

Researchers at Sapience.org foresee market instability intensified by the computer trading ‘arms race’

FOR IMMEDIATE RELEASE

Last Friday sterling experienced a dramatic, ultrafast crash. It lost 10% of its value in minutes after the Asian markets opened — a decline usually reserved for declarations of war, major earthquakes and global catastrophes — and bounced right back. Although the affected exchanges have yet to release the details, computer trading algorithms almost certainly played a key role. Just like the 2010 Flash Crash, Friday’s event is characteristic of Ultrafast Extreme Events [1]: split-second spikes in trading caused by ever smarter algorithms razor-focused on making ever-quicker profits. And the arms race is only likely to intensify as computing speed accelerates and AI algorithms become more intelligent.
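
Ultrafast extreme events are usually characterized as unusually large price moves that complete within a very short window. As a rough illustration only (the window size, threshold and tick data below are invented, and this is not Sapience.org’s methodology), a simple detector can scan a tick series for windows whose peak-to-trough move exceeds a chosen fraction of the price:

```python
import numpy as np

def find_extreme_events(prices, window=10, threshold=0.02):
    """Return starting indices of windows whose peak-to-trough price move
    exceeds `threshold` (as a fraction of the window's opening price).

    `prices` is a 1-D array of tick-by-tick prices; `window` caps how many
    ticks an "ultrafast" event is allowed to span."""
    events = []
    for start in range(len(prices) - window + 1):
        chunk = prices[start:start + window]
        move = (chunk.max() - chunk.min()) / chunk[0]
        if move >= threshold:
            events.append(start)
    return events

# Invented tick series with one sharp dip-and-recovery around index 50.
ticks = np.array([1.30] * 50 + [1.17, 1.18, 1.22, 1.28] + [1.30] * 50)
print(find_extreme_events(ticks))   # windows overlapping the crash get flagged
```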

Read more

Oct 8, 2016

Robots Have Learned to Pool Their Experience to Acquire Basic Motor Skills

Posted in categories: information science, robotics/AI

In Brief

  • A task that would take one robot years to complete could be done in just a few weeks if multiple robots are allowed to communicate with one another.
  • As algorithms and technology advance, a robot cloud could help us make the best use of bots in our daily lives.

Robots, for all their helpfulness in performing tasks that we would rather not do (usually because those tasks are dangerous or boring), first need to be coded in order to do the work. These specific sets of commands tell the machines what exactly they need to do and define how to do it.
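
Pooling experience, as the summary above describes, sidesteps some of that hand-coding: many robots attempt the same skill in parallel and learn from the union of everyone’s trials. The sketch below is a toy illustration of the idea, not Google’s or anyone else’s actual system; the class and the success signal are hypothetical.

```python
import random

class SharedExperienceBuffer:
    """Experience pooled across many robots: each robot appends its own
    (state, action, outcome) trials and all robots learn from the union."""

    def __init__(self):
        self.trials = []

    def add(self, robot_id, state, action, outcome):
        self.trials.append((robot_id, state, action, outcome))

    def sample(self, n):
        return random.sample(self.trials, min(n, len(self.trials)))

# Hypothetical setup: 4 robots run 250 trials each instead of one robot
# running 1,000, so the pooled buffer reaches the same size in a quarter
# of the wall-clock time.
buffer = SharedExperienceBuffer()
for robot_id in range(4):
    for _ in range(250):
        state, action = random.random(), random.random()
        outcome = 1 if abs(state - action) < 0.1 else 0   # toy success signal
        buffer.add(robot_id, state, action, outcome)

print(len(buffer.trials), "pooled trials from 4 robots")
batch = buffer.sample(32)   # any robot can now train on everyone's experience
```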

Continue reading “Robots Have Learned to Pool Their Experience to Acquire Basic Motor Skills” »

Oct 6, 2016

Project Originally Funded By DARPA Seeks To Replace Bees With Tiny, Winged Robots

Posted in categories: drones, food, information science, internet, military, mobile phones, robotics/AI, transhumanism

Got a bee shortage? No problem, DARPA has you covered.


Following the news that the honeybee is now officially an endangered species as “colony collapse disorder” accelerates, it seems that a Harvard research team has the solution: robotic honeybees. Instead of attempting to save the bees by reducing the use of pesticides or revising safety standards for cell phone radiation, the focus has shifted to replacing the bees altogether. Harvard University researchers, led by engineering professor Robert Wood, have been tweaking “RoboBees” since their initial introduction in 2009. The bee-sized robots, made of titanium and plastic, represent a breakthrough in the field of micro-aerial vehicles. The components needed to create flying robots were previously too heavy to make such a small structure light enough to achieve flight. Current models weigh only 80 mg and have been fitted with sensors that detect light and wind velocity.

Researchers claim that the bees could artificially pollinate entire fields of crops and will soon be able to be programmed to live in an artificial hive, run coordination algorithms and communicate among themselves about methods of pollination and the locations of particular crops. In addition, RoboBees have been suggested for other uses, including searching disaster sites for survivors, monitoring traffic, and “military and police applications.” These applications could include using RoboBees to “scout for insurgents” on battlefields abroad or allowing police and SWAT teams to use the micro-robots to gather footage inside buildings.
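
To make the coordination idea concrete, here is a toy task-allocation sketch in which each robot repeatedly claims the nearest unclaimed crop location. This is purely hypothetical and not the Harvard team’s algorithm; the positions and the round-robin greedy strategy are invented for illustration.

```python
import math

def assign_targets(robots, crops):
    """Greedy allocation: robots take turns claiming the nearest unclaimed
    crop location, continuing from wherever they were last assigned.
    `robots` and `crops` are lists of (x, y) positions."""
    remaining = list(crops)
    plan = {i: [] for i in range(len(robots))}
    robot = 0
    while remaining:
        x, y = plan[robot][-1] if plan[robot] else robots[robot]
        nearest = min(remaining, key=lambda c: math.hypot(c[0] - x, c[1] - y))
        plan[robot].append(nearest)
        remaining.remove(nearest)
        robot = (robot + 1) % len(robots)       # round-robin through the swarm
    return plan

# Invented field: 3 robo-bees and 6 flowering crop locations.
bees = [(0, 0), (5, 0), (10, 0)]
flowers = [(1, 2), (2, 8), (6, 1), (7, 7), (9, 3), (11, 9)]
print(assign_targets(bees, flowers))
```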

Continue reading “Project Originally Funded By DARPA Seeks To Replace Bees With Tiny, Winged Robots” »

Oct 6, 2016

Panasonic uses human touch to transfer data

Posted in categories: business, information science, security

In an age when digital information can fly around the connected networks of the world in the blink of an eye, it may seem a little old-timey to consider delivering messages by hand. But that’s precisely what Panasonic is doing at CEATEC this week. The company is demonstrating a prototype communication system in which data is transmitted from one person to another through touch.

There’s very little information available on the system, but Panasonic says that the prototype uses electric field communication technology to move data from “thing-to-thing, human-to-human and human-to-thing.” Data transfer and authentication occur when the objects or people touch, with digital information stored in a source tag instantaneously moving to a receiver module – kind of like NFC tap-to-connect technology, but with people in the equation as well as devices.
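
Panasonic has not published protocol details, so the flow it describes (a source tag that releases its stored data to a receiver only when contact completes the link and authentication succeeds) can only be sketched abstractly. Every name and field below is hypothetical:

```python
class SourceTag:
    """Hypothetical model of a wearable source tag: it holds a payload and a
    credential, and releases the payload only over a completed touch link."""

    def __init__(self, payload, credential):
        self._payload = payload
        self._credential = credential

    def transfer(self, receiver, touching):
        if not touching:                      # no body-coupled link, no transfer
            return None
        if not receiver.authenticate(self._credential):
            return None                       # authentication failed
        return self._payload                  # data moves on contact


class ReceiverModule:
    """Hypothetical receiver that accepts a fixed set of credentials."""

    def __init__(self, accepted_credentials):
        self._accepted = set(accepted_credentials)

    def authenticate(self, credential):
        return credential in self._accepted


tag = SourceTag(payload="door-unlock-token", credential="employee-42")
reader = ReceiverModule(accepted_credentials={"employee-42", "employee-7"})
print(tag.transfer(reader, touching=True))    # -> door-unlock-token
print(tag.transfer(reader, touching=False))   # -> None
```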

Continue reading “Panasonic uses human touch to transfer data” »

Oct 5, 2016

Turning to the brain to reboot computing

Posted in categories: computing, information science, neuroscience, physics

Computation is stuck in a rut. The integrated circuits that powered the past 50 years of technological revolution are reaching their physical limits.

This predicament has computer scientists scrambling for new ideas: new devices built using novel physics, new ways of organizing units within computers and even algorithms that use new or existing systems more efficiently. To help coordinate these ideas, Sandia National Laboratories assisted in organizing the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Rebooting Computing, held Oct. 17–19.

Researchers from Sandia’s Data-driven and Neural Computing Dept. will present three papers at the conference, highlighting the breadth of potential non-traditional neural computing applications.

Read more

Oct 4, 2016

If you can solve these equations, you have the IQ of a Genius!

Posted in category: information science

If you understood in less than 10 seconds, you have the IQ of a Genius! Click share if you understand!

Read more

Sep 30, 2016

D-Wave Systems previews 2000-qubit quantum processor

Posted in categories: energy, information science, quantum physics

D-Wave 2000-qubit processor (credit: D-Wave Systems)

D-Wave Systems announced Tuesday (Sept. 28, 2016) a new 2000-qubit processor, doubling the number of qubits over the previous-generation D-Wave 2X system. The new system will enable larger problems to be solved and deliver performance improvements of up to 1000 times.

D-Wave’s quantum system runs a quantum-annealing algorithm to find the lowest points in a virtual energy landscape representing a computational problem to be solved. The lowest points in the landscape correspond to optimal or near-optimal solutions to the problem. The increase in qubit count enables larger and more difficult problems to be solved, and the ability to tune the rate of annealing of individual qubits will enhance application performance.
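
D-Wave’s machine does this with quantum effects in hardware, which can’t be reproduced in a few lines of code. What can be sketched is the classical cousin of the same idea, simulated annealing on a tiny Ising-style energy function, to show what “finding the lowest points in an energy landscape” means. This is an analogue for illustration, not D-Wave’s algorithm or API, and the couplings below are invented.

```python
import math
import random

def anneal(couplings, n_spins, steps=20000, t_start=2.0, t_end=0.01):
    """Classical simulated annealing on an Ising-style energy
    E(s) = -sum_{i<j} J[i][j] * s[i] * s[j], with each spin s[i] in {-1, +1}.
    The lowest-energy spin assignments correspond to the best solutions,
    mirroring the energy-landscape picture described above."""

    def energy(s):
        return -sum(couplings[i][j] * s[i] * s[j]
                    for i in range(n_spins) for j in range(i + 1, n_spins))

    spins = [random.choice([-1, 1]) for _ in range(n_spins)]
    current = energy(spins)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # cooling schedule
        i = random.randrange(n_spins)
        spins[i] *= -1                                      # propose a spin flip
        proposed = energy(spins)
        if proposed <= current or random.random() < math.exp((current - proposed) / t):
            current = proposed                              # accept the flip
        else:
            spins[i] *= -1                                  # reject: flip back
    return spins, current

# Invented 4-spin problem; positive couplings reward aligned spins.
J = [[0, 1, 0, -1],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]
print(anneal(J, n_spins=4))
```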

Continue reading “D-Wave Systems previews 2000-qubit quantum processor” »

Sep 29, 2016

IBM Neuromorphic chip hits DARPA milestone and has been used to implement deep learning

Posted in categories: information science, robotics/AI, supercomputing

IBM delivered on the DARPA SyNAPSE project with a one-million-neuron brain-inspired processor. The chip consumes merely 70 milliwatts and is capable of 46 billion synaptic operations per second per watt: literally a synaptic supercomputer in your palm.
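
Taken at face value, those two figures imply the chip’s total throughput: 46 billion synaptic operations per second per watt at 70 milliwatts comes to roughly 3.2 billion synaptic operations per second. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the TrueNorth figures quoted above.
sops_per_watt = 46e9     # synaptic operations per second, per watt
power_watts = 0.070      # 70 milliwatts

total_sops = sops_per_watt * power_watts
print(f"{total_sops:.2e} synaptic operations per second")   # ~3.22e+09
```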

Along the way—progressing through Phase 0, Phase 1, Phase 2, and Phase 3—we have journeyed from neuroscience to supercomputing, to a new computer architecture, to a new programming language, to algorithms, applications, and now to a new chip—TrueNorth.

Fabricated in Samsung’s 28 nm process, with 5.4 billion transistors, TrueNorth is IBM’s largest chip to date in transistor count. While simulating complex recurrent neural networks, TrueNorth consumes less than 100 mW of power and has a power density of 20 mW/cm².

Continue reading “IBM Neuromorphic chip hits DARPA milestone and has been used to implement deep learning” »

Sep 27, 2016

[AI Lab] Pepper robot learning “ball in a cup”

Posted in categories: entertainment, information science, robotics/AI

This video, produced by the AI Lab of SoftBank Robotics, shows how the Pepper robot learns to play the ball-in-a-cup game (“bilboquet” in French). The movement is first demonstrated to the robot by guiding its arm.

From there, Pepper has to improve its performance through trial-and-error learning. Even though the initial demonstration does not land the ball in the cup, Pepper can still learn to play the game successfully.
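
The clip doesn’t spell out the learning method, but the demonstrate-then-refine loop it describes is typically some form of policy search: start from the demonstrated movement parameters and keep random perturbations that score better on each trial. A heavily simplified, hypothetical sketch (the parameters and the scoring function stand in for real robot trials):

```python
import random

def refine_from_demonstration(demo_params, score, iterations=200, noise=0.05):
    """Trial-and-error refinement of a demonstrated movement.

    demo_params -- numbers describing the demonstrated arm motion
    score       -- callable rating a trial (higher is better); here it stands
                   in for actually swinging the ball and observing the result
    """
    best, best_score = list(demo_params), score(demo_params)
    for _ in range(iterations):
        candidate = [p + random.gauss(0, noise) for p in best]
        candidate_score = score(candidate)
        if candidate_score > best_score:        # keep perturbations that improve
            best, best_score = candidate, candidate_score
    return best, best_score

# Toy stand-in: the (unknown) ideal motion is [0.3, 0.7, 0.5], and the human
# demonstration got close but not close enough to land the ball in the cup.
ideal = [0.3, 0.7, 0.5]
demonstration = [0.35, 0.60, 0.55]
closeness = lambda p: -sum((a - b) ** 2 for a, b in zip(p, ideal))
print(refine_from_demonstration(demonstration, closeness))
```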

Continue reading “[AI Lab] Pepper robot learning ‘ball in a cup’” »