
What if Amazon moved its shipping centers downtown, with drones flying into the buildings to pick up deliveries?

This drone beehive is one of the ideas and patents that could be part of Amazon’s city of the future.

From Amazon Go, the shop with no checkout lines, and instant shipping and delivery, to hotels serviced by Alexa, Amazon Echo, and Amazon Air, this mini documentary takes a look at how Amazon and Jeff Bezos are designing, and investing in, technology and new services that will bring a futuristic city to life.

Select Amazon Go footage used under Creative Commons
Life with Neil — https://youtu.be/p_uxFcza69I



The U.S. Air Force plans to have an operational combat drone by 2023. The service plans to build out a family of unmanned aircraft, known as Skyborg, capable of carrying weapons and actively participating in combat. The Air Force’s goal is to build up a large fleet of armed, sort-of disposable jets that don’t need conventional runways to take off and land.

The Air Force, according to Aviation Week &amp; Space Technology, expects to have the first operational Skyborg aircraft ready by 2023. Skyborg will be available with both subsonic and supersonic engines, indicating both attack and fighter jet versions. The basic design (or designs) will likely be stealthy, carrying guided bombs, air defense suppression missiles, and air-to-air missiles inside internal weapons bays. Interestingly, according to AvWeek, the Air Force is considering Skyborg as a replacement not only for the MQ-9 Reaper attack drone but also for early versions of the F-16 manned fighter.

Researchers at Ben-Gurion University of the Negev (BGU) have determined how to pinpoint the location of a drone operator who may be operating maliciously or harmfully near airports or protected airspace by analyzing the flight path of the drone.

Drones (small commercial unmanned aircraft) pose significant security risks due to their agility, accessibility, and low cost. As a result, there is a growing need to develop methods for the detection, localization, and mitigation of malicious and other harmful drone operation.

The paper, which was led by senior lecturer and expert Dr. Gera Weiss from BGU’s Department of Computer Science, was presented at the Fourth International Symposium on Cyber Security, Cryptography and Machine Learning (CSCML 2020) on July 3rd.

Now that the world is in the thick of the coronavirus pandemic, governments are quickly deploying their own cocktails of tracking methods. These include device-based contact tracing, wearables, thermal scanning, drones, and facial recognition technology. It’s important to understand how those tools and technologies work and how governments are using them to track not just the spread of the coronavirus, but the movements of their citizens.

Contact tracing is one of the fastest-growing means of viral tracking. Although the term entered the common lexicon with the novel coronavirus, it’s not a new practice. The Centers for Disease Control and Prevention (CDC) says contact tracing is “a core disease control measure employed by local and state health department personnel for decades.”

Traditionally, contact tracing involves a trained public health professional interviewing an ill patient about everyone they’ve been in contact with and then contacting those people to provide education and support, all without revealing the identity of the original patient. But in a global pandemic, that careful manual method cannot keep pace, so a more automated system is needed.
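To make that automation concrete, here is a minimal, hypothetical sketch of how device-based exposure notification can work: phones broadcast rotating random identifiers over Bluetooth, a diagnosed user publishes their own identifiers, and each phone checks for overlap locally, so the original patient is never named. The identifier scheme and time scales below are simplified assumptions, not any specific country's protocol (real systems such as Google/Apple Exposure Notification rotate IDs roughly every 15 minutes).

```python
import secrets

def daily_ids(n_days=14):
    """One random rolling identifier per day (simplified assumption;
    real protocols rotate identifiers far more frequently)."""
    return [secrets.token_hex(16) for _ in range(n_days)]

def check_exposure(heard_ids, published_ids):
    """A phone compares identifiers it overheard via Bluetooth against
    the set published by diagnosed users. Matching happens on-device,
    which is how the original patient's identity stays hidden."""
    return sorted(set(heard_ids) & set(published_ids))

# Hypothetical encounter: Bob's phone overheard one of Alice's IDs.
alice = daily_ids()
bob_heard = ["deadbeef", alice[3]]
# Alice later tests positive and uploads her identifiers.
matches = check_exposure(bob_heard, alice)
print(len(matches))  # one match -> Bob is notified of possible exposure
```

The key design point is that only random identifiers ever leave the device, preserving the same confidentiality that manual tracing achieves by withholding the patient's name.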

Researchers from North Carolina State University have discovered that teaching physics to neural networks enables those networks to better adapt to chaos within their environment. The work has implications for improved artificial intelligence (AI) applications ranging from medical diagnostics to automated drone piloting.

Neural networks are an advanced type of AI loosely based on the way that our brains work. Our natural neurons exchange electrical impulses according to the strengths of their connections. Artificial neural networks mimic this behavior by adjusting numerical weights and biases during training sessions to minimize the difference between their actual and desired outputs. For example, a neural network can be trained to identify photos of dogs by sifting through a large number of photos, making a guess about whether each photo is of a dog, seeing how far off it is, and then adjusting its weights and biases until its guesses are closer to reality.
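The guess-measure-adjust loop described above can be sketched with a single artificial neuron. The "dog vs. not-dog" features and labels below are made up for illustration; the point is how the weights and bias get nudged to shrink the error.

```python
import numpy as np

# Toy classifier: one artificial neuron on made-up 2-feature examples.
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])
y = np.array([1.0, 1.0, 0.0, 0.0])   # 1 = dog, 0 = not a dog

rng = np.random.default_rng(0)
w, b = rng.normal(size=2), 0.0       # weights and bias, tuned by training

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):                # training loop
    pred = sigmoid(X @ w + b)        # make a guess for each example
    err = pred - y                   # see how far off the guesses are
    w -= 0.5 * (X.T @ err) / len(y)  # adjust the weights...
    b -= 0.5 * err.mean()            # ...and the bias, to reduce the error

print(np.round(sigmoid(X @ w + b)))  # guesses now match the labels
```

Real networks stack many such neurons in layers, but the principle is identical: repeat the guess, measure the gap, and adjust.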

The drawback to this is something called "chaos blindness"—an inability to predict or respond to chaos in a system. Conventional AI is chaos blind. But researchers from NC State's Nonlinear Artificial Intelligence Laboratory (NAIL) have found that incorporating a Hamiltonian function into neural networks better enables them to "see" chaos within a system and adapt accordingly.
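A rough way to see what the Hamiltonian buys you: instead of predicting the next state directly, a Hamiltonian-aware model works from a single energy function H(q, p) and derives the motion from its gradients (dq/dt = ∂H/∂p, dp/dt = -∂H/∂q), so the physics constraint is baked in. The sketch below is not NAIL's code; it uses an assumed spring-mass system with a hand-written H, where a network would instead learn H from data.

```python
import numpy as np

def H(q, p):
    """Energy of a unit-mass spring-mass oscillator (assumed system)."""
    return 0.5 * p**2 + 0.5 * q**2

def step(q, p, dt=0.01):
    """Symplectic Euler update driven purely by gradients of H."""
    p = p - dt * q    # dp/dt = -dH/dq, with dH/dq = q for this system
    q = q + dt * p    # dq/dt =  dH/dp, with dH/dp = p
    return q, p

q, p = 1.0, 0.0
e0 = H(q, p)
for _ in range(10_000):          # 100 time units of simulated motion
    q, p = step(q, p)
print(abs(H(q, p) - e0))         # energy drift stays small
```

Because every update respects the energy structure, trajectories stay physically plausible over long horizons, which is the property that helps such models cope with chaotic systems.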