
To me, it’s all common sense. If you step back and look at the technology landscape as a whole, along with AI, you start to see the barriers that truly spotlight where we have way too much hype around AI.

Take hacking, for example. If we truly had AI as advanced as it has been promoted, wouldn’t it make sense for researchers to solve the $120 billion money pit that is cybersecurity, earn billions to pour into their emerging AI tech, and ensure their AI investment wouldn’t face pushback from consumers who don’t trust that AI can’t be hacked? So I usually tread lightly around over-hyped technologies.

I do see great possibilities, and I have seen some amazing things and real promise from quantum computing; however, we will not truly realize its impact and full potential for another seven years. I will admit I see more promise in it than in the existing AI landscape, which is built on traditional digital technology that hackers have repeatedly proven can be broken.


Do you “believe” in AI?

The science fiction world is full of artificial intelligence (AI), but the reality of AI is still far away. According to an article featured in MIT Technology Review, the technology is still struggling and nowhere near expectations.

Will Knight, senior editor for AI at MIT Technology Review, wrote, “For all the remarkable progress being made in artificial intelligence, and warnings about the upheaval this might bring, the smartest computer would still struggle to make it through the eighth grade.”

Knight relates how programmers competed in a contest held by the Allen Institute for Artificial Intelligence (AI2). The programmers were challenged to write computer programs that could take an eighth-grade science test. The winner was announced during the annual meeting of the Association for the Advancement of Artificial Intelligence (AAAI).

Read more

Making the most of the low light in the muddy rivers where it swims, the elephant nose fish survives by being able to spot predators amongst the muck with a uniquely shaped retina, the part of the eye that captures light. In a new study, researchers looked to the fish’s retinal structure to inform the design of a contact lens that can adjust its focus.

Imagine a contact lens that autofocuses within milliseconds. That could be life-changing for people with presbyopia, a stiffening of the eye’s lens that makes it difficult to focus on close objects. Presbyopia affects more than 1 billion people worldwide, half of whom do not have adequate correction, said the project’s leader, Hongrui Jiang, Ph.D., of the University of Wisconsin, Madison. And while glasses, conventional contact lenses and surgery provide some improvement, these options all involve a loss of contrast and sensitivity, as well as difficulty with night vision. Jiang’s idea is to design contacts that continuously adjust in concert with one’s own cornea and lens to recapture a person’s youthful vision.

The project, for which Jiang received a 2011 NIH Director’s New Innovator Award (an initiative of the NIH Common Fund) funded by the National Eye Institute, requires overcoming several engineering challenges. They include designing the lens, algorithm-driven sensors, and miniature electronic circuits that adjust the shape of the lens, plus creating a power source — all embedded within a soft, flexible material that fits over the eye.
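To make that architecture a little more concrete, here is a toy sketch (in TypeScript, purely illustrative) of the kind of closed control loop such a lens would need: a sensor estimates how far away the wearer is looking, a small algorithm converts that distance into a target focal power, and the actuator nudges the lens shape toward it. Every name here (LensHardware, readGazeDistanceMm, setLensPower) is a hypothetical stand-in rather than anything from Jiang’s project, and the real device would run on tiny embedded circuits, not in a browser.

    // Toy autofocus control loop; all hardware names below are hypothetical stand-ins.
    interface LensHardware {
      readGazeDistanceMm(): number;         // sensor estimate of viewing distance, in millimeters
      setLensPower(diopters: number): void; // actuator that reshapes the lens to a given focal power
    }

    // Thin-lens approximation: the focusing power needed for an object at distance d
    // (in meters) is roughly 1 / d diopters, e.g. about 2.5 D for reading at 40 cm.
    function targetPower(distanceMm: number): number {
      const meters = Math.max(distanceMm, 100) / 1000; // clamp tiny readings to avoid divide-by-near-zero
      return 1 / meters;
    }

    function autofocusLoop(hw: LensHardware, cycles: number): void {
      let current = 0;
      for (let i = 0; i < cycles; i++) {
        const desired = targetPower(hw.readGazeDistanceMm());
        current += 0.5 * (desired - current); // ease toward the target to avoid jumpy focus
        hw.setLensPower(current);
      }
    }

    // Stand-in hardware so the loop can be exercised outside any device.
    const fakeLens: LensHardware = {
      readGazeDistanceMm: () => 400, // pretend the wearer is reading at 40 cm
      setLensPower: (d) => console.log(`lens power set to ${d.toFixed(2)} D`),
    };
    autofocusLoop(fakeLens, 5);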

Read more

This is extremely interesting and innovative to me. Why? Just imagine that your car (even a self-driving car) breaks down on a road somewhere 10 to 25 miles from the nearest gas station or town, and a backup system in the car alerts you that it has to switch over to tow mode, engages a robotic pull system, turns your flashers on, and then tows you to the nearest gas station, police station, etc. No more tow bills, and no more fear for the elderly or anyone else left exposed on the side of the road. By the way, the car engine keeps the microbots charged up.


A team of tiny robot ants pulls a car thousands of times its weight as part of an experiment at Stanford University.

Read more

Developers can now see an early preview of experimental WebAssembly support in an internal Microsoft Edge build with the AngryBots demo, alongside similar previews for Firefox and Chrome. WebAssembly is a new, portable, size- and load-time-efficient binary format suitable for compiling to the Web.

In the video above, a demo running in Microsoft Edge uses the preliminary WebAssembly support in the Chakra engine. The demo starts up significantly faster than just using asm.js, as the WebAssembly binaries have a smaller file size and parse more quickly than plain JavaScript, which needs to be parsed in the asm.js case.
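For a sense of what this looks like from the page’s side, here is a minimal sketch of loading a WebAssembly module from script. The file name demo.wasm and the exported add function are hypothetical; the point is simply that the browser receives a compact binary it can decode directly, instead of JavaScript source it must parse as in the asm.js case.

    // Minimal sketch: fetch a (hypothetical) compiled module and call one of its exports.
    async function loadDemo(): Promise<void> {
      const response = await fetch("demo.wasm");                  // small binary instead of JS source text
      const bytes = await response.arrayBuffer();                  // raw module bytes
      const { instance } = await WebAssembly.instantiate(bytes);   // decode, compile, and instantiate
      // "add" is only an illustrative export name; a real module defines its own exports.
      const add = instance.exports.add as (a: number, b: number) => number;
      console.log(add(2, 3));
    }

    loadDemo();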

Read more about WebAssembly on the Microsoft Edge Dev Blog.

Read more