The process of systems integration (SI) functionally links together infrastructure, computing systems, and applications. SI can allow for economies of scale, streamlined manufacturing, and better efficiency and innovation through combined research and development.

New to the systems integration toolbox is the emergence of transformative technologies and, especially, the growing capability to integrate functions thanks to exponential advances in computing, data analytics, and materials science. These new capabilities are already having a significant impact on the futures we are able to build.

The systems integration process has served us well and will continue to do so. But it needs augmenting. We are on the cusp of scientific discovery that often combines the physical with the digital: the Techno-Fusion, or merging, of technologies. Like Techno-Fusion in music, Techno-Fusion in technology is a trend that experiments with, and transcends, traditional ways of integrating. Among many candidates, five areas stand out as good examples of the changing paradigm: Smart Cities and the Internet of Things (IoT); Artificial Intelligence (AI), Machine Learning (ML), Quantum and Supercomputing, and Robotics; Augmented Reality (AR) and Virtual Reality (VR) Technologies; Health, Medicine, and Life Sciences Technologies; and Advanced Imaging Science.

The odd, wavy pattern that results from viewing certain phone or computer screens through polarized glasses has led researchers to take a step toward thinner, lighter-weight lenses. Called moiré, the pattern is made by laying one material with opaque and translucent parts at an angle over another material of similar contrast.
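To make the idea concrete, here is a minimal Python sketch (illustrative only; the grating period and rotation angle are arbitrary) that overlays two identical stripe patterns at a small angle and displays the resulting coarse moiré fringes:

```python
# Minimal sketch: two identical stripe gratings overlaid at a small angle
# produce the coarse moire fringes described above. Values are illustrative.
import numpy as np
import matplotlib.pyplot as plt

size = 512          # image width/height in pixels
period = 8          # stripe period of each grating, in pixels
angle_deg = 3.0     # small mutual rotation between the two layers

y, x = np.mgrid[0:size, 0:size].astype(float)

def stripe_grating(x, y, period, angle_deg):
    """Binary stripe pattern (opaque/translucent) rotated by angle_deg."""
    theta = np.radians(angle_deg)
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates
    return (np.sin(2 * np.pi * xr / period) > 0).astype(float)

layer_a = stripe_grating(x, y, period, 0.0)
layer_b = stripe_grating(x, y, period, angle_deg)

moire = layer_a * layer_b   # light passes only where both layers are translucent

plt.imshow(moire, cmap="gray")
plt.title("Moire fringes from two gratings rotated by 3 degrees")
plt.axis("off")
plt.show()
```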

A team of researchers from Tokyo University of Agriculture and Technology (TUAT) in Japan has demonstrated that moiré metalenses—tiny, patterned lenses composed of artificial ‘meta’ atoms—can tune their focus across a wider range than previously seen. They published their results on November 23 in Optics Express.
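The excerpt doesn’t spell out the tuning mechanism, but moiré lenses of this general family are typically tuned by mutually rotating two patterned elements, and the resulting optical power grows roughly in proportion to the rotation angle. The Python sketch below only illustrates that relationship; the design constant and wavelength are invented numbers, not values from the TUAT paper:

```python
# Illustrative only: in a rotationally tuned moire lens the combined phase is
# roughly a * theta * r**2, which acts like a thin lens. Matching this to the
# thin-lens phase -pi * r**2 / (wavelength * f) gives |f| proportional to 1/theta.
import numpy as np

wavelength = 633e-9   # assumed wavelength [m] (invented for illustration)
a = 2.0e8             # assumed design constant [rad/m^2 per rad of rotation]

def focal_length_m(theta_rad):
    """Magnitude of the focal length implied by a mutual rotation of theta_rad."""
    return np.pi / (wavelength * a * abs(theta_rad))

for deg in (5, 10, 20, 45, 90):
    f_cm = focal_length_m(np.radians(deg)) * 100
    print(f"rotation {deg:3d} deg -> |focal length| ~ {f_cm:6.1f} cm")
```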

“Metalenses have attracted a lot of interest because they are so thin and lightweight, and could be used in ultra-compact imaging systems, like future smart phones, virtual reality goggles, drones or microbots,” said paper author Kentaro Iwami, associate professor in the TUAT Department of Mechanical Systems Engineering.

In 2016, combined venture investments in VR, AR, and mixed reality (MR) exceeded $1.25 billion. In 2019, that number increased more than 3X to $4.1 billion. And today, major players are bringing new, second-generation VR headsets to market that have the power to revolutionize the VR industry, as well as countless others. Already, VR headset sales volumes are expected to reach 30 million per year by 2022. For example, Facebook’s new Oculus Quest 2 headset has outsold its predecessor by 5X in the initial weeks of the product launch. With the FAANG tech giants pouring billions into improving VR hardware, the VR space is massively heating up. In this blog, we will dive into a brief history of VR, recent investment surges, and the future of this revolutionary technology.
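A quick sanity check on the growth figure quoted above, using the numbers from the paragraph:

```python
# Sanity check of the investment growth quoted above (figures from the paragraph).
investment_2016 = 1.25e9   # combined VR/AR/MR venture investment in 2016, USD
investment_2019 = 4.1e9    # combined VR/AR/MR venture investment in 2019, USD

growth = investment_2019 / investment_2016
print(f"2016 -> 2019 growth: {growth:.1f}x")   # ~3.3x, i.e. "more than 3X"
```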


“Virtual reality is not a media experience,” explains Bailenson. “When it’s done well, it’s an actual experience. In general our findings show that VR causes more behavior changes, causes more engagement, causes more influence than other types of traditional media.”

Nor is empathy the only emotion VR appears capable of training. In research conducted at USC, psychologist Skip Rizzo has had considerable success using virtual reality to treat PTSD in soldiers. Other scientists have extended this to the full range of anxiety disorders.

VR, especially when combined with AI, has the potential to deliver a top-shelf traditional education, plus all the empathy and emotional skills that traditional education has long lacked.

Honorable Mentions

One more scientific highlight this year is the use of light in neuroscience and tissue engineering. One study, for example, used lasers to directly print a human ear-like structure under the skin of mice, without a single surgical cut. Another used light to incept smell in mice, artificially programming an entirely new, never-seen-in-nature perception of a scent directly into their brains. Yet another study combined lasers with virtual reality to dissect how our brains process space and navigation, “mentally transporting” a mouse to a virtual location linked to a reward. To cap it off, scientists found a new way to use light to control the brain through the skull without surgery—though as of now, you’ll still need gene therapy. Given the implications of unauthorized “mind control,” that’s probably less of a bug and more of a feature.

We’re nearing the frustratingly slow, but sure, last gasp of Covid-19. The pandemic defined 2020, but science kept hustling along. I can’t wait to share with you what might come in the next year—may it be revolutionary, potentially terrifying, utterly bizarre or oddly heart-warming.

Devising an effective AGI value-loading system is of the utmost importance. The interlinking of enhanced humans with AGIs will bring about the Syntellect Emergence, which could be considered the essence of the Cybernetic Singularity. Future efforts in programming and infusing machine morality will surely combine top-down, bottom-up, and interlinking approaches. #AGI #FriendlyAI #Cybernetics #BenevolentAI #SyntheticIntelligence #CyberneticSingularity #Superintelligence


A simple solution to achieve this might be to combine select human minds (very liberal, loving, peaceful types) with brain-computer interfaces in a virtual environment. Work to raise an AGI that believes itself to be human, believes in self-sacrifice, and puts the good of others above its own. When this is achieved, let it sail through a virtual door to join humanity online.
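Purely as a toy illustration of the top-down-plus-bottom-up combination mentioned above (nothing below comes from the post; the rules and scores are invented), a hybrid value check might filter candidate actions with explicit constraints and then rank the survivors with a learned preference score:

```python
# Toy illustration of combining top-down and bottom-up value loading.
# Nothing here reflects a real AGI system; the rules and scores are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class Action:
    description: str
    features: dict   # e.g. {"harm": 0.0, "benefit": 0.8}

# Top-down: explicit, hand-written constraints that can veto an action outright.
def violates_rules(action: Action) -> bool:
    return action.features.get("harm", 0.0) > 0.1

# Bottom-up: a learned preference model (here a stand-in linear score).
def learned_preference(action: Action) -> float:
    return action.features.get("benefit", 0.0) - 2.0 * action.features.get("harm", 0.0)

def choose(actions: List[Action]) -> Action:
    """Filter by the top-down rules, then rank by the bottom-up score."""
    permitted = [a for a in actions if not violates_rules(a)]
    return max(permitted, key=learned_preference)

candidates = [
    Action("share resources", {"harm": 0.0, "benefit": 0.8}),
    Action("cut corners", {"harm": 0.3, "benefit": 0.9}),
]
print(choose(candidates).description)   # -> "share resources"
```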

It’s not a stretch to say that stretchable sensors could change the way soft robots function and feel. In fact, they will be able to feel quite a lot.

Cornell researchers have created a fiber-optic sensor that combines low-cost LEDs and dyes, resulting in a stretchable “skin” that detects deformations such as pressure, bending and strain. This sensor could give soft robotic systems – and anyone using augmented reality technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world.
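The excerpt doesn’t describe how the deformation is decoded, but the general idea of a dyed, stretchable light guide can be illustrated with a toy calculation: if each dyed section mainly attenuates one color, comparing per-channel light loss against an undeformed baseline hints at where, and how strongly, the skin was pressed. The section names and numbers below are invented for illustration, not taken from the Cornell work:

```python
# Toy illustration (not the Cornell design): localize deformation in a dyed,
# stretchable light guide by comparing per-color transmitted intensity to a baseline.
import numpy as np

# Hypothetical mapping: each dyed section mainly absorbs one color channel.
sections = ["fingertip (red dye)", "knuckle (green dye)", "palm (blue dye)"]

baseline = np.array([0.90, 0.85, 0.80])   # transmitted R, G, B with no deformation
reading  = np.array([0.89, 0.52, 0.79])   # transmitted R, G, B during a press

# Extra attenuation per channel relative to baseline; larger = more deformation.
extra_loss = (baseline - reading) / baseline

idx = int(np.argmax(extra_loss))
print(f"Most deformed region: {sections[idx]} (extra loss {extra_loss[idx]:.0%})")
```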