
I hear this author; however, can it pass military basic training/boot camp? I think not.


Back when Alphabet was known as Google, the company bought Boston Dynamics, makers of the amazingly advanced robot named Atlas. At the time, Google promised that Boston Dynamics would stop taking military contracts, as it had often done before. But here’s the open secret about Atlas: she can enlist in the US military anytime she wants.

Technology transfer is a two-way street. Traditionally we think of technology being transferred from the public to the private sector, with the internet as just one example. The US government invests in and develops all kinds of important technologies for war and espionage, and many of those technologies eventually make their way to American consumers in one way or another. When the government does so consciously with both military and civilian capabilities in mind, it’s called dual-use tech.

But just because a company might not actively pursue military contracts doesn’t mean that the US military can’t buy (and perhaps improve upon) technology being developed by private companies. The defense community sees this as more crucial than ever, as government spending on research and development has plummeted. About one-third of R&D was being done by private industry in the US after World War II, and two-thirds was done by the US government. Today it’s the inverse.

A new kind of laser tag is coming our way; however, this time when you’re tagged, you really are dead.


US officials tout the ‘unprecedented power’ of killing lasers to be released by 2023.

The US Army will deploy its first laser weapons by 2023, according to a recently released report.

Mary J. Miller, Deputy Assistant Secretary of the Army for Research and Technology, speaking before the House Armed Services Committee on Monday, stated, “I believe we’re very close,” when asked about progress with directed-energy, or laser, weapons.

Sounds like there are new options to consider around polyhistidine tagging.


Among bioprocessors, attitudes toward affinity purification range from a desire to move beyond old specificity/yield trade-offs to a willingness to explore new polyhistidine technology spin-offs, including systems for real-time detection.

Read more

Bad news if you use RoundUp.


Local councils across Australia that use the weed killer glyphosate on nature-strips and playgrounds are being warned that the chemical probably causes cancer.

The updated World Health Organisation (WHO) warning covers the herbicide, often trademarked as Roundup, which is also routinely used in household gardens and on farms.

The WHO’s International Agency for Research on Cancer (IARC) recently upgraded its assessment of glyphosate from “possibly” to “probably carcinogenic to humans”, though the level of risk is the same as the IARC’s findings on red meat.

I see articles and reports like the following about the military actually considering fully autonomous missiles, drones with missiles, etc. I have to ask myself what happened to logical thinking.


A former Pentagon official is warning that autonomous weapons would likely be uncontrollable in real-world situations thanks to design failures, hacking, and external manipulation. The answer, he says, is to always keep humans “in the loop.”

The new report, titled “Autonomous Weapons and Operational Risk,” was written by Paul Scharre, a director at the Center for a New American Security. Scharre used to work at the office of the Secretary of Defense, where he helped the US military craft its policy on the use of unmanned and autonomous weapons. Once deployed, these future weapons would be capable of selecting and engaging targets on their own, raising a host of legal, ethical, and moral questions. But as Scharre points out in the new report, “They also raise critically important considerations regarding safety and risk.”

As Scharre is careful to point out, there’s a difference between semi-autonomous and fully autonomous weapons. With semi-autonomous weapons, a human controller would stay “in the loop,” monitoring the activity of the weapon or weapons system. Should it begin to fail, the controller would just hit the kill switch. But with autonomous weapons, the damage that could be inflicted before a human is capable of intervening is significantly greater. Scharre worries that these systems are prone to design failures, hacking, spoofing, and manipulation by the enemy.
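The “in the loop” design Scharre describes can be sketched as a simple control pattern: no engagement proceeds without explicit human authorization, and a kill switch halts everything. This is purely an illustrative sketch; the class and method names here are hypothetical and do not reflect any real control system.

```python
# Illustrative sketch of the semi-autonomous, human-in-the-loop pattern:
# every action requires human sign-off, and a kill switch overrides all.
# All names here are hypothetical.

class SemiAutonomousSystem:
    def __init__(self):
        self.halted = False

    def kill_switch(self):
        """The human controller can halt the system at any time."""
        self.halted = True

    def engage(self, target: str, human_authorized: bool) -> bool:
        """Proceed only with explicit human authorization and
        only while the kill switch has not been thrown."""
        if self.halted or not human_authorized:
            return False
        return True

system = SemiAutonomousSystem()
print(system.engage("target-1", human_authorized=True))   # True: human signed off
print(system.engage("target-2", human_authorized=False))  # False: no human sign-off
system.kill_switch()
print(system.engage("target-3", human_authorized=True))   # False: system halted
```

A fully autonomous system, by contrast, would drop the `human_authorized` check entirely, which is exactly where Scharre locates the added risk: nothing stops a failing or spoofed system before a human can intervene.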

The UK is getting serious about quantum research, especially in its universities; all £204 million worth.


Universities and Science minister Jo Johnson has announced two major investments in science and engineering research totaling £204 million.

Forty UK universities will share in £167 million that will support doctoral training over a two-year period, while £37 million will be put into developing the graduate skills, specialist equipment and facilities that will put UK quantum technologies research at the forefront of the field.

The minister made the announcements during a visit to the University of Oxford, where he met academics working in the Networked Quantum Information Technologies (NQIT) Quantum Technology Hub, led by Professor Ian Walmsley. NQIT is one of four hubs that form part of the £270 million UK National Quantum Technologies Programme.

This could be very, very tricky for a number of reasons: 1) how will this work for people who develop laryngitis or some other illness that disrupts their speech? 2) what happens if a person uses a recorded voice or a voice changer? 3) what happens when a person’s voice changes as they get older, or after a medical procedure that permanently alters it? I could list more; however, I believe researchers will realize that voice will need to be paired with a second form of biometrics.
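The two-factor idea above can be sketched in a few lines: a voiceprint match score alone is never sufficient, and a second factor decides the borderline cases. The function name, the threshold value, and the scoring scale here are all hypothetical illustrations, not any vendor’s actual API.

```python
# Illustrative sketch (hypothetical names and thresholds): voice
# verification paired with a second factor, so a degraded or spoofed
# voice cannot authenticate on its own.

def verify_user(voice_score: float, second_factor_ok: bool,
                voice_threshold: float = 0.85) -> bool:
    """Accept only when the voiceprint score clears the threshold
    AND an independent second factor (PIN, fingerprint, etc.) passes."""
    voice_ok = voice_score >= voice_threshold
    return voice_ok and second_factor_ok

print(verify_user(0.92, True))   # True: both factors pass
print(verify_user(0.60, True))   # False: voice too degraded (e.g. laryngitis)
print(verify_user(0.92, False))  # False: second factor failed
```

The design choice is simply that the factors are AND-ed, not OR-ed: a recorded voice or voice changer that fools the voiceprint still fails without the second factor, and an illness that tanks the voice score falls back to re-enrollment rather than silently locking the user out of everything.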


Software firm Nuance believes that in the near future, there will be an expectation from customers to interact with technology in a more human-like manner.

Read more

The interesting thing about the articles I have seen on robots taking jobs is that the examples have come only from Asia and, in certain situations, the UK. I believe that companies across the US see some of the existing hacking risks (especially since the US has the highest incidence of hacking among countries), and that prevents them from simply replacing their employees with connected autonomous robots. I am also not sure that robotics is at the level of sophistication that most consumers want to spend a lot of money on at the moment.

Bottom line: until hacking is drastically reduced (if not finally eliminated), autonomous AI such as connected robots and humanoids will have a hard time being adopted by the US population at large.


In the future the global employment market will rely heavily on robots, artificial intelligence, and all sorts of automation.

In fact, technology is so crucial going forward that in January the World Economic Forum predicted that in less than five years more than five million human jobs will be replaced by automation, AI, and robots.

Just this week, a new report showed nearly a third of retail jobs in the UK could disappear by 2025, with many workers replaced by technology in one way or another.

Read more