
Wearable sensors are ubiquitous thanks to wireless technology that enables a person’s glucose concentrations, blood pressure, heart rate, and activity levels to be transmitted seamlessly from sensor to smartphone for further analysis.

Most wireless sensors today communicate via embedded Bluetooth chips that are themselves powered by small batteries. But these conventional chips and power sources will likely be too bulky for next-generation sensors, which are taking on smaller, thinner, more flexible forms.

Now MIT engineers have devised a new kind of wearable sensor that communicates wirelessly without requiring onboard chips or batteries. Their design, detailed in the journal Science, opens a path toward chip-free wireless sensors.

A team of Chinese scientists reports a new method for entangling photons that they say could make quantum networks and quantum computing more practical, according to the South China Morning Post.

In a study published in Nature Photonics, the team from the University of Science and Technology of China said that the new way to produce entangled photons is extremely efficient. The work was led by Jian-Wei Pan, one of the world’s leading quantum researchers, affiliated with the Hefei National Research Center for Physical Sciences at the Microscale and the CAS Center for Excellence in Quantum Information and Quantum Physics, both at the University of Science and Technology of China.

Entangled photons are needed for certain forms of quantum communication and computing. These technologies require the ability to efficiently produce large numbers of particles, in this case photons, that remain entangled even when separated by vast distances, in order to process and protect information. Specifically, the technology could be used in quantum relays for long-distance, attack-proof quantum communication, the newspaper reports.

Discussion panel with:
- Swati Chavda, a science fiction writer and former brain surgeon.
- Ron S. Friedman, a science fiction writer and an Information Technologies Specialist.

August 13th 2022, When Words Collide festival.

#booktube #authortube #writingtube #braincomputerinterface #neuralink.

Relevant links:

Swati Chavda website: https://www.swatichavda.com/

Ron S. Friedman website: https://ronsfriedman.wordpress.com/

If the combination of Covid-19 and remote work technologies like Zoom has undercut the role of cities in economic life, what might an even more robust technology like the metaverse do? Will it finally be the big upheaval that obliterates the role of cities and density? To paraphrase Airbnb CEO Brian Chesky: The place to be was Silicon Valley. It feels like now the place to be is the internet.

The simple answer is no, and for a basic reason. Wave after wave of technological innovation, from the telegraph and the streetcar to the telephone, the car, the airplane, and the internet, has brought predictions of the demise of physical location and the death of cities.


Remote work has become commonplace since the beginning of the Covid-19 pandemic. But the focus on daily remote work arrangements may miss a larger opportunity that the pandemic has unearthed: the possibility of a substantially increased labor pool for digital economy work. To measure interest in digital economy jobs, defined as jobs within the business, finance, art, science, information technology, and architecture and engineering sectors, the authors conducted extensive analyses of job searches on the Bing search engine, which accounts for more than a quarter of all desktop searches in the U.S. They found that not only did searches for digital economy jobs increase since the beginning of the pandemic, but those searches also became less geographically concentrated. The single biggest societal consequence of the dual trends of corporate acceptance of remote work and people’s increased interest in digital economy jobs is the potential geographic spread of opportunity.
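The article does not spell out how geographic concentration was measured, but one standard way to quantify it is a Herfindahl-style index over each region’s share of searches, which falls as activity spreads out. A toy illustration in Python (the metro shares below are invented, and this is not necessarily the authors’ metric):

def herfindahl(shares):
    """Concentration index: 1.0 means all searches in one place; lower means more spread out."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

# Hypothetical shares of digital economy job searches by metro area.
pre_pandemic = [0.45, 0.30, 0.15, 0.05, 0.05]    # concentrated in a few hubs
post_pandemic = [0.28, 0.24, 0.20, 0.15, 0.13]   # spread across more places

print(f"pre-pandemic concentration:  {herfindahl(pre_pandemic):.3f}")   # ~0.320
print(f"post-pandemic concentration: {herfindahl(post_pandemic):.3f}")  # ~0.215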


The labor market for jobs you can do on a laptop is expanding beyond major cities.

Neuralink, a company co-founded by Elon Musk, has been working on an implantable brain-machine interface since 2016. It previously demonstrated its progress by showing a macaque monkey controlling a cursor.

It’s unclear what kind of deal Musk has offered, whether a collaboration or a financial investment, since none of the players responded to or confirmed the report with the news organization.


Elon Musk’s last update on Neuralink, his company working on technology to connect the human brain directly to a computer, featured a pig with one of its chips implanted in its brain. Now Neuralink is demonstrating its progress by showing a macaque with one of the Link chips playing Pong. At first the monkey, named Pager, is shown using a joystick, and then eventually, according to the narration, playing using only its mind via the wireless connection.

Today we are pleased to reveal the Link’s capability to enable a macaque monkey, named Pager, to move a cursor on a computer screen with neural activity using a 1,024-electrode, fully implanted neural recording and data transmission device, termed the N1 Link. We have implanted the Link in the hand and arm areas of the motor cortex, a part of the brain that is involved in planning and executing movements. We placed Links bilaterally: one in the left motor cortex (which controls movements of the right side of the body) and another in the right motor cortex (which controls the left side of the body).

In an accompanying blog post, Neuralink says it’s building on decades of research that developed systems connecting “a few hundred electrodes” that needed a physical connector through the skin, compared to its N1 Link with 1,024 electrodes. According to Neuralink, “Our mission is to build a safe and effective clinical BMI system that is wireless and fully implantable that users can operate by themselves and take anywhere they go; to scale up the number of electrodes for better robustness and higher information throughput; and to automate the implant surgery to make it as rapid and safe as possible.”
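The post doesn’t describe the decoding pipeline itself, but cursor BMIs of this kind typically bin spike counts from each electrode and map them to cursor velocity with a learned linear decoder. A minimal, purely illustrative sketch of that idea in Python (the electrode count matches the N1 Link; the bin width, weights, and function names are hypothetical, not Neuralink’s implementation):

import numpy as np

N_ELECTRODES = 1024   # channels on the N1 Link
BIN_MS = 25           # hypothetical spike-count bin width

# W maps per-channel firing rates to (vx, vy) cursor velocity. In practice it
# would be fit from calibration data (e.g. joystick sessions); here it is
# random purely for illustration.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, N_ELECTRODES))

def decode_velocity(spike_counts):
    """Convert one bin of spike counts (length 1024) into cursor velocity."""
    rates = spike_counts / (BIN_MS / 1000.0)  # spikes per second
    return W @ rates                          # (vx, vy)

# Example: one bin of simulated spike counts.
counts = rng.poisson(lam=2.0, size=N_ELECTRODES)
vx, vy = decode_velocity(counts)
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")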

[Editor’s note: “Quantum Computing Will Be Bigger Than the Discovery of Fire!” was previously published in June 2022. It has since been updated to include the most relevant information available.]

It’s commonly appreciated that the discovery of fire was the most profound revolution in human history. And yesterday, I read that a major director at Bank of America (BAC) thinks a technology that hardly anyone is talking about these days could be more critical for humankind than fire!

Liquid crystals consist of rod-shaped molecules that slosh around like a fluid, and in those that are nematic, the molecules are mostly parallel to each other. For devices like TV screens, the odd molecule that faces the wrong way has to be removed during the manufacturing process, but these defects are key for building a liquid crystal computer, says Kos.

In ordinary computers, information is stored as a series of bits, electronic versions of 1s and 0s. In Kos and Dunkel’s liquid crystal computer, the information would instead be translated into a series of defective orientations. A liquid crystal defect could encode a different value for every different degree of misalignment with other molecules.

Electric fields could then be used to manipulate the molecules to perform basic calculations, similar to how simple circuits called logic gates work in an ordinary computer. Calculations on the proposed computer would appear as ripples spreading through the liquid.
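The proposal is described only at the level of the physics, but the encoding idea can be illustrated with a toy model: pick a small set of allowed defect orientations and let each one stand for a symbol, so that reading a sequence of defects back recovers the data. The sketch below illustrates that mapping only; the specific angles and the four-symbol alphabet are hypothetical, not the scheme in the paper.

import numpy as np

# Hypothetical alphabet: four allowed defect misalignment angles (degrees),
# each standing for a two-bit symbol.
ANGLES = {0.0: "00", 45.0: "01", 90.0: "10", 135.0: "11"}

def encode(bits):
    """Map a bit string to a sequence of defect orientations (degrees)."""
    lookup = {symbol: angle for angle, symbol in ANGLES.items()}
    return [lookup[bits[i:i + 2]] for i in range(0, len(bits), 2)]

def decode(orientations):
    """Snap each measured orientation to the nearest allowed angle and read its symbol."""
    allowed = np.array(list(ANGLES))
    return "".join(ANGLES[float(allowed[np.argmin(np.abs(allowed - theta))])]
                   for theta in orientations)

message = "0110"
defects = encode(message)             # [45.0, 90.0]
noisy = [d + 3.0 for d in defects]    # small misreads still decode correctly
assert decode(noisy) == message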

The team’s sensor design is a form of electronic skin, or “e-skin” — a flexible, semiconducting film that conforms to the skin like electronic Scotch tape. The heart of the sensor is an ultrathin, high-quality film of gallium nitride, a material that is known for its piezoelectric properties, meaning that it can both produce an electrical signal in response to mechanical strain and mechanically vibrate in response to an electrical impulse.

The researchers found they could harness gallium nitride’s two-way piezoelectric properties and use the material simultaneously for both sensing and wireless communication.

In their new study, the team produced pure, single-crystalline samples of gallium nitride, which they paired with a conducting layer of gold to boost any incoming or outgoing electrical signal. They showed that the device was sensitive enough to vibrate in response to a person’s heartbeat, as well as the salt in their sweat, and that the material’s vibrations generated an electrical signal that could be read by a nearby receiver. In this way, the device was able to wirelessly transmit sensing information, without the need for a chip or battery.
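The study reports the readout only in outline, but the underlying principle (a nearby receiver picking up the film’s narrowband vibration and watching its frequency shift as strain or sweat chemistry changes) can be sketched numerically. The sample rate, nominal resonance, and shift below are invented for illustration and are not the team’s actual signal chain:

import numpy as np

FS = 1_000_000     # receiver sample rate in Hz (hypothetical)
BASE_F = 200_000   # nominal resonant frequency of the film in Hz (hypothetical)

def received_tone(freq_hz, seconds=0.1):
    """Simulate the narrowband signal re-radiated by the vibrating film, plus noise."""
    t = np.arange(int(FS * seconds)) / FS
    return np.sin(2 * np.pi * freq_hz * t) + 0.1 * np.random.randn(t.size)

def peak_frequency(signal):
    """Locate the strongest spectral component with an FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / FS)
    return float(freqs[np.argmax(spectrum)])

# A heartbeat or a change in sweat salinity nudges the resonance; the receiver
# sees it as a frequency shift relative to the nominal value.
shifted = received_tone(BASE_F + 150)   # 150 Hz shift, hypothetical
print(f"measured shift: {peak_frequency(shifted) - BASE_F:.0f} Hz")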

Nvidia shared more performance benchmarks, but as with all vendor-provided performance data, you should take these numbers with a grain of salt. These benchmarks also come with the added caveat that they are conducted pre-silicon, meaning they’re emulated projections that haven’t been tested with actual silicon yet and are “subject to change.” As such, sprinkle some extra salt.

Nvidia’s new data point here is a score of 370 for a single Grace CPU in the SPECrate 2017 Integer benchmark. This places the chip right in the range we would expect: Nvidia has already shared a multi-CPU figure, claiming a score of 740 for two Grace CPUs in the same benchmark, so the single-chip result implies linear scaling across two chips.

AMD’s current-gen EPYC Milan chips, the current performance leader in the data center, have posted SPEC results ranging from 382 to 424 apiece, meaning the highest-end x86 chips will still hold the lead. However, Nvidia’s solution will have many other advantages, such as power efficiency and a more GPU-friendly design.
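Taking the quoted figures at face value, the scaling claim and the gap to EPYC Milan can be checked with simple arithmetic (all numbers are the vendor and SPEC figures cited above):

# SPECrate 2017 Integer figures quoted above (vendor / pre-silicon caveats apply).
grace_single = 370
grace_dual = 740
epyc_milan_low, epyc_milan_high = 382, 424

scaling = grace_dual / (2 * grace_single)         # 1.00 -> perfectly linear
lead_low = epyc_milan_low / grace_single - 1      # ~3% ahead of one Grace
lead_high = epyc_milan_high / grace_single - 1    # ~15% ahead of one Grace

print(f"dual-chip scaling efficiency: {scaling:.0%}")
print(f"EPYC Milan lead over a single Grace CPU: {lead_low:.0%} to {lead_high:.0%}")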