The announcement comes shortly after IBM announced it would replace 7,800 jobs with AI.

Earlier this month, IBM’s CEO announced that the company could replace at least 7,800 employees with artificial intelligence (AI) over the next five years. Now another startling announcement in the ‘Will AI replace humans?’ debate has come to the fore.

BT, a prominent British multinational telecommunications firm, said it will become a ‘leaner business’ as it announced plans to shed up to 55,000 jobs by the end of the decade, mostly in the United Kingdom. Approximately 10,000 of those roles will be replaced by AI, according to a report by The Guardian.

According to Rolling Stone and other news outlets, the graduation of a group of students at Texas A&M University-Commerce is in question after they were accused of using ChatGPT to write their essays.

A Texas A&M University-Commerce professor took the drastic step of failing all of his students after suspecting them of using ChatGPT to write their papers, a decision that has delayed them from receiving their diplomas. According to Rolling Stone, the decision by the professor, Dr. Jared Mumm, appears flawed: he used ChatGPT itself to analyze each essay and judge whether the chatbot had generated it.


“I copy and paste your responses in [ChatGPT], and [it] will tell me if the program generated the content,” the professor wrote in an email, adding that he had tested each paper twice. Dr. Mumm then offered the class a makeup assignment to avoid the failing grade — which could otherwise, in theory, threaten their graduation status.

Artificial intelligence (AI) and the metaverse are some of the most captivating technologies of the 21st century so far. Both are believed to have the potential to change many aspects of our lives, disrupt different industries, and enhance the efficiency of traditional workflows. While these two technologies are often looked at separately, they’re more connected than we may think. Before we explore the relationship between AI and the metaverse, let’s start by defining both terms.

The metaverse is a concept describing a hypothetical future design of the internet. It features an immersive, 3D online world where users are represented by custom avatars and access information with the help of virtual reality (VR), augmented reality (AR), and similar technologies. Instead of accessing the internet via their screens, users access the metaverse via a combination of the physical and digital. The metaverse will enable people to socialize, play, and work alongside others in different 3D virtual spaces.

A similar arrangement was described in Neal Stephenson’s 1992 science-fiction novel Snow Crash. While it was perceived as pure fantasy a mere three decades ago, it seems it could become a reality sooner rather than later. Although the metaverse does not fully exist yet, some online platforms incorporate elements of it. For example, video games like Fortnite and Horizon Worlds port multiple elements of our day-to-day lives into the online world.

An exhibition dedicated to the work of British architect Norman Foster has opened at the Centre Pompidou in Paris, showcasing drawings and original models produced by the architect over the last six decades.

The exhibition, which according to the Norman Foster Foundation is the largest-ever retrospective display of Foster’s work, features around 130 of the architect’s projects including the Hong Kong and Shanghai Banking Corporation Headquarters, Hong Kong International Airport and Apple Park.

Designs that informed Foster’s work are also exhibited, including works by Chinese artist Ai Weiwei, French painter Fernand Léger, Romanian sculptor Constantin Brancusi and Italian painter Umberto Boccioni, and even cars, which the architect is passionate about.

Summary: Researchers have developed an artificial electronic skin (e-skin) capable of converting sensory inputs into electrical signals that the brain can interpret. This skin-like material incorporates soft integrated circuits and boasts a variety of sensory abilities, including temperature and pressure detection.

This advance could facilitate the creation of prosthetic limbs with sensory feedback or advanced medical devices. The e-skin operates at a low voltage and can endure continuous stretching without losing its electrical properties.

In my work, I build instruments to study and control the quantum properties of small things like electrons. In the same way that electrons have mass and charge, they also have a quantum property called spin. Spin defines how the electrons interact with a magnetic field, in the same way that charge defines how electrons interact with an electric field. The quantum experiments I have been building since graduate school, and now in my own lab, aim to apply tailored magnetic fields to change the spins of particular electrons.

Research has demonstrated that many physiological processes are influenced by weak magnetic fields. These processes include stem cell development and maturation, cell proliferation rates, genetic material repair, and countless others. These physiological responses to magnetic fields are consistent with chemical reactions that depend on the spin of particular electrons within molecules. Applying a weak magnetic field to change electron spins can thus effectively control a chemical reaction’s final products, with important physiological consequences.

Currently, a lack of understanding of how such processes work at the nanoscale level prevents researchers from determining exactly what strength and frequency of magnetic fields cause specific chemical reactions in cells. Current cell phone, wearable, and miniaturization technologies are already sufficient to produce tailored, weak magnetic fields that change physiology, both for good and for bad. The missing piece of the puzzle is, hence, a “deterministic codebook” of how to map quantum causes to physiological outcomes.

A research group led by Professor Minoru Osada (he, him) and postdoctoral researcher Yue Shi (she, her) at the Institute for Future Materials and Systems (IMaSS), Nagoya University in Japan, has developed a new technology to fabricate nanosheets, thin films of two-dimensional materials a couple of nanometers thick, in about one minute.

This technology enables the formation of large, high-quality films with a single click, without the need for specialized knowledge or equipment. The findings are expected to contribute to the development of industrial manufacturing processes for various types of nanosheet devices. The study was published in ACS Applied Materials & Interfaces.

Nanosheets are only a few nanometers thick — too thin to be seen from the side with the naked eye. They have potential uses in several different fields, including electronics, catalysis, energy storage, and biomedicine. Nanosheets made from graphene and other inorganic materials are being tested for use in a range of devices, from sensors to batteries, because their electrical conductivity, transparency, and heat resistance differ from those of conventional bulk materials.