
Long lauded as the operating system best able to prevent malware infection, macOS recently fell victim to a vulnerability that hackers used to circumvent all of Apple’s system defenses.

Security researcher Cedric Owens discovered this bug in March 2021 while assessing Apple’s Gatekeeper mechanism, a safeguard that only allows developers to run their software on Macs after registering with Apple and paying a fee. Moreover, the company requires that all applications undergo an automated vetting process to further protect against malicious software.

Unfortunately, Owens uncovered a logic flaw in macOS itself, rather than in Gatekeeper. The bug allowed attackers to develop malware able to deceive the operating system into running regardless of whether it passed Apple’s safety checks. Indeed, this flaw resembles a door that has been securely locked and bolted but still has a small pet door at the bottom through which you can break in or insert a bomb.
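The “pet door” analogy can be made concrete with a deliberately simplified sketch of a logic flaw in a policy check. This is a hypothetical illustration, not Apple’s actual implementation: the check only vets bundles it recognizes as apps, so anything it fails to classify falls through to “allow.”

```python
# Illustrative gatekeeper-style logic flaw (hypothetical; not Apple's
# actual code). Bundles the check cannot classify skip vetting entirely.

def is_notarized(bundle: dict) -> bool:
    """Pretend notarization check: only vetted bundles pass."""
    return bundle.get("notarization_ticket") == "valid"

def flawed_gatekeeper(bundle: dict) -> str:
    # BUG: only things classified as "app" are routed to the
    # notarization check; everything else falls through to "allow".
    if bundle.get("type") == "app":
        return "allow" if is_notarized(bundle) else "block"
    return "allow"  # the "pet door": unclassified bundles are never vetted

legit = {"type": "app", "notarization_ticket": "valid"}
malware = {"type": "app", "notarization_ticket": None}
sneaky = {}  # malware packaged so the OS never classifies it as an app

print(flawed_gatekeeper(legit))    # allow
print(flawed_gatekeeper(malware))  # block
print(flawed_gatekeeper(sneaky))   # allow
```

The front door (notarization) is locked, but the fallback branch lets anything unrecognized walk right through.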

The field of soft robotics has exploded in the past decade, as ever more researchers seek to make real the potential of these pliant, flexible automata in a variety of realms, including search and rescue, exploration and medicine.

For all the excitement surrounding these new machines, however, UC Santa Barbara mechanical engineering professor Elliot Hawkes wants to ensure that research is more than just a flash in the pan. “Some new, rapidly growing fields never take root, while others become thriving disciplines,” Hawkes said.

To help guarantee the longevity of soft robotics research, Hawkes, whose own robots have garnered interest for their bioinspired and novel locomotion and for the new possibilities they present, offers an approach that moves the field forward. His viewpoint, written with colleagues Carmel Majidi from Carnegie Mellon University and Michael T. Tolley of UC San Diego, is published in the journal Science Robotics.

An animal scientist with Wageningen University & Research in the Netherlands has created an artificial-intelligence-based application that can gauge the emotional state of farm animals based on photographs taken with a smartphone. In his paper uploaded to the bioRxiv preprint server, Suresh Neethirajan describes his app and how well it worked when tested.

Prior research and anecdotal evidence have shown that farm animals are more productive when they are not living under stressful conditions. This has led to changes in handling practices, such as shielding cows’ eyes from the spike that is used to kill them prior to slaughter to prevent stress hormones from entering the meat. More recent research has suggested that it may not be enough to shield animals from stressful situations—adapting their environment to promote peacefulness or even playfulness can produce desired results as well. Happy cows or goats, for example, are likely to produce more milk than those that are bored. But as Neethirajan notes, assessing the emotional state of an animal can be quite subjective, leading to incorrect conclusions. To address this problem, he adapted human face recognition software for use in detecting emotions in cows and pigs.

The system is called WUR Wolf and is based on several pieces of technology: the YOLO Object Detection System; YOLOv4, which works with a convolutional neural network; and Faster R-CNN, which also detects objects but does so with different feature sets. For training, he used an Nvidia GeForce GTX 1080 Ti GPU running on a CUDA 9.0 machine. The data consisted of thousands of images of cows and pigs taken with a smartphone at six farms located in several countries, with associated classification labels indicating which features could be associated with which mood—raised ears on a cow, for example, generally indicate the animal is excited.
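The labeling idea—mapping an observable facial or body feature to a mood class—can be sketched as a toy lookup. The feature names and mood labels below are illustrative assumptions, not WUR Wolf’s actual label set:

```python
# Toy feature-to-mood mapping in the spirit of WUR Wolf's labels.
# Feature names and mood classes here are illustrative assumptions,
# not the paper's actual scheme.

MOOD_RULES = {
    ("cow", "ears_raised"): "excited",   # example grounded in the article
    ("cow", "ears_back"): "stressed",
    ("pig", "ears_forward"): "alert",
    ("pig", "tail_curled"): "content",
}

def classify_mood(species: str, feature: str) -> str:
    """Return the mood label for a detected feature, defaulting to neutral."""
    return MOOD_RULES.get((species, feature), "neutral")

print(classify_mood("cow", "ears_raised"))  # excited
print(classify_mood("pig", "ears_flat"))    # neutral
```

In the real system, the detector (YOLOv4 or Faster R-CNN) supplies the detected features; the learned classifier replaces this hand-written table.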

Training neural networks to perform tasks, such as recognizing images or navigating self-driving cars, could one day require less computing power and hardware thanks to a new artificial neuron device developed by researchers at the University of California San Diego. The device can run neural network computations using 100 to 1000 times less energy and area than existing CMOS-based hardware.

Researchers report their work in a paper published recently in Nature Nanotechnology.

Neural networks are a series of connected layers of artificial neurons, where the output of one layer provides the input to the next. Generating that input is done by applying a mathematical calculation called a non-linear activation function. This is a critical part of running a neural network. But applying this function requires a lot of computing power and circuitry because it involves transferring data back and forth between two separate units – the memory and an external processor.
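The layer-to-layer flow described above can be shown in a minimal pure-Python sketch (real networks use optimized libraries, and the weights here are arbitrary illustrative values):

```python
import math

def sigmoid(x: float) -> float:
    """Non-linear activation: squashes any input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of the previous layer's
    outputs, passed through the activation to feed the next layer."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# The output of one layer provides the input to the next.
h = neuron([0.5, -1.2, 0.3], [0.8, 0.1, -0.4], bias=0.2)
y = neuron([h], [1.5], bias=-0.3)
print(y)
```

Every such activation is a memory-to-processor round trip in conventional CMOS hardware, which is exactly the cost the UC San Diego device aims to eliminate.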

On Wednesday, Britain became the first country to announce it will regulate the use of self-driving vehicles at slow speeds on motorways, with the first such cars possibly appearing on public roads as soon as this year.

Britain’s transport ministry said it was working on specific wording to update the country’s highway code for the safe use of self-driving vehicle systems, starting with Automated Lane Keeping Systems (ALKS) — which use sensors and software to keep cars within a lane, allowing them to accelerate and brake without driver input.

The government said the use of ALKS would be restricted to motorways, at speeds under 37 miles (60 km) per hour.
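The ALKS behavior described above—engage only on motorways under the speed cap, then steer to stay centered in the lane—can be sketched with a proportional correction. This is an illustration only; real systems fuse camera and radar data and must meet regulatory requirements, and the gain value here is an arbitrary assumption:

```python
# Minimal ALKS-style sketch (illustrative only; not a production
# controller). Gain value is an assumption.

ALKS_MAX_SPEED_MPH = 37  # UK proposal: motorway use under 37 mph

def alks_active(speed_mph: float, on_motorway: bool) -> bool:
    """System may engage only on motorways below the speed cap."""
    return on_motorway and speed_mph <= ALKS_MAX_SPEED_MPH

def steering_correction(lateral_offset_m: float, gain: float = 0.5) -> float:
    """Proportional lane-centering: drifting right (positive offset)
    yields a leftward (negative) steering command."""
    return -gain * lateral_offset_m

print(alks_active(30, on_motorway=True))   # True
print(alks_active(45, on_motorway=True))   # False: over the speed cap
print(steering_correction(0.4))            # steer back toward center
```

The activation gate is the regulatory piece the highway code update addresses; the correction loop is the “accelerate and brake without driver input” piece.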

Consciousness remains scientifically elusive because it constitutes layers upon layers of non-material emergence: Reverse-engineering our thinking should be done in terms of networks, modules, algorithms and second-order emergence — meta-algorithms, or groups of modules. Neuronal circuits correlate to “immaterial” cognitive modules, and these cognitive algorithms, when activated, produce meta-algorithmic conscious awareness and phenomenal experience, all in all at least two layers of emergence on top of “physical” neurons. Furthermore, consciousness represents certain transcendent aspects of projective ontology, according to the now widely accepted Holographic Principle.

#CyberneticTheoryofMind


There’s no shortage of workable theories of consciousness and its origins, each with their own merits and perspectives. We discuss the most relevant of them in the book in line with my own Cybernetic Theory of Mind that I’m currently developing. Interestingly, these leading theories, if metaphysically extended, in large part lend support to Cyberneticism and Digital Pantheism which may come into scientific vogue with the future cyberhumanity.

This article is part of our new series, Currents, which examines how rapid advances in technology are transforming our lives.

Imagine operating a computer by moving your hands in the air as Tony Stark does in “Iron Man.” Or using a smartphone to magnify an object as does the device that Harrison Ford’s character uses in “Blade Runner.” Or a next-generation video meeting where augmented reality glasses make it possible to view 3D avatars. Or a generation of autonomous vehicles capable of driving safely in city traffic.

These advances and a host of others on the horizon could happen thanks to metamaterials, which make it possible to control beams of light with the same ease that computer chips control electricity.

WASHINGTON — The Department of Defense wants to see a prototype that can ensure spectrum is available whenever it’s needed for aerial combat training, according to an April 26 request from the National Spectrum Consortium.

The effort, focused specifically on the Operational Spectrum Comprehension, Analytics, and Response (OSCAR) project, is part of a larger portfolio included in the DoD research and engineering office’s Spectrum Access Research & Development Program. That program hopes to develop near-real-time spectrum management technologies that leverage machine learning and artificial intelligence to more efficiently and dynamically allocate spectrum assignments based on operational planning or operational outcomes, a release said.

“I think of this set of projects as a toolset that’s really the beginning of starting to move toward pushing those fundamental technologies into more direct operational application,” Maren Leed, executive director of the National Spectrum Consortium, told C4ISRNET. It’s “starting to bridge from just sharing with commercial into capabilities that are going to enable warfighting much more directly.”

The Autonomous Weeder, developed by Carbon Robotics, uses a combination of artificial intelligence (AI), robotics, and laser technology to safely and effectively drive through crop fields – identifying, targeting and eliminating weeds.

Unlike other weeding technologies, the robot utilises high-power lasers to eradicate weeds through thermal energy, without disturbing the soil. This could allow farmers to use less herbicide, while reducing labour costs and improving the reliability and predictability of crop yields.

“AI and deep learning technology are creating efficiencies across a variety of industries and we’re excited to apply it to agriculture,” said Paul Mikesell, CEO and founder of Carbon Robotics. “Farmers, and others in the global food supply chain, are innovating now more than ever to keep the world fed. Our goal is to create tools that address their most challenging problems, including weed management and elimination.”