
The lander and the rover, which landed on the Moon on August 23, were designed to operate for only one lunar day.

As the lunar day draws to a close, ISRO has decided to put its Chandrayaan-3 rover Pragyan in sleep mode to conserve its battery and protect it from the extreme cold of the lunar night. The rover, which has completed its assigned tasks, is now parked safely and has transmitted the data collected by its payloads to the lander, which in turn relays it to Earth.


Credits: ISRO/Twitter.

Lunar Night

A collaboration between researchers at the University of Sheffield and the Budapest Zoo has seen an aging gorilla walk again with ease. Will this treatment prove fruitful for humans?

Scientists at the University of Sheffield.

Liesel, the elderly matriarch of the Budapest Zoo, had been struggling to walk on her left leg, signaling a possible battle with arthritis. This marked the beginning of a unique collaboration between veterinary expertise and cutting-edge science to alleviate the suffering of the aging primate.

Ensuring security in the software market is undeniably crucial, but it is important to strike a balance that avoids excessive government regulation and the burdens of government-mandated legal responsibility, also called a liability regime. While there is no question that the market is broken with regard to security and that intervention is necessary, a less intrusive approach can let the market find the right level of security while minimizing the need for heavy-handed government involvement.

Imposing a liability regime on software companies may go too far and create unintended consequences. The downsides of liability, such as increased costs, potential legal battles, and disincentives to innovation, can hinder the development of secure software without necessarily guaranteeing improved security outcomes. A liability regime could also burden smaller companies disproportionately and stifle the diversity and innovation present in the software industry.

Instead, a more effective approach involves influencing the software market through measures that encourage transparency and informed decision-making. By requiring companies to be fully transparent about their security practices, consumers and businesses can make informed choices based on their risk preferences. Transparency allows the market to drive the demand for secure software, enabling companies with robust security measures to potentially gain a competitive edge.

The research paper reviews the potential benefits of marine plasmalogens, a type of glycerophospholipid, in combating age-related diseases like Alzheimer’s and Parkinson’s. These compounds, abundant in marine resources, could improve lipid metabolism and reduce oxidative stress, offering a new avenue for improving the quality of life in aging populations.

Infamous Chisel is described as a collection of components designed to enable remote access to Android phones and exfiltrate information from them.

Besides scanning devices for information and files matching a predefined set of file extensions, the malware also periodically scans the local network and offers SSH access.
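
To make the extension-based file scanning concrete, here is a minimal Python sketch of how that kind of collection step generally works. It is illustrative only, not code from the malware analysis: the extension list, the root directory, and the find_matching_files helper are assumptions made for the example.

```python
import os

# Illustrative only: a predefined set of file extensions an exfiltration
# tool might look for. These values are assumptions, not taken from the
# Infamous Chisel analysis.
TARGET_EXTENSIONS = {".jpg", ".pdf", ".docx", ".sqlite", ".txt"}

def find_matching_files(root_dir):
    """Walk a directory tree and yield paths whose extension is of interest."""
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            _, ext = os.path.splitext(name)
            if ext.lower() in TARGET_EXTENSIONS:
                yield os.path.join(dirpath, name)

if __name__ == "__main__":
    # Hypothetical starting point; on a real Android device this would be
    # app data or shared-storage paths.
    for path in find_matching_files("/sdcard"):
        print(path)
```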

“Infamous Chisel also provides remote access by configuring and executing TOR with a hidden service which forwards to a modified Dropbear binary providing a SSH connection,” the Five Eyes (FVEY) intelligence alliance said.
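
For context on that last point, exposing a local SSH service through a Tor hidden (onion) service is typically done with a couple of standard torrc directives. The fragment below is a generic sketch of that pattern under assumed paths and ports, not the malware's actual configuration.

```
# Illustrative torrc fragment (assumed paths and ports, not the malware's real config).
# Directory where Tor keeps this hidden service's keys and onion hostname:
HiddenServiceDir /data/local/tmp/hs_ssh/
# Forward the onion service's port 22 to a local SSH daemon
# (e.g. a Dropbear-style listener) on 127.0.0.1:2222:
HiddenServicePort 22 127.0.0.1:2222
```

With a configuration of this shape, Tor publishes an onion address and relays inbound connections on it to the local SSH listener, which matches the remote-access pattern the advisory describes.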

Artificial intelligence (AI) has been helping humans in IT security operations since the 2010s, analyzing massive amounts of data quickly to detect the signals of malicious behavior. With enterprise cloud environments producing terabytes of data to be analyzed, threat detection at the cloud scale depends on AI. But can that AI be trusted? Or will hidden bias lead to missed threats and data breaches?

Bias can create risks in AI systems used for cloud security. There are steps humans can take to mitigate this hidden threat, but first, it’s helpful to understand what types of bias exist and where they come from.