
An international group of researchers has produced graphene more affordably and with a lower environmental impact than current chemical methods by using bacteria.

Graphene is a very strong and conductive material that could revolutionize electronics and engineering. However, producing graphene in large quantities requires lots of energy and involves toxic chemicals, such as hydrazine, which damages the nervous system.

Researchers from the Delft University of Technology in the Netherlands and the University of Rochester in the US have worked to overcome these problems by using bacteria to produce graphene. Their work has been published in the journal ChemistryOpen.

We do have to grow meat in labs!


Maastricht-based Mosa Meat, which has in the past received more than $1m from Google cofounder Sergey Brin, said it hopes to sell its first products — most likely ground beef for burgers — in the next three years.

The aim is to achieve industrial-scale production two to three years later, with a typical hamburger patty costing about $1.

Several companies are looking into cultured meat or meat substitute products aimed at consumers concerned about the environmental and ethical effect of raising and slaughtering animals.

Chinese health officials have already been administering the HIV and flu drugs to fight the coronavirus. The use of the three together in a cocktail seemed to improve the treatment, the Thai doctors said.

Another doctor said that a similar approach in two other patients resulted in one patient displaying an allergic reaction while the other showed improvement.

“We have been following international practices, but the doctor increased the dosage of one of the drugs,” said Somsak Akksilp, director-general of the Medical Services Department, referring to the flu medicine oseltamivir.

Steam’s concurrent user count passed 18.8 million earlier today, peaking at 18,801,944 according to tracker SteamDB. The previous record of 18.5 million was set almost exactly two years ago, on January 6, 2018. As noted by SteamDB, this concurrent user record did not coincide with a record number of players in game: only 5.8 million today against 7 million two years ago, about 1.2 million fewer. The previous record was boosted by the huge number of players in Playerunknown’s Battlegrounds. It’s not known what caused the current peak, though its timing coincides with peak playtimes for all of Steam’s top games.

.@Steam has broken its record for most concurrently online users that was held for two years. Previous record was 18,537,490 users. It’s still increasing! But there’s about 1 million less players actually in-game (≈5.8mil vs ≈7mil two years ago). https://steamdb.info/app/753/graphs/ February 2, 2020

Why could the mongoose Rikki-Tikki-Tavi attack deadly snakes with impunity in Kipling’s “The Jungle Book”? Because he has a uniquely mutated receptor for a brain neurotransmitter called acetylcholine, researchers at the Weizmann Institute of Science report in the Proceedings of the National Academy of Sciences.

The toxins in many snake venoms, including that of cobras, bind to the acetylcholine receptors of their victims, blocking nerve-muscle communications. Molecular biologist Sara Fuchs and her colleagues found that the acetylcholine receptor in mongooses—like that in the snakes themselves—is slightly mutated so that the venom simply bounces off the muscle cells, causing them no harm.

Circa 2018


Stem cells have been a source of much excitement in the medical community for years: if they could be properly developed for use in humans, doctors could theoretically use them to regrow missing body parts.

While the techniques have seen some success in the lab, the processes involved are complex and it will likely be some time before they become widely used in clinical practice.

The hidden secret of artificial intelligence is that much of it is actually powered by humans. To be specific, the supervised learning algorithms that have gained much of the attention recently depend on humans to provide well-labeled training data that can be used to train machine learning algorithms. Since machines can’t teach themselves (yet), they must first be taught, and that training falls to humans. This is the secret Achilles’ heel of AI: the need for humans to teach machines the things that they are not yet able to do on their own.

Machine learning is what powers today’s AI systems. Organizations are implementing one or more of the seven patterns of AI — computer vision, natural language processing, predictive analytics, autonomous systems, pattern and anomaly detection, goal-driven systems, and hyperpersonalization — across a wide range of applications. However, in order for these systems to create accurate generalizations, they must be trained on data. The more advanced forms of machine learning, especially deep learning neural networks, require significant volumes of data to build models with the desired levels of accuracy. It goes without saying, then, that machine learning data needs to be clean, accurate, complete, and well-labeled so that the resulting models are accurate. “Garbage in, garbage out” has always held in computing, and it is especially true of machine learning data.
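The dependence on human-labeled data described above can be illustrated with a minimal sketch: a tiny nearest-neighbour classifier whose only knowledge comes from examples a person has already tagged. The feature values and labels below are invented for illustration, not drawn from any real dataset.

```python
def predict(labeled_data, query):
    """Return the label of the training example closest to `query`.

    `labeled_data` is a list of (feature_vector, label) pairs — the
    human-labeled training data the model depends on.
    """
    def sq_dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    nearest = min(labeled_data, key=lambda item: sq_dist(item[0], query))
    return nearest[1]

# Human-provided labels: each feature vector was tagged by a person.
# Mislabel these, and the model's predictions degrade accordingly.
training_set = [
    ((0.9, 0.1), "cat"),
    ((0.8, 0.2), "cat"),
    ((0.2, 0.9), "dog"),
    ((0.1, 0.8), "dog"),
]

print(predict(training_set, (0.85, 0.15)))  # nearest labeled examples are "cat"
```

The point of the sketch is that the classifier itself contains no knowledge of cats or dogs; every prediction it makes is recycled human judgment stored in `training_set`, which is exactly why labeling quality dominates model quality.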

According to analyst firm Cognilytica, over 80% of AI project time is spent preparing and labeling data for use in machine learning projects.