
Google today is rolling out a few new updates to its nearly three-month-old Search Generative Experience (SGE), the company’s AI-powered conversational mode in Search, with a goal of helping users better learn and make sense of the information they discover on the web. The features include tools for viewing definitions of unfamiliar terms, tools that improve how you understand and work with code across programming languages, and a feature that lets you tap into the AI power of SGE while you browse.

The company explains that these improvements aim to help people better understand complicated concepts or complex topics, boost their coding skills and more.

One of the new features will let you hover over certain words to preview their definitions and see related images or diagrams, which you can then tap on to learn more. This feature will become available across Google’s AI-generated responses to questions in certain subjects, like STEM, economics and history, where you may encounter terms you don’t understand or concepts you want to dive deeper into.

We’ve seen a lot about large language models in general, and a lot of that has been elucidated at this conference, but many of the speakers have great personal takes on how this type of process works, and what it can do!

For example, here we have Yoon Kim talking about statistical objects, and the use of neural networks (transformer-based neural networks in particular) to use next-word prediction in versatile ways. He uses the example of the location of MIT:

“You might have a sentence like: ‘the Massachusetts Institute of Technology is a private land grant research university’ … and then you train this language model (around it),” he says. “Again, (it takes) a large neural network to predict the next word, which, in this case, is ‘Cambridge.’ And in some sense, to be able to accurately predict the next word, it does require this language model to store knowledge of the world, for example, that must store factoid knowledge, like the fact that MIT is in Cambridge. And it must store … linguistic knowledge. For example, to be able to pick the word ‘Cambridge,’ it must know what the subject, the verb and the object of the preceding or the current sentence is. But these are, in some sense, fancy autocomplete systems.”
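The next-word prediction Kim describes can be sketched with a toy model. The snippet below is only an illustration of the training objective, not of the transformer-based networks he mentions: it counts bigrams in a tiny made-up corpus and predicts the most frequent continuation, whereas real language models learn these statistics implicitly in neural network weights.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, how often each other word follows it."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequent continuation seen in training, or None."""
    followers = counts.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Tiny illustrative corpus (invented for this sketch).
corpus = [
    "MIT is located in Cambridge",
    "the university in Cambridge is MIT",
]
model = train_bigram(corpus)
print(predict_next(model, "in"))  # -> cambridge
```

In the same loose sense Kim describes, the counts "store knowledge of the world": the model answers "Cambridge" after "in" only because that fact appeared in its training data. Scaling this idea up, with a neural network in place of a lookup table, is what makes the "fancy autocomplete" of modern language models.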

The company aims for low-cost, high-throughput chips that allow users to work with its web services on the cloud.

Even as the world looks to Microsoft and Google to reveal the next big thing in the generative artificial intelligence (AI) field, Jeff Bezos-founded Amazon has been quietly working to let its customers work directly with the technology. In an unmarked building in Austin, Texas, Amazon engineers are busy developing two types of microchips that will be used to train and run AI models, CNBC reported.

The world took notice of generative AI when OpenAI launched ChatGPT last year. Microsoft, which has partnered with OpenAI previously, was quick to use its association with the company and incorporate the features of the AI model into its existing products.

Chromium compounds could soon replace the rare and expensive metals osmium and ruthenium.

Scientists have found a way to make solar panels and phone screens from readily available chromium, according to a report.

The article highlights how a major breakthrough sees material “almost as rare as gold” replaced by everyday components, significantly reducing “the price of manufacturing the technology that relies on it.”



Scientists at Stanford University have found a way to induce cell death in cancer cells with a method that could be effective in around 50% of cancers. In a paper, “Rewiring cancer drivers to activate apoptosis,” published in Nature, the team describes a new class of molecules called transcriptional/epigenetic CIPs (TCIPs) that can activate apoptosis with the help of cancer growth gene expressions within the cancer cells.

The researchers designed small molecules that bind specific transcriptional suppressors to transcription activators. The most potent molecule created, TCIP1, works by linking molecules that bind BCL6 to those that bind the transcriptional activator BRD4.

One of the traits that makes cancer cells cancerous is that they ignore signals from surrounding healthy tissues to stop growing and to initiate apoptosis, or cell death. The apoptosis pathways still exist but are actively blocked in certain types of cancer, where the transcription factor B cell lymphoma 6 (BCL6) binds to the promoters of apoptosis genes and suppresses their expression.

Organizations are building resilient supply chains with a “phygital” approach, a blend of digital and physical tools. In recent years, the global supply chain has been disrupted by the covid-19 pandemic, geopolitical volatility, overwhelmed legacy systems, and labor shortages. The National Association of Manufacturers (NAM), an industrial advocacy group, warns the disruption isn’t over: NAM’s spring 2023 survey found that more than 90% of respondents saw significant (52.5%) or partial (39%) supply chain disruption during the past two years, while just 0.5% reported no disruption at all. Digitization presents an opportunity to overcome supply chain disruption by making data flow more efficiently, using technology and data standards to break barriers between disparate systems.

“Phygital merges two worlds together, where standards provide an interoperable system of defined data structures,” says Melanie Nuce-Hilton, senior vice president of innovation and partnerships at GS1 US, a member of GS1, a global not-for-profit supply chain standards organization. “The approach is intended to deliver multiple benefits—improved supply chain visibility for traceability and inventory management, better customer experiences across online and offline interactions, and the potential for better circularity and waste reduction by maintaining linkages between products and their data throughout their lifecycle,” she says.

Presented by VAST Data

With access to just a sliver of the 2.5 quintillion bytes of data created every day, AI produces what often seem like miracles that human intellect can’t match — identifying cancer on a medical scan, a viable embryo for IVF, new ways of tackling climate change and the opioid crisis and on and on. However, that’s not true intelligence; rather, these AI systems are just designed to link data points and report conclusions, to power increasingly disruptive automation across industries.

While generative AI is trending and GPT models have taken the world by storm with their astonishing capabilities to respond to human prompts, do they truly acquire the ability to perform reasoning tasks that humans find easy to execute? It’s important to understand that the current AI the world is working with has little understanding of the world it exists in, and is unable to build a mental model that goes beyond regurgitating information that is already known.

Hints from the bowhead whale genome published nearly a decade ago predicted that the mammals may use this alternate strategy (SN: 1/6/15). “But you need actual experiments to actually validate those predictions,” Tollis says.

In the lab, study coauthor Vera Gorbunova at the University of Rochester in New York and her colleagues ran an assortment of experiments on cells harvested from bowhead whale tissue, as well as on cells from humans, cows and mice.

The whale cells were both efficient and accurate at repairing double-strand breaks in DNA, damage that severs both strands of the DNA double helix. The whale cells restored broken DNA to like-new condition more often than cells from other mammals did, the team found. In those animals, mends to the genome tended to be sloppier, like a poorly patched pair of jeans. The team also identified two proteins in bowhead whale cells, CIRBP and RPA2, that are part of the DNA repair crew.