
Despite surges in fields like AI, medicine and nuclear energy, major advances in science and technology are slowing, arriving fewer and farther between than they did decades ago, according to a study published in Nature.

The researchers analyzed some 45 million scientific papers and 3.9 million patents published between 1945 and 2010, examining citation networks to assess whether each breakthrough reinforced the status quo or disrupted existing knowledge and pushed science and technology in dramatically new directions.

Across all major scientific and technological fields, these big disruptions (the discovery of the double helix structure of DNA, which rendered earlier research obsolete, is a classic example) have become less common since 1945, the researchers found.
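The disruption measure such studies rely on is defined on the citation network itself: a paper scores as disruptive when later work cites it without also citing the papers it built on, and as consolidating when later work cites both. As a rough illustration only (a minimal sketch of a CD-style index, not the authors’ actual pipeline; the function name and toy data below are assumptions), such a score can be computed from a citation graph like this:

```python
def cd_index(focal, citations):
    """Rough CD-style disruption score for one paper (illustration only).

    `citations` maps each paper ID to the set of paper IDs it cites.
    Returns a value in [-1, 1]: +1 means later papers cite the focal
    work while ignoring its references (disruptive); -1 means they cite
    the focal work together with its references (consolidating).
    """
    refs = citations.get(focal, set())

    n_focal_only = n_both = n_refs_only = 0
    for paper, cited in citations.items():
        if paper == focal:
            continue
        cites_focal = focal in cited
        cites_refs = bool(cited & refs)
        if cites_focal and cites_refs:
            n_both += 1
        elif cites_focal:
            n_focal_only += 1
        elif cites_refs:
            n_refs_only += 1

    total = n_focal_only + n_both + n_refs_only
    return (n_focal_only - n_both) / total if total else 0.0


# Toy citation graph: paper "D" builds on "A" and "B"; later papers
# "E" and "F" cite "D" without citing "A" or "B", so "D" scores as disruptive.
toy = {
    "A": set(), "B": set(),
    "D": {"A", "B"},
    "E": {"D"},
    "F": {"D"},
}
print(cd_index("D", toy))  # 1.0
```

Averaged over millions of papers and patents per year, a score like this is what lets researchers track whether new work tends to render its predecessors obsolete or simply builds on them.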

Both the European Space Agency and NASA are planning to test even more sensitive sensors on future moon missions to try to home in on satellite signals. If they can truly connect with sats back home, we could get closer to achieving autonomous moon travel. But eventually that won’t be enough. To help direct humans on the lunar surface, we’re going to need a fleet of satellites specifically around the moon. NASA calls its project LunaNet, and it’s part of the Gateway space station, which is the culmination of America’s plan to return to the moon. It needs to be designed to play well with ESA technology and, eventually, will be the source of high-speed internet on the moon.

Artemis I launched back in November, rounded the moon just 81 miles above the lunar surface and touched down Earth-side in December. Artemis II, which will carry astronauts around the moon on a similar trajectory, is slated to launch in late 2024, according to Space.com. Artemis III, which will put boots on the moon for the first time since 1972, could launch as early as 2025.


Over the last half-decade, quantum computing has attracted tremendous media attention. Why?

After all, we have computers already, which have been around since the 1940s. Is the interest because of the use cases? Better AI? Faster and more accurate pricing for financial services firms and hedge funds? Better medicines once quantum computers get a thousand times bigger?


The metaverse is becoming one of the hottest topics not only in technology but in the social and economic spheres. Tech giants and startups alike are already working on creating services for this new digital reality.

The metaverse is slowly evolving into a mainstream virtual world where you can work, learn, shop, be entertained and interact with others in ways never before possible. Gartner recently listed the metaverse as one of the top strategic technology trends for 2023, and predicts that by 2026, 25% of the population will spend at least one hour a day there for work, shopping, education, social activities and/or entertainment. That means organizations that use the metaverse effectively will be able to engage with both human and machine customers and create new revenue streams and markets.

Researchers tested GPT-3.5 with questions from the US Bar Exam. They predict that GPT-4 and comparable models might be able to pass the exam very soon.

In the U.S., almost all jurisdictions require aspiring lawyers to pass a professional licensing exam known as the Bar Exam. By passing this exam, lawyers are admitted to the bar of a U.S. state.

In most cases, applicants must complete at least seven years of post-secondary education, including three years at an accredited law school.

Benjamin Franklin stated, “If you would not be forgotten as soon as you are dead and rotten, either write things worth reading, or do things worth the writing.”

Patrick Winston, the late and well-known director of MIT’s Artificial Intelligence Laboratory, expanded upon this adage, saying, “Your success in life will be determined largely by your ability to speak, your ability to write, and the quality of your ideas. In that order.”

We are at a precarious point in human development, with the positive and negative impact of technology surrounding us as individuals and as a society. Technology has helped improve our living standards, extended our lives, cured diseases, fed our growing populations, and expanded our frontiers. But it has also helped create greater economic and digital divides, increased pollution and harm to our environment, and potentially endangered the intellectual development of our human population.



Testing multiple treatments for heterogeneous (varying) effectiveness with respect to many underlying risk factors requires many pairwise tests; we would like to instead automatically discover and visualize patient archetypes and predictors of treatment effectiveness using multitask machine learning. In this paper, we present a method to estimate these heterogeneous treatment effects with an interpretable hierarchical framework that uses additive models to visualize expected treatment benefits as a function of patient factors (identifying personalized treatment benefits) and concurrent treatments (identifying combinatorial treatment benefits). This method achieves state-of-the-art predictive power for COVID-19 in-hospital mortality and interpretable identification of heterogeneous treatment benefits. We first validate this method on the large public MIMIC-IV dataset of ICU patients to test recovery of heterogeneous treatment effects. Next we apply this method to a proprietary dataset of over 3,000 patients hospitalized for COVID-19, and find evidence of heterogeneous treatment effectiveness predicted largely by indicators of inflammation and thrombosis risk: patients with few indicators of thrombosis risk benefit most from treatments against inflammation, while patients with few indicators of inflammation risk benefit most from treatments against thrombosis. This approach provides an automated methodology to discover heterogeneous and individualized effectiveness of treatments.
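To make the modeling idea concrete, here is a deliberately simplified sketch of the treatment-by-covariate additive structure the abstract describes: outcome risk is modeled as baseline terms plus treatment-specific terms whose coefficients show how expected benefit varies with patient factors. Everything below (the simulated cohort, the variable names, the linear shape functions) is an assumption for illustration, not the paper’s model, data, or results.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Simulated cohort (illustration only): two risk indicators and one
# binary treatment. The real study used ICU and COVID-19 hospital records.
n = 5000
inflammation = rng.normal(size=n)
thrombosis = rng.normal(size=n)
treated = rng.integers(0, 2, size=n).astype(float)

# Simulated ground truth: the treatment reduces risk most when
# inflammation indicators are high.
risk = (0.5 * inflammation + 0.4 * thrombosis
        - treated * (0.3 + 0.5 * inflammation)
        + 0.3 * rng.normal(size=n))

# Additive structure with treatment interactions:
#   risk ~ f1(x1) + f2(x2) + T * (g0 + g1 * x1 + g2 * x2)
# Shape functions are kept linear here for brevity; the paper's framework
# uses flexible additive components to get per-factor benefit curves.
X = np.column_stack([
    inflammation, thrombosis,        # baseline risk terms
    treated,                         # average treatment effect
    treated * inflammation,          # benefit as a function of inflammation
    treated * thrombosis,            # benefit as a function of thrombosis
])
model = Ridge(alpha=1.0).fit(X, risk)

g0, g1, g2 = model.coef_[2:5]
print(f"average treatment effect:       {g0:+.2f}")   # ~ -0.3 (risk reduction)
print(f"benefit slope vs. inflammation: {g1:+.2f}")   # ~ -0.5 (more benefit)
print(f"benefit slope vs. thrombosis:   {g2:+.2f}")   # ~  0.0 (no modulation)
```

Reading off the treatment-interaction terms (or, in the full framework, the fitted shape functions) is what turns such a model into per-factor benefit curves, which is how the abstract’s kind of heterogeneous and combinatorial treatment effects can be identified and visualized.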