
NVIDIA’s “Huang’s Law” is the primary catalyst behind the company driving chip performance and efficiency up more than 1,000x in less than a decade. For NVIDIA, Huang’s Law is a fundamental approach that moves beyond traditional chip-scaling principles such as Moore’s Law, which dominated the tech industry in the past.

Huang’s Law To Dominate NVIDIA’s Future: Chip Shrinking No Longer Defines Performance Gains

NVIDIA’s CEO Jensen Huang has said multiple times that Moore’s Law is “slowing down” and that the concept behind it is becoming outdated. The argument became especially heated after Jensen’s GTC 2023 keynote. Moore’s Law refers to the observation that the number of transistors on a microchip doubles roughly every two years.
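To make the gap between the two trajectories concrete, here is a rough back-of-the-envelope sketch. It assumes the commonly cited two-year doubling for Moore’s Law and the 1,000x-per-decade figure above; both rates are simplifications used purely for illustration, not NVIDIA’s own accounting.

```python
# Illustrative comparison of the two scaling claims (assumed rates, not official figures).

years = 10

# Moore's Law trajectory: one doubling every ~2 years -> 5 doublings in 10 years.
moore_gain = 2 ** (years / 2)

# Claimed Huang's Law trajectory: ~1000x over the same span, which implies a
# much steeper effective annual growth rate.
huang_gain = 1000
huang_annual_rate = huang_gain ** (1 / years)  # works out to roughly 2x per year

print(f"Moore's Law over {years} years:       ~{moore_gain:.0f}x")
print(f"Huang's Law claim over {years} years: ~{huang_gain}x "
      f"(~{huang_annual_rate:.1f}x per year)")
```

Under these assumptions, Moore’s Law yields about a 32x gain in a decade, while the claimed Huang’s Law pace works out to roughly a doubling every single year.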

Last week, Unity rolled out a revised version of its controversial Runtime Fee in the wake of a seismic backlash from developers who felt the original policy represented an egregious betrayal for a myriad of reasons.

While plenty of fury was aimed at how the fee might impact developers’ finances, some of that anger stemmed from Unity’s inability to effectively communicate its new policy and provide clear answers to pertinent questions.

The dust has now supposedly settled, but one question remains: why doesn’t Unity’s explanation for its shifting answers about how the Runtime Fee applies to subscription services hold up to scrutiny?

Elon Musk, the owner of X (formerly known as Twitter), called for more people to record what is happening in the world around them, saying it would “change the world.”

Musk has been a strong advocate for free speech; prior to buying Twitter, he tweeted, “Free speech is essential to a functioning democracy. Do you believe Twitter rigorously adheres to this principle? The consequences of this poll will be important. Please vote carefully.”

CERN’s data store has now crossed the remarkable capacity threshold of one exabyte, meaning that CERN has one million terabytes of disk space ready for data!

CERN’s data store not only serves LHC physics data, but also the whole spectrum of experiments and services needing online data management. This data capacity is provided using 111,000 devices, predominantly hard disks along with an increasing fraction of flash drives. Having such a large number of commodity devices means that component failures are common, so the store is built to be resilient, using different data replication methods. These disks, most of which are used to store physics data, are orchestrated by CERN’s open-source software solution, EOS, which was created to meet the LHC’s extreme computing requirements.
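EOS itself is a large C++ system, and its real replica placement, checksumming, and recovery logic is far more sophisticated than anything shown here. Still, the core resilience idea, keeping multiple copies of each file so that a component failure never makes data unreadable, can be sketched in a few lines of Python. The device count comes from the article; the replication factor, file name, and random placement strategy are hypothetical choices for this toy model.

```python
import random

# Toy model of replication-based resilience; NOT EOS's actual algorithm.

NUM_DEVICES = 111_000   # device count cited by CERN for the combined store
REPLICAS = 2            # hypothetical replication factor for this example

def place_replicas(file_id: str, num_devices: int, replicas: int) -> list[int]:
    """Pick distinct devices to hold copies of a file (random placement here)."""
    return random.sample(range(num_devices), replicas)

def is_readable(replica_devices: list[int], failed: set[int]) -> bool:
    """A file stays readable as long as at least one replica sits on a healthy device."""
    return any(d not in failed for d in replica_devices)

# Simulate: place one file, then fail 1% of all devices at random.
placement = place_replicas("lhc-run3-event-data", NUM_DEVICES, REPLICAS)
failed_devices = set(random.sample(range(NUM_DEVICES), NUM_DEVICES // 100))

print("replicas on devices:", placement)
print("file still readable:", is_readable(placement, failed_devices))
```

With two independently placed replicas and 1% of devices failed, the chance that both copies of a given file are lost is on the order of 0.01%, which is why replication (and, in production systems, automatic re-replication after a failure) makes commodity-hardware stores dependable at this scale.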

“We reached this new all-time record for CERN’s storage infrastructure after capacity extensions for the upcoming LHC heavy-ion run,” explains Andreas Peters, EOS project leader. “It is not just a celebration of data capacity, it is also a performance achievement, thanks to the reading rate of the combined data store crossing, for the first time, the one terabyte per second (1 TB/s) threshold.”