
YouTube will soon make users add a disclaimer when they post artificial intelligence-generated or manipulated videos.

In a company blog post, the video giant outlined a forthcoming rule change that will not only require a warning label, but will also display larger, more prominent disclaimers for certain types of “sensitive” content, such as elections and public health crises.

As Bloomberg reports, this change at the Alphabet-owned company comes after a September announcement that election ads across the firm’s portfolio will require “prominent” disclosures if manipulated or generated by AI — a rule that’s slated to begin mid-November, the outlet previously reported.

This is why I laughed at all that uncanny valley talk back in the early 2010s. Notice the term is almost never used anymore. And as for making robots more attractive than most people: that will be done by the mid-2030s.


Does ChatGPT ever give you the eerie sense you’re interacting with another human being?

Artificial intelligence (AI) has reached an astounding level of realism, to the point that some tools can even fool people into thinking they are interacting with another human.

The eeriness doesn’t stop there. In a study published today in Psychological Science, we’ve discovered that images of white faces generated by the popular StyleGAN2 algorithm look more “human” than actual people’s faces.

The world’s most valuable chip maker has announced a next-generation processor for AI and high-performance computing workloads, due for launch in mid-2024. A new exascale supercomputer, designed specifically for large AI models, is also planned.

H200 Tensor Core GPU. Credit: NVIDIA

In recent years, California-based NVIDIA Corporation has played a major role in the progress of artificial intelligence (AI), as well as high-performance computing (HPC) more generally, with its hardware being central to astonishing leaps in algorithmic capability.

In this article we look at several digital storage-related product introductions at the 2023 Supercomputing Conference (SC23).


WDC was also showing its hybrid storage JBOD Ultrastar Data102 and Data60 platforms to support disaggregated storage and software-defined storage (SDS). These come in dual-port SAS or single-port SATA configurations. The Data102 has storage capacities up to 2.65PB and the Data60 up to 1.56PB in a 4U enclosure that includes IsoVibe and ArcticFlow technologies for improved performance and reliability. The Data102 and Data60 capacity numbers assume using 26TB SMR HDDs.
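Those capacity figures follow directly from the drive math. Here is a quick back-of-the-envelope check in Python, assuming the platform names reflect their drive-bay counts (102 and 60 bays) and decimal units:

# Rough capacity check for the Ultrastar JBODs (assumes 26TB SMR HDDs, decimal PB)
drive_tb = 26
print(f"Data102: {102 * drive_tb / 1000:.2f} PB")  # 102 bays x 26TB = ~2.65PB
print(f"Data60:  {60 * drive_tb / 1000:.2f} PB")   # 60 bays x 26TB = ~1.56PB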

WDC was also showing a GPUDirect Storage proof of concept combining the company’s RapidFlex technology and the Ingrasys ES2100 with integrated NVIDIA Spectrum Ethernet switches, as well as NVIDIA GPUs, Magnum IO GPUDirect Storage, BlueField DPUs and ConnectX SmartNICs. The proof-of-concept demonstration can provide 25GB/s of bandwidth for a single NVIDIA A100 Tensor Core GPU and over 100GB/s for four NVIDIA A100 GPUs.

At SC23, Arcitecta and DDN introduced software-defined storage solutions for AI and cloud applications. WDC was also showing SDS, its OpenFlex NVMe storage, and GPUDirect storage.

The only AI hardware startup to realize revenue exceeding $100M has finished the first phase of the Condor Galaxy 1 AI supercomputer with partner G42 of the UAE. Other Cerebras customers are sharing their CS-2 results at Supercomputing ’23, building momentum for the inventor of wafer-scale computing. This company is on a tear.

Four short months ago, Cerebras announced the most significant deal any AI startup has been able to assemble, with partner G42 (Group42), an artificial intelligence and cloud computing company. The eventual build-out of 576 CS-2 wafer-scale nodes, delivering 36 exaFLOPs of AI performance, will be one of the world’s largest AI supercomputers, if not the largest.

Cerebras has now finished the first data center implementation and started on the second. The two companies are moving fast to capitalize on the gold rush to stand up large language model services for researchers and enterprises, a market projected to reach $70B by 2028, especially while NVIDIA H100 GPUs remain difficult to obtain, creating an opportunity for Cerebras. In addition, Cerebras recently announced that it has released the largest Arabic language model, Jais30B, with Core42, using the CS-2, a platform designed to make the development of massive AI models accessible by eliminating the need to decompose and distribute the problem.