Blog

Archive for the ‘computing’ category: Page 316

Mar 22, 2022

Nvidia’s new Omniverse tools will make it easier than ever to build virtual worlds

Posted by in category: computing

“This is an answer to a huge demand we’ve had from a number of customers who wanted access to this platform but were limited because of the platform they’re on,” Richard Kerris, Nvidia’s Omniverse VP, said to reporters this week.

Omniverse Cloud is in early access now, and Nvidia is taking applications for it.

Next, Nvidia announced Omniverse OVX, a computing system designed specifically to meet the needs of massive simulations — or industrial digital twins.

Mar 21, 2022

Lensless Camera Captures Cellular-Level 3D Details

Posted by in categories: computing, information science

Rice University researchers have tested a tiny lensless microscope called Bio-FlatScope, capable of producing high levels of detail in living samples. The team imaged plants, hydra, and, to a limited extent, a human.

A previous iteration of the technology, FlatCam, was a lensless device that channeled light through a mask and directly onto a camera sensor, aimed primarily outward at the world at large. The raw images looked like static, but a custom algorithm translated the raw data into focused images.

The device described in the current research looks inward, imaging micron-scale targets such as cells and blood vessels inside the body, and even through skin. The technology relies on a sophisticated phase mask that generates patterns of light falling directly onto the chip, the researchers said. The mask in the original FlatCam looked like a barcode and limited the amount of light that passed through to the sensor.
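
The computational step is where a lensless imager earns its keep: the sensor records a multiplexed pattern rather than a focused picture, and an algorithm must invert the mask’s effect to recover the scene. The sketch below illustrates only that general idea – it is not the Rice team’s published method – and assumes a separable mask model Y = PhiL · X · PhiR^T with made-up matrix sizes, recovering the scene with crude regularized pseudo-inverses.

import numpy as np

# Hypothetical separable mask model: the sensor sees Y = PhiL @ X @ PhiR.T + noise,
# where X is the unknown scene and PhiL, PhiR encode the coded mask (sizes assumed).
rng = np.random.default_rng(0)
n_scene, n_sensor = 64, 96
PhiL = rng.standard_normal((n_sensor, n_scene))
PhiR = rng.standard_normal((n_sensor, n_scene))

X_true = np.zeros((n_scene, n_scene))
X_true[20:30, 40:50] = 1.0                      # toy "scene"
Y = PhiL @ X_true @ PhiR.T
Y += 0.01 * rng.standard_normal(Y.shape)        # sensor noise

# Crude reconstruction: apply a regularized pseudo-inverse of each mask factor.
# A real pipeline would use a proper joint solver or a learned prior instead.
lam = 1e-2
AL = np.linalg.inv(PhiL.T @ PhiL + lam * np.eye(n_scene)) @ PhiL.T
AR = np.linalg.inv(PhiR.T @ PhiR + lam * np.eye(n_scene)) @ PhiR.T
X_hat = AL @ Y @ AR.T

print("relative reconstruction error:",
      np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))

The structure – measure everything through a coded mask, then computationally invert – is what lets the hardware shrink to a mask-and-sensor sandwich with no lens at all.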

Mar 21, 2022

AMD Releases Milan-X CPUs With 3D V-Cache: EPYC 7003 Up to 64 Cores and 768 MB L3 Cache

Posted by in categories: computing, futurism

There’s been a lot of focus on how both Intel and AMD are planning for the future in packaging their dies to increase overall performance and mitigate higher manufacturing costs. For AMD, that next step has been V-Cache, an additional L3 cache (SRAM) chiplet that’s designed to be 3D die stacked on top of an existing Zen 3 chiplet, tripling the total amount of L3 cache available. Today, AMD’s V-Cache technology is finally available to the wider market, as AMD is announcing that their EPYC 7003X “Milan-X” server CPUs have now reached general availability.

As first announced late last year, AMD is bringing its 3D V-Cache technology to the enterprise market through Milan-X, an advanced variant of its current-generation 3rd Gen Milan-based EPYC 7003 processors. AMD is launching four new processors ranging from 16 to 64 cores, all of them with Zen 3 cores and 768 MB of L3 cache via 3D-stacked V-Cache.

AMD’s Milan-X processors are an upgraded version of its current 3rd generation Milan-based processors, EPYC 7003. Adding to the preexisting Milan-based EPYC 7003 lineup, which we reviewed back in June last year, the most significant advancement in Milan-X is its large 768 MB of L3 cache, enabled by AMD’s 3D V-Cache stacking technology. The 3D V-Cache die uses TSMC’s N7 process node – the same node Milan’s Zen 3 chiplets are built upon – and measures 36 mm², stacking 64 MB of SRAM on top of the existing 32 MB found on each Zen 3 chiplet.
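
As a sanity check on the headline figure, 768 MB is simply the per-chiplet cache summed across the package: each Zen 3 die carries 32 MB of native L3 plus a 64 MB stacked V-Cache die. The quick tally below assumes the standard top-end Milan layout of eight compute dies.

# Back-of-the-envelope L3 tally for a top-end Milan-X part (assumes 8 CCDs).
ccds = 8                  # Zen 3 compute dies on the package
base_l3_per_ccd = 32      # MB of L3 native to each Zen 3 die
vcache_per_ccd = 64       # MB added by the stacked V-Cache die

total_l3 = ccds * (base_l3_per_ccd + vcache_per_ccd)
print(total_l3, "MB")     # -> 768 MB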

Mar 20, 2022

Transistor gate is just 0.3 nm long

Posted by in categories: computing, particle physics

“Moore’s law could once again get a reprieve, in spite of the naysayers.”


Using graphene and molybdenum disulphide, scientists in China have made a transistor gate with a length of only 0.3 nanometres – roughly the width of a single carbon atom – by exploiting the vertical dimension of the device.

In 1959, scientists at Bell Labs invented the metal–oxide–semiconductor field-effect transistor (MOSFET). This led to the mass production of transistors for a wide range of applications – including computer processors. The Intel 4004, the first commercially produced microprocessor, debuted in 1971 and featured 2,250 transistors on a single chip, using a 10,000 nm (10 µm) fabrication process.


Mar 19, 2022

Clockwork DevTerm R-01 Takes RISC-V Out For A Spin

Posted by in categories: computing, education

If you’re anything like us you’ve been keeping a close eye on the development of RISC-V: an open standard instruction set architecture (ISA) that’s been threatening to change the computing status quo for what seems like forever. From its humble beginnings as a teaching tool in Berkeley’s Parallel Computing Lab in 2010, it’s popped up in various development boards and gadgets from time to time. It even showed up in the 2019 Hackaday Supercon badge, albeit in FPGA form. But getting your hands on an actual RISC-V computer has been another story entirely. Until now, that is.

Clockwork has recently announced the availability of the DevTerm R-01, a variant of their existing portable computer that’s powered by a RISC-V module rather than the ARM chips featured in the earlier A04 and A06 models. Interestingly the newest member of the family is actually the cheapest at $239 USD, though it’s worth mentioning that not only does this new model only include 1 GB of RAM, but the product page makes it clear that the RISC-V version is intended for experienced penguin wranglers who aren’t afraid of the occasional bug.

Beyond the RISC-V CPU and slimmed-down main memory, this is the same DevTerm that our very own [Donald Papp] reviewed earlier this month. Thanks to the modular nature of the portable machine, this sort of component swapping is a breeze, though frankly we’re impressed that the Clockwork team is willing to go out on such a limb this early in the product’s life. In our first look at the device we figured at best they would release an updated CPU board to accommodate the Raspberry Pi 4 Compute Module, but supporting a whole new architecture is a considerably bolder move. One wonders what other plans they may have for the retro-futuristic machine. Perhaps a low-power x86 chip isn’t out of the question?

Mar 18, 2022

Human brain organoids grown in cheap 3D-printed bioreactor

Posted by in categories: computing, neuroscience

Circa 2021


It is now possible to grow and culture human brain tissue in a device that costs little more than a cup of coffee. With a $5 washable and reusable microchip, scientists can watch self-organising brain samples, known as brain organoids, growing in real time under a microscope.

The device, dubbed a “microfluidic bioreactor”, is a 4-by-6-centimetre chip that includes small wells in which the brain organoids grow. Each well is filled with nutrient-rich fluid that is pumped in and out automatically, like the fluids that flush through the human brain.


Mar 18, 2022

Future evolution: from looks to brains and personality, how will humans change in the next 10,000 years?

Posted by in categories: biotech/medical, computing, food, genetics, information science, mobile phones, neuroscience

And going forward, we’ll do this with far more knowledge of what we’re doing, and more control over the genes of our progeny. We can already screen ourselves and embryos for genetic diseases. We could potentially choose embryos for desirable genes, as we do with crops. Direct editing of the DNA of a human embryo has been proven to be possible — but seems morally abhorrent, effectively turning children into subjects of medical experimentation. And yet, if such technologies were proven safe, I could imagine a future where you’d be a bad parent not to give your children the best genes possible.

Computers also provide an entirely new selective pressure. As more and more matches are made on smartphones, we are delegating decisions about what the next generation looks like to the computer algorithms that recommend our potential matches. Digital code now helps choose what genetic code is passed on to future generations, just as it shapes what you stream or buy online. This might sound like dark science fiction, but it’s already happening. Our genes are being curated by computer, just like our playlists. It’s hard to know where this leads, but I wonder if it’s entirely wise to turn over the future of our species to iPhones, the internet and the companies behind them.

Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.

Mar 18, 2022

How Graphene will Save Moore’s Law

Posted by in categories: computing, materials

While many say that Moore’s Law is dead, scientists are hard at work discovering new semiconductor materials that could help increase CPU and GPU performance well into the 2030s, keeping pace with Moore’s Law’s exponential trajectory. Companies such as TSMC and Intel could use graphene to make the smallest possible transistors and greatly improve their efficiency as electricity prices skyrocket. Processors built on 2 nm or even 1 nm nodes might soon come out.
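
For readers who want to see what those “exponential properties” cash out to, Moore’s Law is usually stated as a doubling of transistor count roughly every two years. The toy projection below makes that concrete; the two-year doubling period and the 2022 baseline count are assumptions chosen for illustration, not figures from the video.

# Toy Moore's-Law projection: transistor count doubling every ~2 years.
def transistors(year, base_year=2022, base_count=50e9, doubling_years=2.0):
    """Projected transistors per chip, assuming a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (2022, 2026, 2030):
    print(year, f"{transistors(year):.1e}")   # roughly 5e10, 2e11, 8e11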

TIMESTAMPS:
00:00 The Revival of Moore’s Law.
01:15 Smallest Transistor ever made.
03:54 What actually are transistors?
05:49 Moore’s Law Is Dead?
07:55 Last Words.

#cpu #mooreslaw #graphene

Mar 18, 2022

This Diamond Transistor is Still Raw, But Its Future Looks Bright

Posted by in categories: computing, cosmology, quantum physics

Researchers in Japan have developed a diamond FET with high hole mobility.


In the 1970s, Stephen Hawking found that an isolated black hole would emit radiation, but only when quantum mechanics is taken into account. This is known as black hole evaporation, because the emitted radiation carries away mass and the black hole shrinks. However, this led to the black hole information paradox.

If the black hole evaporates entirely, the physical information that fell into it would permanently disappear. However, this violates a core precept of quantum physics: information cannot vanish from the Universe.
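
For context on why the hole shrinks, the standard semi-classical result ties the emission temperature inversely to the mass, so a radiating black hole gets hotter as it loses mass and evaporates ever faster, with a lifetime scaling as the cube of its initial mass. A textbook sketch of the relations (quoted from memory, not from the article):

T_H = \frac{\hbar c^{3}}{8 \pi G M k_{B}},
\qquad
t_{\mathrm{evap}} \sim \frac{5120 \pi G^{2} M^{3}}{\hbar c^{4}}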


Mar 18, 2022

The coming decade of digital brain research — A vision for neuroscience at the intersection of technology and computing

Posted by in categories: biotech/medical, computing, neuroscience

Brain research has in recent years indisputably entered a new epoch, driven by substantial methodological advances and digitally enabled data integration and modeling at multiple scales – from molecules to the whole system. Major advances are emerging at the intersection of neuroscience with technology and computing. This new science of the brain integrates high-quality basic research, systematic data integration across multiple scales, a new culture of large-scale collaboration and translation into applications. A systematic approach, as pioneered in Europe’s Human Brain Project (HBP), will be essential in meeting the pressing medical and technological challenges of the coming decade.