I recently got a call from my IT department asking why I was driving a significant amount of Azure spending in the past month. Before we were in the cloud, this type of question never came up. Rather, it was me asking IT for more servers to run my workloads. Whether or not I was using our on-premises computing resources was irrelevant; that is, until I ran out.
My experience is not at all unique. In our modern, post-cloud world, every organization has gone from unmetered, unfettered access to compute resources to a metered, easy-to-inspect, pay-by-the-second cloud spending nightmare. What we gained in endlessly scalable, elastic compute, we lost in our ability to run workloads without anyone watching. This new reality demands an elevated level of fiscal responsibility and shared ownership, especially as it relates to analytics.
The cloud computing pay-per-use model means organizations can no longer run workloads without considering the costs those workloads generate. It’s now imperative that organizations manage their cloud spending to stay competitive.
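To make the pay-per-use model concrete, here is a minimal back-of-the-envelope sketch of per-second billing in Python; the instance price, runtime, and instance count are made-up illustrative numbers, not real Azure rates.

    # Minimal sketch of per-second billing for a single workload.
    # The hourly rate and runtime below are hypothetical, not real Azure prices.
    RATE_PER_HOUR = 0.90      # assumed on-demand price for one VM, USD/hour
    SECONDS_PER_HOUR = 3600

    def workload_cost(runtime_seconds: float, instances: int = 1) -> float:
        """Cost of running `instances` VMs for `runtime_seconds`, billed per second."""
        return (RATE_PER_HOUR / SECONDS_PER_HOUR) * runtime_seconds * instances

    # Example: a nightly analytics job on 20 VMs for 45 minutes, 30 nights a month.
    per_run = workload_cost(45 * 60, instances=20)
    print(f"per run:   ${per_run:,.2f}")
    print(f"per month: ${per_run * 30:,.2f}")

Under the old on-premises model, a job like this was effectively invisible once the servers were bought; under per-second billing, every run shows up on the invoice.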
A perovskite-based device that combines aspects of electronics and photonics may open doors to new kinds of computer chips or quantum qubits.
MIT
MIT is an acronym for the Massachusetts Institute of Technology, a prestigious private research university in Cambridge, Massachusetts, founded in 1861. It is organized into five schools: architecture and planning; engineering; humanities, arts, and social sciences; management; and science. MIT’s impact includes many scientific breakthroughs and technological advances, and its stated goal is to make a better world through education, research, and innovation.
Particle-like quantum states called non-abelian anyons remember being swapped and could be useful for protecting information in quantum computers.
The benefits of smart cities will only be realised when digital infrastructures can cope, says Neil Cresswell, VIRTUS Data Centres. What’s the role of next-generation data centres?
Light is a key carrier of information. It enables high-speed data transmission around the world via fiber-optic telecommunication networks. This information-carrying capability can be extended to transmitting quantum information by encoding it in single particles of light (photons).
“To efficiently load single photons into quantum information processing devices, they must have specific properties: the right central wavelength or frequency, a suitable duration, and the right spectrum,” explains Dr. Michał Karpinski, head of the Quantum Photonics Laboratory at the Faculty of Physics of the University of Warsaw, and an author of the paper published in Nature Photonics.
Researchers around the globe are building prototypes of quantum computers using a variety of techniques, including trapped ions, quantum dots, superconducting electric circuits, and ultracold atomic clouds. These quantum information processing platforms operate on a variety of time scales, from picoseconds through nanoseconds to even microseconds.
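As a rough illustration of why a photon’s duration and spectrum are linked, the sketch below uses the transform-limited time-bandwidth product of a Gaussian wave packet (Δν·Δt ≈ 0.441, FWHM values) to estimate the minimum bandwidth at the time scales mentioned above; the specific durations are illustrative and not taken from the Nature Photonics paper.

    # Transform-limited bandwidth of a Gaussian pulse: delta_nu * delta_t ≈ 0.441 (FWHM).
    # Durations below are illustrative platform time scales, not values from the paper.
    TBP_GAUSSIAN = 0.441

    def transform_limited_bandwidth(duration_s: float) -> float:
        """Minimum spectral bandwidth (Hz, FWHM) for a Gaussian pulse of the given duration."""
        return TBP_GAUSSIAN / duration_s

    for label, duration in [("1 ps photon", 1e-12), ("1 ns photon", 1e-9), ("1 µs photon", 1e-6)]:
        print(f"{label}: ~{transform_limited_bandwidth(duration):.3e} Hz bandwidth")

A picosecond-scale photon therefore carries a spectrum hundreds of gigahertz wide, while longer photons are correspondingly narrower, which is why matching central wavelength, duration, and spectrum to a given platform matters.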
Quantum computing – “You’re gonna need a smarter IT team…”
• Quantum computing is expected to become a functioning reality in the next seven years.
• The IT sector already has a skills gap.
• Quantum computing is likely to add new skills to the shortage.
Quantum computing is expected to become a functioning reality within a generation, with many leading companies predicting it will be an adoptable technology by 2030. That’s going to make a significant difference to traditional IT teams, as quantum computing is likely to involve different problems, different solutions, and a methodology fairly new to what we think of as the IT team’s role.
First isolated in 2004, graphene has revolutionized various scientific fields. It possesses remarkable properties such as high electron mobility, mechanical strength, and thermal conductivity. Extensive time and effort have been invested in exploring its potential as a next-generation semiconductor material, leading to the development of graphene-based transistors, transparent electrodes, and sensors.
But to turn these devices into practical applications, efficient processing techniques are needed that can structure graphene films at the micrometer and nanometer scale. Micro/nanoscale material processing and device manufacturing typically rely on nanolithography and focused ion beam methods. However, these have posed longstanding challenges for laboratory researchers because they require large-scale equipment, lengthy manufacturing times, and complex operations.
In January 2023, Tohoku University researchers created a technique that could micro/nanofabricate silicon nitride devices with thicknesses ranging from 5 to 50 nanometers. The method employed a femtosecond laser, which emits extremely short, rapid pulses of light, and it turned out to be capable of quickly and conveniently processing thin materials without a vacuum environment.
Researchers from the Max Born Institute in Berlin have successfully performed X-ray Magnetic Circular Dichroism (XMCD) experiments in a laser laboratory for the first time.
Unlocking the secrets of magnetic materials requires the right illumination. X-ray magnetic circular dichroism makes it possible to decode magnetic order in nanostructures and to assign it to different layers or chemical elements. Researchers at the Max Born Institute in Berlin have succeeded in implementing this unique measurement technique in the soft X-ray range in a laser laboratory. With this development, many technologically relevant questions can now be investigated outside of large-scale scientific facilities for the first time.
Magnetic nanostructures have long been part of our everyday life, e.g., in the form of fast and compact data storage devices or highly sensitive sensors. A major contribution to the understanding of many of the relevant magnetic effects and functionalities is made by a special measurement method: X-ray Magnetic Circular Dichroism (XMCD).
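At its core, the XMCD signal is the difference in X-ray absorption measured with left- and right-circularly polarized light at an element’s absorption edge. The sketch below shows how such a dichroism signal and its normalized asymmetry could be computed from two absorption spectra; the spectra here are synthetic placeholder curves, not data from the Max Born Institute experiment.

    import numpy as np

    # Toy XMCD sketch: dichroism = absorption(left circular) - absorption(right circular).
    # The spectra are synthetic Gaussian curves, not measured data.
    energy = np.linspace(700, 716, 200)                  # photon energy (eV), near the Fe L3 edge
    line = np.exp(-0.5 * ((energy - 708.0) / 1.5) ** 2)  # common absorption line shape
    mu_plus, mu_minus = 1.00 * line, 0.85 * line         # absorption for the two helicities

    xmcd = mu_plus - mu_minus                            # dichroism signal
    asymmetry = xmcd / (mu_plus + mu_minus + 1e-12)      # normalized asymmetry (guard against /0)

    print(f"peak XMCD signal: {xmcd.max():.3f}")
    print(f"peak asymmetry:   {asymmetry.max():.3f}")

Because the dichroism appears only at element-specific absorption edges, the same measurement can attribute the magnetic signal to individual chemical elements or layers in a nanostructure.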
The first protein-based nano-computing agent that functions as a circuit has been created by Penn State researchers. The milestone puts them one step closer to developing next-generation cell-based therapies to treat diseases like diabetes and cancer.
Traditional synthetic biology approaches for cell-based therapies, such as ones that destroy cancer cells or encourage tissue regeneration after injury, rely on the expression or suppression of proteins that produce a desired action within a cell. This approach can take time (for proteins to be expressed and to degrade) and costs cellular energy in the process. A team of Penn State College of Medicine and Huck Institutes of the Life Sciences researchers is taking a different approach.
“We’re engineering proteins that directly produce a desired action,” said Nikolay Dokholyan, G. Thomas Passananti Professor and vice chair for research in the Department of Pharmacology. “Our protein-based devices or nano-computing agents respond directly to stimuli (inputs) and then produce a desired action (outputs).”
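As a purely illustrative analogy for a protein-based agent that maps inputs to outputs like a circuit, the toy sketch below models a two-input logic gate; the stimulus names and the AND-style logic are hypothetical and are not taken from the Penn State device.

    # Toy analogy only: a two-input gate that produces an output action
    # when both stimuli are present. The stimuli and logic are hypothetical,
    # not the actual inputs used in the Penn State nano-computing agent.
    def gate_response(stimulus_a: bool, stimulus_b: bool) -> bool:
        """Return True (produce the output action) only if both inputs are present."""
        return stimulus_a and stimulus_b

    for a in (False, True):
        for b in (False, True):
            print(f"input A={a!s:<5} input B={b!s:<5} -> output={gate_response(a, b)}")

The appeal of such direct input-to-output behavior, as described above, is that it avoids waiting for proteins to be expressed or to degrade.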