Jun 28, 2022
Why changing your mind is a feature of evolution, not a bug
Posted by Kelvin Dafiaghor in category: evolution
The ability to change your mind is a key part of being a social creature. Reasoning by oneself is a shallow endeavor.
Our bodies can’t plug-and-play organs like replacement computer parts. The first rule of organ transplant is that the donor organs need to “match” with the host to avoid rejection. That is, the protein molecules that help the body discriminate between self and other need to be similar—a trait common (but not guaranteed) among members of the same family.
The key for getting an organ to “take” is reducing destructive immune attacks—the holy grail in transplantation. One idea is to genetically engineer the transplanted organ so that it immunologically “fits” better with the recipient. Another idea is to look beyond the organ itself to the source of rejection: haemopoietic stem cells, nestled inside the bone marrow, that produce blood and immune cells.
The idea behind DISOT is simple but clever: replace the recipient’s immune system with the donor’s, then transplant the organ. The recipient’s bone marrow is destroyed, but quickly repopulates with the donor’s stem cells. Once the new immune system takes over, the organ goes in.
Sanctuary AI says building human-level artificial intelligence that can execute human tasks safely requires a deep understanding of the living mind. Hello World’s Ashlee Vance heads to Vancouver to see the startup’s progress toward bringing robots to life.
Life (as we know it) is based on carbon. Despite its ubiquity, this important element still holds plenty of secrets, on earth and in the heavens above us. For example, astrophysicists like Columbia’s Daniel Wolf Savin who study interstellar clouds want to understand how the chemicals, including carbon, swirling within these nebulous aggregations of gas and dust form the stars and planets that dot our universe and give rise to organic life.
Yet when faced with enormous protein complexes, AI faltered. Until now. In a mind-bending feat, a new algorithm deciphered the structure at the heart of inheritance—a massive complex of roughly 1,000 proteins that helps channel DNA instructions to the rest of the cell. The AI model builds on AlphaFold from DeepMind and RoseTTAFold from Dr. David Baker’s lab at the University of Washington, both of which were released to the public for further experimentation.
Our genes are housed in a planet-like structure, dubbed the nucleus, for protection. The nucleus is a high-security castle: only specific molecules are allowed in and out to deliver DNA instructions to the outside world—for example, to protein-making factories in the cell that translate genetic instructions into proteins.
Never mind the cost of computing: OpenAI said the Upwork contractors alone cost $160,000. Though to be fair, manually labeling the whole data set would’ve run into the millions and taken considerable time to complete. And while the computing power wasn’t negligible, the model was actually quite small. VPT’s hundreds of millions of parameters are orders of magnitude fewer than GPT-3’s hundreds of billions.
Still, the drive to find clever new approaches that use less data and computing is valid. A kid can learn Minecraft basics by watching one or two videos. Today’s AI requires far more to learn even simple skills. Making AI more efficient is a big, worthy challenge.
In any case, OpenAI is in a sharing mood this time. The researchers say VPT isn’t without risk—they’ve strictly controlled access to algorithms like GPT-3 and DALL-E partly to limit misuse—but the risk is minimal for now. They’ve open sourced the data, environment, and algorithm and are partnering with MineRL. This year’s contestants are free to use, modify, and fine-tune the latest in Minecraft AI.
The gene-editing technology has led to innovations in medicine, evolution and agriculture — and raised profound ethical questions about altering human DNA.
Researchers at the University of New South Wales and a startup company, Silicon Quantum Computing, published results of their quantum dot experiments. The circuits use up to 10 carbon-based quantum dots on a silicon substrate. Metal gates control the flow of electrons. The paper appears in Nature and you can download the full paper from there.
What’s new about this is that the dots are precisely arranged to simulate an organic compound, polyacetylene. This allowed researchers to model the actual molecule. Simulating molecules is important in the study of exotic matter phases, such as superconductivity. The interaction of particles inside, for example, a crystalline structure is difficult to simulate using conventional methods. By building a model using quantum techniques on the same scale and with the same topology as the molecule in question, simulation is simplified.
The SSH (Su-Schrieffer-Heeger) model describes a single electron moving along a one-dimensional lattice with staggered tunnel couplings. At least, that’s what the paper says and we have to believe it. Creating such a model for simple systems has been feasible, but for a “many body” problem, conventional computing just isn’t up to the task. Currently, the 10-dot model is right at the limit of what a conventional computer can simulate reasonably. The team plans to build a 20-dot circuit that would allow for unique simulations not feasible with classic computing tech.
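For readers who want a concrete picture, here is a minimal sketch of the single-particle SSH Hamiltonian: a 10-site chain with alternating (“staggered”) hopping amplitudes. The values t1 and t2 are illustrative assumptions, not the couplings used in the actual device; diagonalizing the matrix shows the two-band spectrum and, in the weak-bond-first arrangement, a pair of near-zero-energy edge states.

```python
# Minimal SSH (Su-Schrieffer-Heeger) sketch: a 1D chain with alternating
# hopping amplitudes t1 and t2, the kind of model the 10-dot chain maps onto.
# t1 and t2 below are illustrative values, not the experiment's couplings.
import numpy as np

def ssh_hamiltonian(n_sites: int, t1: float, t2: float) -> np.ndarray:
    """Single-particle SSH Hamiltonian for an open chain of n_sites sites."""
    H = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        t = t1 if i % 2 == 0 else t2       # hoppings alternate t1, t2, t1, ...
        H[i, i + 1] = H[i + 1, i] = -t
    return H

# Weak bond first (t1 < t2): the "topological" arrangement, which produces
# two states pinned near zero energy at the ends of the chain.
energies = np.linalg.eigvalsh(ssh_hamiltonian(n_sites=10, t1=0.5, t2=1.0))
print(np.round(energies, 3))
```

In the device itself, the staggered couplings come from how far apart the dots are placed, mirroring the alternating single and double carbon bonds of polyacetylene.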
Processing boron nitride nanotubes has long been challenging, but researchers at Rice University say that is no longer the case.
Professors Matteo Pasquali and Angel Martí, along with their team of researchers, have simplified the handling of the highly valuable nanotubes, making them more suited for use in large-scale applications including electronics, aerospace, and energy-efficient materials.
According to the study’s findings, published in Nature Communications, boron nitride nanotubes, also known as BNNTs, can self-assemble into liquid crystals under the right conditions—specifically, when dispersed in chlorosulfonic acid at concentrations greater than 170 parts per million by weight.
Transistors are the building blocks of modern electronics, used in everything from televisions to laptops. As transistors have gotten smaller and more compact, so have electronics, which is why your cell phone is a super powerful computer that fits in the palm of your hand.
But there’s a scaling problem: Transistors are now so small that they are difficult to turn off. A key device element is the channel that charge carriers (such as electrons) travel across between electrodes. If that channel gets too short, quantum effects allow electrons to effectively jump from one side to another even when they shouldn’t.
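A back-of-the-envelope sketch (not from the article) shows why this is such a hard wall: in the simplest square-barrier WKB picture, the tunneling probability falls off as exp(−2κL), so leakage grows by many orders of magnitude as the channel length L shrinks by just a few nanometers. The 0.5 eV barrier height and free-electron mass below are illustrative assumptions.

```python
# Rough WKB estimate of electron tunneling through a square barrier of length L.
# Barrier height (0.5 eV) and free-electron mass are illustrative, not device values.
import math

HBAR = 1.054571817e-34    # reduced Planck constant, J*s
M_E = 9.1093837015e-31    # free-electron mass, kg
EV = 1.602176634e-19      # joules per electronvolt

def tunneling_probability(barrier_eV: float, length_nm: float) -> float:
    """T ~ exp(-2*kappa*L) with kappa = sqrt(2*m*V0)/hbar (electron energy << V0)."""
    kappa = math.sqrt(2 * M_E * barrier_eV * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2 * kappa * length_nm * 1e-9)

for L in (5.0, 2.0, 1.0):  # channel lengths in nanometers
    print(f"L = {L} nm -> T ~ {tunneling_probability(0.5, L):.1e}")
```

With these toy numbers, shortening the channel from 5 nm to 1 nm raises the tunneling probability by roughly twelve orders of magnitude, which is why simply making transistors shorter stops working and better electrostatic control of the channel becomes essential.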
One way to get past this sizing roadblock is to use layers of 2D materials—which are only a single atom thick—as the channel. Atomically thin channels can help enable even smaller transistors by making it harder for the electrons to jump between electrodes. One well-known example of a 2D material is graphene, whose discoverers won the Nobel Prize in Physics in 2010. But there are other 2D materials, and many believe they are the future of transistors, with the promise of scaling channel thickness down from its current 3D limit of a few nanometers (nm, billionths of a meter) to less than a single nanometer.