Discusses femtotech and the technological possibilities it may unlock. Not long ago nanotechnology was a fringe topic; now it’s a flourishing engineering field, and fairly mainstream. For example, while writing this article, I happened to receive an email advertisement for the “Second World Conference on Nanomedicine and Drug Delivery” in Kerala, India. It wasn’t so long ago that nanomedicine seemed merely a flicker in the eyes of Robert Freitas and a few other visionaries!

But nano is not as small as it gets. A nanometer is 10⁻⁹ meters – the scale of atoms and molecules. A water molecule is a bit less than one nanometer long, and a germ is around a thousand nanometers across. On the other hand, a proton has a diameter of a couple of femtometers – where a femtometer, at 10⁻¹⁵ meters, makes a nanometer seem positively gargantuan. Now that the viability of nanotech is widely accepted (in spite of some ongoing heated debates about the details), it’s time to ask: what about femtotech? Picotech or other technologies at the scales between nano and femto seem relatively uninteresting, because we don’t know of any basic constituents of matter that exist at those scales. But femtotech, based on engineering structures from subatomic particles, makes perfect conceptual sense, though it’s certainly difficult given current technology.
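
To make the jump in scale concrete, here is the back-of-the-envelope comparison implied above (simple arithmetic spelled out for clarity, not the article’s own notation):

```latex
1\,\mathrm{nm} = 10^{-9}\,\mathrm{m}, \qquad
1\,\mathrm{fm} = 10^{-15}\,\mathrm{m}, \qquad
\frac{1\,\mathrm{nm}}{1\,\mathrm{fm}} = \frac{10^{-9}}{10^{-15}} = 10^{6}
```

So a femtometer stands to a nanometer roughly as a nanometer stands to a millimeter: six orders of magnitude smaller.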

The nanotech field was arguably launched by Richard Feynman’s 1959 talk “There’s Plenty of Room at the Bottom.” As Feynman wrote there:

“It is a staggeringly small world that is below. In the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction.”

Hello and welcome to What Da Math!
In this video, we will talk about alien life.

Links: http://www.qsl.net/pa2ohh/jsffield.htm
http://www.seti.org.au/spacecom/setionabudget.html
https://en.wikipedia.org/wiki/Link_budget
https://en.wikipedia.org/wiki/List_of_interstellar_radio_messages
http://www.spaceacademy.net.au/spacelink/spcomcalc.htm
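
Several of the links above deal with link budgets for interstellar communication. As a rough illustration of what those calculator pages compute, here is a minimal sketch using the standard free-space path loss formula; the transmitter power, antenna gains, frequency, and distance below are assumptions chosen for the example, not figures from the video.

```python
import math

# Illustrative link-budget sketch (all values are assumptions, not from the video):
# received power [dBW] = transmit power + antenna gains - free-space path loss.

def free_space_path_loss_db(distance_m: float, frequency_hz: float) -> float:
    """Free-space path loss, FSPL = 20*log10(4*pi*d*f/c), in dB."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

# Hypothetical example: a 1 MW transmitter into a 60 dBi dish at 1.42 GHz
# (the hydrogen line), received by another 60 dBi dish 10 light-years away.
light_year_m = 9.4607e15
d = 10 * light_year_m
f = 1.42e9

p_tx_dbw = 10 * math.log10(1e6)  # 1 MW expressed in dBW
g_tx_dbi = 60.0
g_rx_dbi = 60.0

fspl = free_space_path_loss_db(d, f)
p_rx_dbw = p_tx_dbw + g_tx_dbi + g_rx_dbi - fspl
print(f"Free-space path loss: {fspl:.1f} dB")
print(f"Received power: {p_rx_dbw:.1f} dBW")
```

With these assumed numbers the path loss is about 375 dB and the received power lands near −195 dBW, which is why the pages above put so much emphasis on antenna gain and receiver sensitivity.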

Clues to a black hole’s origins can be found in the way it spins. This is especially true for binaries, in which two black holes circle close together before merging. The spin and tilt of the respective black holes just before they merge can reveal whether the invisible giants arose from a quiet galactic disk or a more dynamic cluster of stars.

Astronomers are hoping to tease out which of these origin stories is more likely by analyzing the 69 confirmed binaries detected to date. But a new study finds that, for now, the current catalog of binaries is not enough to reveal anything fundamental about how black holes form.

In a study appearing today in the journal Astronomy and Astrophysics, MIT physicists show that when all the known binaries and their spins are worked into models of black hole formation, the conclusions can look very different, depending on the particular model used to interpret the data.
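
The reasoning can be pictured with a toy comparison (a hedged sketch of the general idea, not the MIT team’s actual models): black holes that evolve together in a quiet galactic disk tend to keep their spins roughly aligned with the orbital axis, while black holes paired up dynamically in a dense cluster end up with essentially random spin tilts.

```python
import math
import random

# Toy sketch of two formation channels (illustrative only; the tilt scatter for
# the isolated channel is an assumed value, not taken from the study).
random.seed(0)

def isolated_tilt_deg() -> float:
    """Isolated-binary channel: spins nearly aligned with the orbit (small tilt)."""
    return abs(random.gauss(0.0, 10.0))

def dynamical_tilt_deg() -> float:
    """Dynamical (cluster) channel: spin directions are isotropic."""
    # For an isotropic direction, cos(tilt) is uniform on [-1, 1].
    return math.degrees(math.acos(random.uniform(-1.0, 1.0)))

n = 10_000
iso = [isolated_tilt_deg() for _ in range(n)]
dyn = [dynamical_tilt_deg() for _ in range(n)]

print(f"mean spin tilt, isolated channel:  {sum(iso) / n:5.1f} deg")
print(f"mean spin tilt, dynamical channel: {sum(dyn) / n:5.1f} deg")
# With only a few dozen noisy real measurements, distinguishing these two
# underlying distributions is hard -- the point the study above makes.
```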

For decades, astronomers and physicists have been trying to solve one of the deepest mysteries about the cosmos: An estimated 85% of its mass is missing. Numerous astronomical observations indicate that the visible mass in the universe is not nearly enough to hold galaxies together and account for how matter clumps. Some kind of invisible, unknown type of subatomic particle, dubbed dark matter, must provide the extra gravitational glue.

In underground laboratories and at particle accelerators, scientists have been searching for this dark matter, without success, for more than 30 years. Researchers at NIST are now exploring new ways to search for the invisible particles. In one study, a prototype for a much larger experiment, researchers have used state-of-the-art superconducting detectors to hunt for dark matter.

The study has already placed new limits on the possible mass of one type of hypothesized dark matter. Another NIST team has proposed that trapped electrons, commonly used to measure properties of ordinary particles, could also serve as highly sensitive detectors of hypothetical dark matter particles, provided those particles carry an electric charge.

Non-metal nitrides are compounds in which nitrogen and non-metallic elements are linked by covalent bonds. Because of their technologically interesting properties, they have increasingly become the focus of materials research. In Chemistry—A European Journal, an international team with researchers from the University of Bayreuth presents previously unknown phosphorus-nitrogen compounds synthesized under very high pressures.

They contain structural units whose existence could not be empirically proven before. The study exemplifies the great, as yet untapped potential of high-pressure research for nitrogen chemistry.

The researchers succeeded in synthesizing a previously unknown modification of the phosphorus nitride P₃N₅, the polymorph δ-P₃N₅, at a pressure of 72 gigapascals. At 134 gigapascals, the phosphorus nitride PN₂ formed in the diamond anvil cell. Both compounds are classified as ultra-incompressible materials, with bulk moduli above 320 GPa.
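
For context on the quoted figures, the bulk modulus is the standard measure of a material’s resistance to uniform compression (a general definition, not something specific to this study):

```latex
K = -V \left( \frac{\partial P}{\partial V} \right)_{T}
```

Diamond, the usual benchmark for incompressibility, has a bulk modulus of roughly 440 GPa, so values above 320 GPa place these phosphorus nitrides firmly in the ultra-incompressible class.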

The Large Hadron Collider Beauty (LHCb) experiment at CERN is the world’s leading experiment in quark flavor physics with a broad particle physics program. Its data from Runs 1 and 2 of the Large Hadron Collider (LHC) has so far been used for over 600 scientific publications, including a number of significant discoveries.

While all scientific results from the LHCb collaboration are already publicly available through open access papers, the data used by the researchers to produce these results is now accessible to anyone in the world through the CERN open data portal. The data release is made in the context of CERN’s Open Science Policy, reflecting the values of transparency and international collaboration enshrined in the CERN Convention for more than 60 years.

“The data collected at LHCb is a unique legacy to humanity, especially since no other experiment covers the region LHCb looks at,” says Sebastian Neubert, leader of the LHCb open data project. “It has been obtained through a huge international collaborative effort, which was funded by the public. Therefore the data belongs to society.”
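
As an illustration of what “accessible to anyone” can look like in practice, here is a hedged sketch of fetching record metadata from the portal programmatically. The JSON endpoint, the record id, and the metadata field names below are assumptions made for the example; the actual record ids for the LHCb release are listed on opendata.cern.ch.

```python
import json
import urllib.request

# Hedged sketch: query metadata for one record on the CERN Open Data portal.
# The endpoint, record id, and field names are assumptions for illustration.
RECORD_ID = 12345  # hypothetical record id
url = f"https://opendata.cern.ch/api/records/{RECORD_ID}"

with urllib.request.urlopen(url) as response:
    record = json.load(response)

metadata = record.get("metadata", {})
print(metadata.get("title"))
print(metadata.get("abstract", {}).get("description", ""))
```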

To perform coordinated movements, we rely on special sensory neurons in our muscles and joints. Without them, the brain wouldn’t know what the rest of our body was doing. A team led by Niccolò Zampieri has studied their molecular markers to better understand how they work and describes the results in Nature Communications.

Sight, hearing, smell, taste, touch: We’re all familiar with the five senses that allow us to experience our surroundings.

Equally important but much less well known is the sixth sense: “Its job is to collect information from the muscles and joints about our movements, our posture and our position in space, and then pass that on to our central nervous system,” says Dr. Niccolò Zampieri, head of the Development and Function of Neural Circuits Lab at the Max Delbrück Center in Berlin. “This sense, known as proprioception, is what allows the central nervous system to send the right signals through to muscles so that we can perform a specific movement.”

Elon Musk told Twitter’s founder Jack Dorsey that important data had allegedly been hidden from the former CEO during his tenure at the helm of the social media company, after Mr Dorsey had called for “full transparency” around the so-called “Twitter Files”.

On Wednesday, Mr Dorsey responded to a tweet from Mr Musk and asked him to publish all data from the microblogging platform, uncensored, in a Wikileaks-style dump.

“If the goal is transparency to build trust, why not just release everything without filter and let people judge for themselves? Including all discussions around current and future actions?” tweeted Mr Dorsey. “Make everything public now.”

The software system competed against human coders in programming contests.

A novel system called AlphaCode uses artificial intelligence (AI) to create computer code, and it has recently taken part in programming competitions that demand critical thinking, algorithm design, and natural language comprehension. In those contests the system performed at roughly the level of a median human competitor, with an estimated ranking in the top 54% of participants.


AlphaCode can create code quickly and efficiently

AlphaCode is an AI software system created by DeepMind, a subsidiary of Alphabet, Google’s parent company. The software generates candidate programs in Python or C++ and filters out the flawed ones, and it can produce those candidates at an exceptional rate.
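
A minimal sketch of the sample-and-filter idea described above (an illustration of the general technique, not DeepMind’s actual pipeline): generate many candidate programs, run each one against the example input/output pairs from the problem statement, and keep only the candidates whose output matches.

```python
import os
import subprocess
import sys
import tempfile

# Illustrative filter step: a candidate (a Python source string) survives only
# if it reproduces the expected output on every example test case.

def passes_examples(candidate_source: str, examples: list[tuple[str, str]]) -> bool:
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(candidate_source)
        path = f.name
    try:
        for given_input, expected_output in examples:
            result = subprocess.run(
                [sys.executable, path],
                input=given_input,
                capture_output=True,
                text=True,
                timeout=2,
            )
            if result.stdout.strip() != expected_output.strip():
                return False
        return True
    finally:
        os.unlink(path)

# Hypothetical candidates for a toy problem: "print the sum of two integers".
candidates = [
    "a, b = map(int, input().split())\nprint(a - b)\n",  # wrong
    "a, b = map(int, input().split())\nprint(a + b)\n",  # correct
]
examples = [("2 3", "5"), ("10 -4", "6")]

survivors = [c for c in candidates if passes_examples(c, examples)]
print(f"{len(survivors)} of {len(candidates)} candidates passed the example tests")
```

In the real system the hard part is generating promising candidates in the first place, but a filtering step like the one above is what turns a flood of generated programs into a short list worth submitting.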