A new paper shows that recent exoplanet discoveries, combined with a revised Drake equation, produce a new, empirically grounded probability that any other advanced civilizations have ever existed.
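The classic Drake equation multiplies a chain of factors to estimate the expected number of detectable civilizations. A minimal sketch of that arithmetic is below; the parameter values are illustrative placeholders, not the figures from the paper.

```python
# Classic Drake equation: N = R* * fp * ne * fl * fi * fc * L
# All parameter values below are illustrative, not the paper's estimates.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Expected number of detectable civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Hypothetical inputs: star formation rate, fraction with planets,
# habitable planets per system, and the life/intelligence/communication
# fractions, times a civilization lifetime in years.
N = drake(R_star=1.5, f_p=1.0, n_e=0.2, f_l=0.1, f_i=0.01, f_c=0.1, L=10_000)
print(N)  # ~0.3 with these made-up numbers
```

The revised approach in the paper effectively inverts this logic, asking how pessimistic the combined probabilities would have to be for humanity to be the only technological species ever.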
Cambridge University spin-out Optalysys has been awarded a $350k grant from the US Defense Advanced Research Projects Agency (DARPA) for a 13-month project. The project will see the company advance its research into developing and applying its optical co-processing technology to solving complex mathematical equations. These equations are relevant to large-scale scientific and engineering simulations such as weather prediction and aerodynamics.
The Optalysys technology is extremely energy efficient, using light rather than electricity to perform intensive mathematical calculations. The company aims to provide existing computer systems with massively boosted processing capabilities, eventually reaching exaFLOP rates (a billion billion calculations per second). The technology operates at a fraction of the energy cost of conventional high-performance computers (HPCs) and has the potential to run orders of magnitude faster.
In April 2015 Optalysys announced that it had successfully built a scalable, lens-less optical processing prototype that can perform mathematical functions. Codenamed Project GALELEO, the device demonstrates that second-order derivatives and correlation pattern matching can be performed optically in a scalable design.
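Optical correlators exploit the fact that correlation becomes a simple product in the Fourier domain, with lenses performing the transform at the speed of light. A digital analogue of that same math, sketched here with NumPy FFTs purely for illustration (not Optalysys's actual implementation), shows why pattern matching maps so naturally onto optics:

```python
import numpy as np

# Digital analogue of an optical correlator: correlation computed as a
# pointwise product in the Fourier domain (illustrative sketch only).
def correlate_2d(image, template):
    F = np.fft.fft2(image)
    G = np.fft.fft2(template, s=image.shape)
    return np.real(np.fft.ifft2(F * np.conj(G)))

img = np.zeros((8, 8))
img[3:5, 3:5] = 1.0          # a small square "pattern"
corr = correlate_2d(img, img)  # autocorrelation
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # autocorrelation peaks at zero shift: (0, 0)
```

In a real optical system the FFT steps are done by lenses rather than arithmetic, which is where the energy savings come from.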
Nice; however, I also see 3D printing, along with machine learning, becoming part of cosmetic procedures and surgeries.
With an ever-increasing volume of electronic data being collected by the healthcare system, researchers are exploring the use of machine learning—a subfield of artificial intelligence—to improve medical care and patient outcomes. An overview of machine learning and some of the ways it could contribute to advancements in plastic surgery are presented in a special topic article in the May issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS).
“Machine learning has the potential to become a powerful tool in plastic surgery, allowing surgeons to harness complex clinical data to help guide key clinical decision-making,” write Dr. Jonathan Kanevsky of McGill University, Montreal, and colleagues. They highlight some key areas in which machine learning and “Big Data” could contribute to progress in plastic and reconstructive surgery.
Machine Learning Shows Promise in Plastic Surgery Research and Practice
Machine learning analyzes historical data to develop algorithms capable of knowledge acquisition. Dr. Kanevsky and coauthors write, “Machine learning has already been applied, with great success, to process large amounts of complex data in medicine and surgery.” Projects with healthcare applications include the IBM Watson Health cognitive computing system and the American College of Surgeons’ National Surgical Quality Improvement Program.
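The core idea the authors describe, algorithms learning decision rules from historical records, can be sketched with a minimal logistic regression. The features, data, and outcome below are entirely hypothetical and only illustrate the pattern, not any model from the article:

```python
import math

# Minimal sketch: a logistic model "learned" from hypothetical historical
# surgical records, predicting a binary complication outcome.
def train(X, y, lr=0.1, epochs=2000):
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            g = p - yi                        # gradient of log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Made-up records: [age/100, smoker] -> complication (1) or not (0)
X = [[0.3, 0], [0.4, 0], [0.7, 1], [0.8, 1]]
y = [0, 0, 1, 1]
w, b = train(X, y)
print(round(predict(w, b, [0.75, 1]), 2))  # high predicted risk expected
```

Real clinical models would of course use far richer data and validation, but the learn-from-history loop is the same.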
Because of the pace at which quantum computing is developing, NIST is rushing to create quantum-proof cryptographic algorithms to prevent QC hacking. As I have stated, I believe we're now less than 7 years away from QC being in many mainstream devices, infrastructure, etc. And with China and its partnership with Australia, the race is now on and hotter than ever.
The National Institute of Standards and Technology has begun to look into quantum cybersecurity, according to a new report that details and plans out ways scientists could protect against these futuristic computers.
April 29, 2016.
Ransomware has taken off in 2016, with attacks already eclipsing the numbers observed in a recently published threat report from Symantec.
A post-quantum cryptography discussion in Tacoma, WA on May 5th will cover hacking by QC attackers and the cryptographic algorithms that could offset those attacks; it may be of interest to sit in and even join the debates. I will try to attend if I can, because it would be interesting to see the arguments raised and the responses.
The University of Washington Tacoma Institute of Technology will present a discussion about the esoteric field of post-quantum cryptography at the Northwest Cybersecurity Symposium on May 5.
“I’ve been researching post-quantum cryptography for years, finding ways to protect against a threat that doesn’t yet exist,” said Anderson Nascimento, assistant professor of computer science at the institute, in a release.
Post-quantum cryptography refers to encryption that would be secure against an attack by a quantum computer — a kind of supercomputer using quantum mechanics, which, so far, exists only in theory.
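One widely cited back-of-the-envelope rule (my illustration, not from the article): Grover's quantum search algorithm finds a key among N candidates in roughly sqrt(N) steps, so a k-bit symmetric key offers only about k/2 bits of security against a quantum attacker, while Shor's algorithm breaks RSA and elliptic-curve schemes outright.

```python
# Rough sketch of Grover's impact on symmetric keys: searching N = 2^k
# keys takes ~sqrt(N) = 2^(k/2) quantum steps, halving effective security.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: ~{grover_effective_bits(k)} bits vs a quantum attacker")
```

This is why post-quantum guidance tends to recommend doubling symmetric key lengths while replacing public-key algorithms entirely.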
“[Using DNA,] you could fit all the knowledge in the whole world inside the trunk of your car,” Twist Bioscience CEO Emily Leproust told TechCrunch.
Twist Bioscience, a startup making and using synthetic DNA to store digital data, just struck a contract with Microsoft and the University of Washington to encode vast amounts of information on synthetic genes.
Big data means business, and a company able to gather a lot of it is very valuable to investors and stockholders. But that data needs to be stored somewhere, and its upkeep can cost a lot.
Digital data stored on media also has a finite shelf life. But over the last few years researchers have discovered new places to stuff digital information – including our DNA, which can remain intact for thousands of years.
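The simplest encoding maps each pair of bits to one of DNA's four bases, giving two bits per nucleotide. The toy sketch below shows that mapping only; production pipelines like the Microsoft/UW work add error correction and avoid problematic sequences such as long runs of one base.

```python
# Illustrative 2-bits-per-base DNA encoding (toy sketch, not the actual
# Twist/Microsoft/UW scheme, which adds error correction and constraints).
B2N = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
N2B = {v: k for k, v in B2N.items()}

def encode(data: bytes) -> str:
    bits = ''.join(f'{byte:08b}' for byte in data)
    return ''.join(B2N[bits[i:i+2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    bits = ''.join(N2B[n] for n in dna)
    return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

seq = encode(b'hi')
print(seq, decode(seq))  # CGGACGGC b'hi'
```

At two bits per base, a gram of DNA can in principle hold on the order of hundreds of petabytes, which is where the "world's knowledge in a car trunk" framing comes from.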
Supercomputer facing problems?
In the world of High Performance Computing (HPC), supercomputers represent the peak of capability, with performance measured in petaFLOPs (10^15 operations per second). They play a key role in climate research, drug research, oil and gas exploration, cryptanalysis, and nuclear weapons development. But after decades of steady improvement, changes are coming as old technologies start to run into fundamental problems.
When you’re talking about supercomputers, a good place to start is the TOP500 list. Published twice a year, it ranks the world’s fastest machines based on their performance on the Linpack benchmark, which solves a dense system of linear equations using double precision (64 bit) arithmetic.
Looking down the list, you soon run into some numbers that boggle the mind. The Tianhe-2 (Milky Way-2), a system deployed at the National Supercomputer Center in Guangzhou, China, is the number one system as of November 2015, a position it has held since 2013. Running Linpack, it clocks in at 33.86 × 10^15 floating point operations per second (33.86 PFLOPS).
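To get a feel for these numbers: solving a dense n×n linear system, as Linpack does, costs about (2/3)n³ + 2n² floating point operations. A rough sketch (my arithmetic, with an illustrative problem size, not the actual benchmark run) of how long that takes at Tianhe-2's measured rate:

```python
# Standard operation count for dense LU solve, as used by the Linpack/HPL
# benchmark: roughly (2/3)*n^3 + 2*n^2 floating point operations.
def linpack_flops(n: int) -> float:
    return (2 / 3) * n**3 + 2 * n**2

rate = 33.86e15       # Tianhe-2's measured Linpack rate, FLOP/s
n = 10_000_000        # illustrative problem size, not the benchmark's
seconds = linpack_flops(n) / rate
print(f"{seconds / 3600:.1f} hours")  # ~5.5 hours
```

Even at tens of petaFLOPs, a single huge dense solve is an hours-long job, which is why Linpack runs on top machines take most of a day.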
Good article overall, highlighting the gaps in AI talent. I do know that some of the best AI SMEs in the US have worked at some point in their careers at the US National Labs, because many of us had to build "real-time" systems that leveraged complex algorithms to self-monitor and react independently when certain conditions arose, and in some cases we leveraged the supercomputers to prove theories as well. I suggest locating where some of these folks are, because that is where you will find your talent pool.
Artificial Intelligence is the field where jobs continue to grow, provided you have the desired skill sets
Diksha Gupta, Techgig.com
Artificial intelligence (AI) is the buzzword in almost all industries. Decision-makers want to make use of massive data they get from various sources. This is where data analytics and artificial intelligence come into play.
Nice; taking design and manufacturing to new levels.
Advanced materials are increasingly embodying counterintuitive properties, such as extreme strength and super lightness, while additive manufacturing and other new technologies are vastly improving the ability to fashion these novel materials into shapes that would previously have been extremely costly or even impossible to create. Generating new designs that fully exploit these properties, however, has proven extremely challenging. Conventional design technologies, representations, and algorithms are inherently constrained by outdated presumptions about material properties and manufacturing methods. As a result, today’s design technologies are simply not able to bring to fruition the enormous level of physical detail and complexity made possible with cutting-edge manufacturing capabilities and materials.
To address this mismatch, DARPA today announced its TRAnsformative DESign (TRADES) program. TRADES is a fundamental research effort to develop new mathematics and algorithms that can more fully take advantage of the almost boundless design space that has been enabled by new materials and fabrication methods.
“The structural and functional complexities introduced by today’s advanced materials and manufacturing methods have exceeded our capacity to simultaneously optimize all the variables involved,” said Jan Vandenbrande, DARPA program manager. “We have reached the fundamental limits of what our computer-aided design tools and processes can handle, and need revolutionary new tools that can take requirements from a human designer and propose radically new concepts, shapes and structures that would likely never be conceived by even our best design programs today, much less by a human alone.”