
Scientists at the California Institute of Technology can now assess a person’s intelligence in moments with nothing more than a brain scan and an AI algorithm, university officials announced this summer.

Caltech researchers led by Ralph Adolphs, PhD, a professor of psychology, neuroscience and biology and chair of the Caltech Brain Imaging Center, reported in a recent study that they, alongside colleagues at Cedars-Sinai Medical Center and the University of Salerno, successfully predicted IQ in hundreds of participants from fMRI scans of resting-state brain activity. The work is pending publication in the journal Philosophical Transactions of the Royal Society.

Adolphs and his team used data from nearly 900 men and women for their research, all of whom were part of the National Institutes of Health (NIH)-driven Human Connectome Project. The researchers trained their machine learning algorithm on the complexities of the human brain by feeding it the brain scans and intelligence scores of these hundreds of participants, a process that required very little effort on the participants' end.
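The study's exact pipeline is in the paper, but the general recipe (predict a score from resting-state functional-connectivity features with a regularized model, evaluated on held-out subjects) can be sketched roughly as below. The parcellation size, feature construction and model choice here are assumptions for illustration, not the authors' method, and the data are synthetic.

```python
# Illustrative sketch only: predict an IQ-like score from resting-state
# functional-connectivity features with cross-validated ridge regression.
# Synthetic data stands in for the Human Connectome Project scans.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_subjects = 900            # roughly the size of the cohort in the study
n_regions = 50              # small parcellation, an assumption to keep the example light
n_edges = n_regions * (n_regions - 1) // 2   # unique connectivity values per subject

# Stand-in for per-subject connectivity vectors (flattened upper triangle
# of each subject's region-by-region correlation matrix).
connectivity = rng.standard_normal((n_subjects, n_edges))

# Stand-in for measured intelligence scores, weakly related to the features.
true_weights = rng.standard_normal(n_edges) * 0.05
iq_scores = 100 + connectivity @ true_weights + rng.normal(0, 10, n_subjects)

# Regularized linear model, scored with cross-validation so the result
# reflects prediction on held-out subjects rather than a fit to the sample.
model = RidgeCV(alphas=np.logspace(-2, 4, 13))
r2 = cross_val_score(model, connectivity, iq_scores, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.3f}")
```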

Read more

It’s amusing that these people know where this is headed, but aren’t interested enough to stop it.


Ray Dalio, the co-chief investment officer and co-chairman of Bridgewater Associates, shared his thoughts in a Facebook post on Thursday.

Dalio says he was responding to a question about whether machine intelligence will put enough people out of work that the government will have to pay people a cash handout to live on, a concept known as universal basic income.

“My view is that algorithmic/automated decision making is a two-edged sword that is improving total productivity but is also eliminating jobs, leading to big wealth and opportunity gaps and populism, and creating a national emergency,” he wrote.

Contrary to what Silicon Valley portrays, you’ll need more than drive and intelligence to land a high-paying job in the tech world. You’ll need to be well versed in one of the most popular and fastest growing programming languages: Python.


Python made its debut in 1990, and since then it’s been focused and refined by some of the brightest programmers in the industry. That’s resulted in its current status as a multi-faceted, yet beautifully simple language with a wide variety of applications, from interfacing with SQL databases to building websites.
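As a small taste of that range, here is a minimal, self-contained sketch touching both ends mentioned above: querying a SQL database and serving a web page, using only Python's standard library. The table contents, port and page are made up for illustration.

```python
# Minimal illustration of two common Python uses: SQL queries and a tiny web page.
# Standard library only; the table and page contents are invented for the example.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

# --- Interfacing with a SQL database (in-memory SQLite) ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE languages (name TEXT, year INTEGER)")
conn.execute("INSERT INTO languages VALUES ('Python', 1990)")  # year as cited above
rows = conn.execute("SELECT name, year FROM languages").fetchall()

# --- Serving a web page that shows the query result ---
class Page(BaseHTTPRequestHandler):
    def do_GET(self):
        body = ", ".join(f"{name} ({year})" for name, year in rows).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Visit http://localhost:8000 to see the database contents.
    HTTPServer(("localhost", 8000), Page).serve_forever()
```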

Read more

What if a large class of algorithms used today—from the algorithms that help us avoid traffic to the algorithms that identify new drug molecules—worked exponentially faster?

Computer scientists at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a completely new kind of algorithm, one that exponentially speeds up computation by dramatically reducing the number of parallel steps required to reach a solution.
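The announcement attributes the speedup to needing far fewer sequential steps. As a rough intuition for what "fewer sequential rounds" can mean, the toy sketch below contrasts classic greedy selection, which commits to one choice per round, with a batched variant whose per-round candidate scoring is independent (and hence could be farmed out in parallel) and which accepts several candidates per round. This is an illustration of the general idea only, not the SEAS algorithm.

```python
# Toy illustration of trading sequential rounds for parallel-friendly batches
# in a coverage-maximization problem. Not the SEAS algorithm; just the idea
# that scoring candidates independently lets each round's work run in parallel.
import random

random.seed(1)
universe = set(range(200))
sets = [set(random.sample(sorted(universe), 20)) for _ in range(300)]
k = 30  # number of sets we may pick

def greedy(sets, k):
    """Classic greedy: one pick per round -> k sequential rounds."""
    covered, chosen, rounds = set(), [], 0
    while len(chosen) < k:
        rounds += 1
        best = max(range(len(sets)),
                   key=lambda i: len(sets[i] - covered) if i not in chosen else -1)
        chosen.append(best)
        covered |= sets[best]
    return len(covered), rounds

def batched(sets, k, batch=5):
    """Accept up to `batch` high-gain sets per round; the per-candidate gain
    computations within a round are independent, hence parallelizable."""
    covered, chosen, rounds = set(), [], 0
    while len(chosen) < k:
        rounds += 1
        gains = {i: len(sets[i] - covered) for i in range(len(sets)) if i not in chosen}
        for i, _ in sorted(gains.items(), key=lambda kv: -kv[1])[:batch]:
            if len(chosen) < k:
                chosen.append(i)
                covered |= sets[i]
    return len(covered), rounds

print("greedy  (coverage, rounds):", greedy(sets, k))    # ~k rounds
print("batched (coverage, rounds):", batched(sets, k))   # ~k/batch rounds, similar coverage
```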

The researchers will present their novel approach at two upcoming conferences: the ACM Symposium on Theory of Computing (STOC), June 25–29, and the International Conference on Machine Learning (ICML), July 10–15.

Read more

Recommended Books ➤

📖 Life 3.0 — http://azon.ly/ij9u
📖 The Master Algorithm — http://azon.ly/excm
📖 Superintelligence — http://azon.ly/v8uf

This video is the twelfth and final in a multi-part series discussing computing. In this video, we’ll be discussing the future of computing, more specifically – the evolution of the field of computing and extrapolating forward based on topics we’ve discussed so far in this series!

[0:31–5:50] Starting off, we’ll discuss the three primary eras in the evolution of the field of computing since its inception: the tabulating, programming and cognitive eras.

We have a ‘thirst for knowledge’ but sometimes ‘ignorance is bliss’, so how do we choose between these two mind states at any given time?

UCL psychologists have discovered that our brains use the same algorithm and neural architecture to evaluate the opportunity to gain information as they do to evaluate rewards like food or money.

Funded by the Wellcome Trust, the research, published in the Proceedings of the National Academy of Sciences, also finds that people will spend money to both obtain advance knowledge of a good upcoming event and to remain ignorant of an upcoming bad event.
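The paper's claim is about shared neural machinery rather than a single formula, but the behavioural pattern described above (paying to learn about good news early, and paying to avoid learning about bad news) can be mimicked by a toy "common currency" valuation in which advance knowledge adds an anticipation term with the same sign as the expected outcome. The utility form, weights and probabilities below are invented for illustration and are not the model in the PNAS paper.

```python
# Toy "common currency" valuation: advance information about a future event is
# scored with the same kind of value signal as the event itself, via an
# anticipation term. All numbers and the utility form are illustrative only.

def value_of_knowing(prob, outcome_value, anticipation_weight=0.3):
    """Extra value (or cost) of learning the outcome now rather than later.

    Knowing early adds an anticipation term: savouring if the expected
    outcome is good, dread if it is bad. The expected value of the event
    itself is unchanged by knowing about it.
    """
    expected = prob * outcome_value
    return anticipation_weight * expected

good_news = value_of_knowing(prob=0.8, outcome_value=+100)   # e.g. a likely bonus
bad_news = value_of_knowing(prob=0.8, outcome_value=-100)    # e.g. a likely bad result

print(f"willing to pay up to {good_news:.0f} to find out early about the good event")
print(f"willing to pay up to {-bad_news:.0f} to remain ignorant of the bad event")
```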

Read more


This video is the eighth in a multi-part series discussing computing and the first discussing non-classical computing. In this video, we’ll be discussing what optical computing is and the impact it will have on the field of computing.

[0:27–6:03] Starting off, we’ll discuss what optical computing (photonic computing) is. More specifically, we’ll cover how this paradigm shift differs from typical classical, electron-based computers and the benefits it will bring to computational performance and efficiency!

[6:03–10:25] Following that, we’ll look at current optical computing initiatives, including optical co-processors, optical RAM, optoelectronic devices, silicon photonics and more!


Read more

  • There has been a 14X increase in the number of active AI startups since 2000. Crunchbase, VentureSource, and Sand Hill Econometrics were used to complete this analysis, with AI startups in Crunchbase cross-referenced against venture-backed companies in the VentureSource database; only companies appearing in both were included.

  • The share of jobs requiring AI skills has grown 4.5X since 2013. The growth of the share of US jobs requiring AI skills on the Indeed.com platform was calculated by first identifying AI-related jobs using titles and keywords in descriptions. Job growth is calculated as a multiple of the share of jobs on the Indeed platform that required AI skills in the U.S. starting in January 2013. The study also calculated the growth of the share of jobs requiring AI skills on the Indeed.com platform by country. Despite the rapid growth of the Canadian and U.K. AI job markets, Indeed.com reports they are still only 5% and 27%, respectively, of the absolute size of the US AI job market.

  • Machine Learning, Deep Learning and Natural Language Processing (NLP) are the three most in-demand skills on Monster.com. Just two years ago NLP was predicted to be the most in-demand skill for application developers creating new AI apps. Beyond the skills needed to create AI apps themselves, machine learning techniques, Python, Java, C++, experience with open-source development environments, Spark, MATLAB, and Hadoop are the most in-demand. Based on an analysis of Monster.com entries at the time of writing, the median U.S. salary is $127,000 for Data Scientists, Senior Data Scientists, Artificial Intelligence Consultants and Machine Learning Managers.

  • Error rates for image labeling have fallen from 28.5% to below 2.5% since 2010. AI’s inflection point for the object-detection task of the Large Scale Visual Recognition Challenge (LSVRC) occurred in 2014. On this specific test, AI is now more accurate than humans. These findings come from the leaderboards for each LSVRC competition hosted on the ImageNet website.

  • Global revenues from AI for enterprise applications are projected to grow from $1.62B in 2018 to $31.2B in 2025, a 52.59% CAGR over the forecast period (a quick arithmetic check of that figure appears just after this list). Image recognition and tagging, patient data processing, localization and mapping, predictive maintenance, the use of algorithms and machine learning to predict and thwart security threats, and intelligent recruitment and HR systems are a few of the many enterprise application use cases predicted to fuel this rapid growth of AI in the enterprise. Source: Statista.

  • 84% of enterprises believe investing in AI will lead to greater competitive advantages. 75% believe that AI will open up new businesses while also providing competitors new ways to gain access to their markets. 63% believe the pressure to reduce costs will require the use of AI. Source: Statista.

  • 87% of current AI adopters said they were using or considering using AI for sales forecasting and for improving e-mail marketing. 61% of all respondents said that they currently used or were planning to use AI for sales forecasting. The following graphic compares adoption rates of current AI adopters versus all respondents. Source: Statista.
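Those projections are easy to sanity-check. Growing from $1.62B in 2018 to $31.2B in 2025 spans seven years of compounding, and the implied compound annual growth rate works out to almost exactly the quoted 52.59%:

```python
# Sanity check of the quoted CAGR for enterprise AI revenue
# ($1.62B in 2018 to $31.2B in 2025, i.e. 7 years of compounding).
start, end, years = 1.62, 31.2, 2025 - 2018
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.2%}")   # ~52.6%, matching the quoted 52.59%
```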

Read more

A Nature article presents an AI developed by Google’s Medical Brain team which outperforms hospitals’ own warning systems in predicting the risk of death among hospital patients.


Google’s Medical Brain team is now training its AI to predict the death risk among hospital patients — and its early results show it has slightly higher accuracy than a hospital’s own warning system.

Bloomberg describes the healthcare potential of the Medical Brain’s findings, including its ability to use previously unusable information in order to reach its predictions. The AI, once fed this data, made predictions about the likelihood of death, discharge, and readmission.
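The Nature paper describes deep models trained on complete electronic health records; purely to make the underlying task concrete (turning patient features into a probability of an outcome such as death, discharge or readmission), here is a minimal logistic-regression sketch on synthetic data. The features, labels and model are placeholders, not Google's architecture or dataset.

```python
# Minimal illustration of the prediction task only (not Google's model):
# estimate the probability of an outcome such as in-hospital death from
# a handful of synthetic patient features using logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000

# Made-up features: age, prior admissions, abnormal-lab count, ICU flag.
X = np.column_stack([
    rng.normal(65, 15, n),        # age
    rng.poisson(1.5, n),          # prior admissions
    rng.poisson(3.0, n),          # abnormal lab results
    rng.integers(0, 2, n),        # currently in ICU
])

# Synthetic outcome whose risk rises with the features above.
logits = -6 + 0.04 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + 1.0 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]   # predicted probability of the outcome
print(f"test AUROC: {roc_auc_score(y_test, risk):.2f}")
```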

In a paper published in Nature in May, Google’s team describes its predictive algorithm.

It’s not unrealistic to think that 80% of what doctors do will be replaced by algorithms and artificial intelligence. The idea, evangelized by venture capitalist Vinod Khosla two years ago, is that machines can more accurately diagnose us — and that will reduce deadly medical errors and free doctors up to do other things.

The bottom line: We’re getting closer to this reality. Algorithms, for example, can already diagnose diseases from imaging scans better than human radiologists. Computers could possibly take over the entire radiology specialty.


Read more