
In this video, we’ll be discussing big data: what big data is, the exponential rate at which data is growing, how we can utilize the vast quantities of data being generated, and the implications of linked data on big data.

[0:30–7:50] — Starting off, we’ll look at how data has been used as a tool from the origins of human evolution, starting at the hunter-gatherer age and leading up to the present information age. Afterwards, we’ll look into the many statistics demonstrating the exponential rate at which data has grown and is projected to keep growing.

[7:50–18:55] — Following that, we’ll discuss what exactly big data is, delve deeper into the two types of data, structured and unstructured, and see how each is analyzed both by humans and by machine learning (AI); a brief illustrative sketch follows below.
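To make the structured/unstructured distinction concrete, here is a minimal Python sketch. The sample records and the crude keyword tally are assumptions for illustration, not material from the video: structured data can be queried directly, while unstructured text must first be interpreted, which is why machine learning becomes necessary at scale.

```python
import csv
import io
from collections import Counter

# Structured data: fixed fields with known types; it can be queried directly.
structured = io.StringIO("user_id,age,country\n1,34,US\n2,28,DE\n3,45,JP\n")
rows = list(csv.DictReader(structured))
average_age = sum(int(row["age"]) for row in rows) / len(rows)
print(f"Average age (direct query on structured data): {average_age:.1f}")

# Unstructured data: free text with no schema; a machine must infer meaning.
# A crude keyword tally stands in here for the ML/NLP models used at scale.
reviews = [
    "Loved the product, fast shipping!",
    "Terrible support, the product arrived broken.",
]
words = Counter(
    word.strip(",.!").lower() for text in reviews for word in text.split()
)
print(words.most_common(3))
```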


https://youtu.be/TbjQGooiJNs

James Hughes: “Great convo with Yuval Harari, touching on algorithmic governance, the perils of being a big thinker when democracy is under attack, the need for transnational governance, the threats of automation to the developing world, the practical details of UBI, and a lot more.”


In this episode of the Waking Up podcast, Sam Harris speaks with Yuval Noah Harari about his new book 21 Lessons for the 21st Century. They discuss the importance of meditation for his intellectual life, the primacy of stories, the need to revise our fundamental assumptions about human civilization, the threats to liberal democracy, a world without work, universal basic income, the virtues of nationalism, the implications of AI and automation, and other topics.

Yuval Noah Harari has a PhD in History from the University of Oxford and lectures at the Hebrew University of Jerusalem, specializing in world history. His books have been translated into 50+ languages, with 12+ million copies sold worldwide. Sapiens: A Brief History of Humankind looked deep into our past, Homo Deus: A Brief History of Tomorrow considered far-future scenarios, and 21 Lessons for the 21st Century focuses on the biggest questions of the present moment.

This video is the second in a two-part series discussing big data. In this video, we’ll be discussing how we can utilize the vast quantities of data being generated as well as the implications of linked data on big data.

[0:33–4:43] — Starting off, we’ll look at what exactly big data is, delve deeper into the two types of data, structured and unstructured, and see how each is analyzed both by humans and by machine learning (AI).

[4:43–11:37] — Following that, we’ll discuss how this data will be put to use, then turn to the next evolution of data, linked data, and how it will change the world and the web! (A short linked-data sketch follows this list.)

[11:37–12:37] — To conclude, we’ll briefly survey the role cloud computing will play in big data!
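As a concrete illustration of the linked-data idea mentioned above, the sketch below stores facts as subject-predicate-object triples using the Python rdflib library. The example.org entities and predicate names are illustrative assumptions, not a real published vocabulary.

```python
# A minimal sketch of linked data: facts stored as subject-predicate-object
# triples that machines can traverse across datasets. All example.org URIs
# below are illustrative assumptions, not a real vocabulary.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

g.add((EX.YuvalNoahHarari, EX.authored, EX.Sapiens))  # one fact (triple)
g.add((EX.Sapiens, EX.hasTitle, Literal("Sapiens: A Brief History of Humankind")))
g.add((EX.Sapiens, EX.hasTopic, EX.WorldHistory))

# Because every node is a URI, a different dataset can point at the same
# EX.Sapiens node, and queries can follow links across data sources.
print(g.serialize(format="turtle"))
```

Serializing to Turtle shows why linked data is web-native: each resource is addressable by URI, so datasets published by different parties can reference one another directly.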

Recently, there has been an explosion of interest in applying artificial intelligence (AI) to medicine. Whether explicitly or implicitly, much of this interest has centered on using AI to automate decision-making tasks currently done by physicians. This includes two seminal papers in the Journal of the American Medical Association demonstrating that AI-based algorithms achieve accuracy similar to or higher than that of physicians: one in the diagnostic assessment of metastatic breast cancer, compared against pathologists, and the other in detecting diabetic retinopathy, compared against ophthalmologists.

While promising, these applications of AI in medicine raise a number of novel regulatory and policy issues around efficacy, safety, health workforce, and payment. They have also triggered concerns from the medical and patient communities about AI replacing doctors. And, except in narrow domains of practice, general AI systems may fall far short of the hype.

We posit that applications of AI that “augment” physicians may be more realistic and broader reaching than those that purport to replace existing health care services. In particular, with the right support from policy makers, physicians, patients, and the technology community, we see opportunities for AI to be a solution for, rather than a contributor to, physician burnout, and to help achieve the quadruple aim of improving health, enhancing the experience of care, reducing cost, and attaining joy in work for health professionals.


This video is the first in a two-part series discussing big data. In this video, we’ll be discussing the importance of data and the role it has played in advancing humankind, as well as the exponential rate of growth of data.

[0:29–4:19] — Starting off, we’ll look at how data has been used as a tool from the origins of human evolution, starting at the hunter-gatherer age and leading up to the present information age.

[4:19–7:48] — Following that, we’ll discuss the many statistics demonstrating the exponential rate at which data has grown and is projected to keep growing.


Since its acquisition of ITA Software eight years ago, Google has been quietly rolling out new tools for travelers. Its progress has been even more notable over recent months and weeks, as it has begun unveiling tools to help predict flight delays, plan trips, and manage itineraries, among other things.

These changes have some wondering: is Google making a run at total domination of the travel space? If it is, there’s a strong case to be made for its potential to disrupt the travel and hospitality sector with an approach similar to Amazon’s run at retail and, more recently, grocery.


Realistic climate simulations require huge reserves of computational power. An LMU study now shows that new algorithms allow interactions in the atmosphere to be modeled more rapidly without loss of reliability.

Forecasting global and local climates requires the construction and testing of mathematical models. Since such models must incorporate a plethora of physical processes and interactions, climate simulations require enormous amounts of computing power. And even the best models inevitably have limitations, since the phenomena involved can never be modeled in sufficient detail. In a project carried out in the context of the DFG-funded Collaborative Research Center “Waves to Weather”, Stephan Rasp of the Institute of Theoretical Meteorology at LMU (Director: Professor George Craig) has now looked at the question of whether the application of machine learning can improve the efficacy of climate modelling. The study, which was performed in collaboration with Professor Mike Pritchard of the University of California, Irvine, and Pierre Gentine of Columbia University in New York, appears in the journal PNAS.

General circulation models typically simulate the global behavior of the atmosphere on grids whose cells have dimensions of around 50 km. Even using state-of-the-art supercomputers, the relevant processes that take place in the atmosphere are simply too complex to be modelled at the necessary level of detail. One prominent example concerns the modelling of clouds, which have a crucial influence on climate: they transport heat and moisture, produce precipitation, and absorb and reflect solar radiation, for instance. Many clouds extend over distances of only a few hundred meters, much smaller than the grid cells typically used in simulations, and they are highly dynamic. Both features make them extremely difficult to model realistically. Hence today’s models lack at least one vital ingredient and, in this respect, provide only an approximate description of the Earth system.
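The idea behind the study, replacing an expensive subgrid computation with a trained statistical emulator, can be sketched in a few lines. The following Python snippet is a toy illustration under stated assumptions (entirely synthetic data and an invented target function), not the authors' code:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Toy stand-in for the coarse-grid state of one atmospheric column:
# temperature (K) and specific humidity (kg/kg). Entirely synthetic.
n_columns = 5000
temperature = rng.uniform(250.0, 310.0, n_columns)
humidity = rng.uniform(0.0, 0.02, n_columns)
X = np.column_stack([temperature, humidity])

# Toy stand-in for the expensive high-resolution result the emulator should
# reproduce (e.g. subgrid heating from clouds); an invented function plus noise.
y = np.tanh(50.0 * humidity) * (temperature - 273.15) + rng.normal(0.0, 0.1, n_columns)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small neural network: once trained on high-resolution output, evaluating it
# is far cheaper than rerunning the detailed cloud-resolving computation.
emulator = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
emulator.fit(X_train, y_train)

print(f"Emulator R^2 on held-out columns: {emulator.score(X_test, y_test):.3f}")
```

The design point is that once the network has been fitted to output from a detailed high-resolution model, evaluating it inside the coarse climate model costs a tiny fraction of rerunning the detailed computation.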
