As conventional storage technologies struggle to keep up with big data, interest grows in a biological alternative.
Big Tech Has Become Way Too Powerful
Posted in business
I wanted to share this article because I am hearing this brought up a lot lately, given the current political climate in the US. And if we end up with a Trump White House, what could that mean for big tech?
We’re not creating the new businesses we should be, and these giants have to be broken up.
Photos from Simon Waslander’s post
Posted in futurism
Random Idea on Inequality and an Attempt to Fix it:
**A one-time mandatory 50% Giving-Pledge commitment by the world's billionaires (while they are still alive).**
The massive assets collected through this one-time pledge should then be managed by an extremely broad team: multi-ethnic, multi-academic, gender diverse, and with members from all ranks of society.
How the assets attained through this mandatory Giving Pledge will be used will be decided in part by this broad team. Every step and decision the team makes will be open-sourced on the Internet 24/7.
Hopefully we can then take a step toward making the world a better place for us all and our environment.
Sharing is Caring.
Nice.
Richard Feynman suggested that it takes a quantum computer to simulate large quantum systems, but a new study shows that a classical computer can work when the system has loss and noise.
The field of quantum computing originated with a question posed by Richard Feynman. He asked whether it was feasible to simulate the behavior of quantum systems using a classical computer, suggesting that a quantum computer would be required instead [1]. Saleh Rahimi-Keshari from the University of Queensland, Australia, and colleagues [2] have now demonstrated that a quantum process believed to require an exponentially large number of steps to simulate on a classical computer can in fact be simulated efficiently if the system in which the process occurs has sufficiently large loss and noise.
The quantum process considered by Rahimi-Keshari et al. is known as boson sampling, in which the probability distribution of photons (bosons) that undergo a linear optical process [3] is measured, or sampled. In experiments of this kind [4, 5], N single photons are sent into a large network of beam splitters (half-silvered mirrors) and combined before exiting through M possible output channels. Calculating the probability distribution for finding the photons in each of the M output channels is equivalent to calculating the permanent of a matrix. The permanent is computed like the more familiar determinant but with all of the minus signs replaced by plus signs.
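For a concrete sense of the difficulty, here is a minimal brute-force permanent calculation in Python (our sketch, not from the paper): it sums one product per permutation of the columns, so the cost grows as N! and quickly becomes intractable as the photon number rises.

```python
from itertools import permutations
from math import prod

def permanent(matrix):
    """perm(A) = sum over permutations sigma of prod_i A[i][sigma[i]].

    Identical to the Leibniz formula for the determinant, minus the
    alternating signs. This naive version visits all n! permutations;
    Ryser's formula improves that to O(2^n * n) but stays exponential,
    which is what makes exact boson sampling classically hard.
    """
    n = len(matrix)
    return sum(
        prod(matrix[i][sigma[i]] for i in range(n))
        for sigma in permutations(range(n))
    )

# For [[a, b], [c, d]] the determinant is a*d - b*c;
# the permanent is a*d + b*c, the same sum with the minus sign flipped.
print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```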
A plethora of data from sensor networks, Internet of Things devices, and big data resources, combined with a dearth of data scientists to effectively mold that data, means we are leaving many important applications – from intelligence to science and workforce management – on the table.
It is a situation the researchers at DARPA want to remedy with a new program called Data-Driven Discovery of Models (D3M). The goal of D3M is to develop algorithms and software that help close the data-science expertise gap by enabling non-experts to construct complex empirical models, automating large parts of the model-creation process. If successful, researchers using D3M tools will effectively have access to an army of “virtual data scientists,” DARPA stated.
This army of virtual data scientists is needed because some experts project a shortfall of 140,000 to 190,000 data scientists worldwide in 2016 alone, with increasing shortfalls in coming years. Also, because the process of building empirical models is so manual, their relative sophistication and value are often limited, DARPA stated.
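To illustrate the kind of automation D3M envisions, here is a hypothetical sketch (ours, not DARPA's software) in which a search over a few candidate model families with plain scikit-learn stands in for the automated parts of model creation, so a non-expert need only supply the data. The model choices and parameter grids are arbitrary picks for the example.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

# Stand-in dataset; in the D3M vision the user would bring their own.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate model families and hyperparameters to search over,
# playing the role of the automated model-creation step.
candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [50, 200]}),
]

best_score, best_model = 0.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5)
    search.fit(X_train, y_train)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(best_model)
print("held-out accuracy:", best_model.score(X_test, y_test))
```

Even this toy version shows the trade-off the article describes: the search loop replaces manual trial and error, but the quality of the result still depends on how rich the candidate space is.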