After Spencer Kimball left Google, he found himself missing some of the custom-built software the company uses internally. So he and a bunch of fellow ex-Googlers started building their own. And now they want to make it available to everyone to power the next Google or Facebook.

Specifically, Kimball wanted something like Google’s database system Spanner. Spanner is designed to juggle data across potentially millions of database servers, keeping Google’s services online even if several servers, or an entire datacenter, go offline. While few companies need to operate at quite Google’s scale, the ability to stay online even when many systems fail, and to automatically balance resources between servers, would be useful to many other companies. Read more
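The rebalancing idea can be sketched with consistent hashing: keys map onto a ring of nodes, and adding or removing a node only moves the keys that node owned. This is not Spanner’s actual mechanism (Spanner splits key ranges and replicates them with Paxos); the class and names below are purely illustrative.

```python
import bisect
import hashlib

def _hash(key: str) -> int:
    # Stable 64-bit hash derived from SHA-256.
    return int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")

class ConsistentHashRing:
    """Maps keys to nodes; removing a node only remaps the keys it
    owned, which is the core idea behind automatic rebalancing."""

    def __init__(self, nodes=(), vnodes=64):
        self.vnodes = vnodes          # virtual nodes smooth the distribution
        self._ring = []               # sorted list of (hash, node)
        for n in nodes:
            self.add_node(n)

    def add_node(self, node: str):
        for i in range(self.vnodes):
            bisect.insort(self._ring, (_hash(f"{node}#{i}"), node))

    def remove_node(self, node: str):
        self._ring = [(h, n) for h, n in self._ring if n != node]

    def node_for(self, key: str) -> str:
        # First virtual node clockwise from the key's hash owns the key.
        idx = bisect.bisect(self._ring, (_hash(key), ""))
        if idx == len(self._ring):
            idx = 0
        return self._ring[idx][1]
```

Removing a node here leaves every other key where it was, so only a fraction of the data has to move during a failure or a rebalance.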

So it was great to get back to New York and be able to report on what is called the “New NY Broadband Program.” It involves a $500 million expenditure to help ensure that New Yorkers across the state have access to current-generation Internet capacity. There’s lots of potential in the plan, targeted at providing every New Yorker with access to 100 megabit per second (Mbps) service (10 Mbps uploads) by the end of 2018. Because New York expects a 1:1 match from the private sector for each grant or loan it makes, the state hopes to see at least $1 billion deployed on high-speed Internet access infrastructure.

Read more

The National Security Agency knows Edward Snowden disclosed many of its innermost secrets when he revealed how aggressive its surveillance tactics are. What it doesn’t know is just how much information the whistleblower took with him when he left.

For all of its ability to track our telecommunications, the NSA seemingly has little clue exactly what documents, or even how many documents, Snowden gave to the media. Like most large organizations, the NSA had tools in place to track who accessed what data and when. But Snowden, a system administrator, apparently was able to cover his tracks by deleting or modifying the log files that tracked that access. Read more

“We are interested in now, most of us,” says Robert Grass, a researcher in chemistry at ETH Zurich. “We buy our furniture in Ikea. We don’t care if in 10 years it falls apart. With information it is similar. We don’t think into the future.”

But Grass isn’t like most of us. His team, which is exploring how to use DNA as a data storage mechanism, is one of several academic and commercial entities grappling with the challenge of protecting data against the elements over time spans stretching out to millions of years. Read more
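The basic encoding step is easy to illustrate: two bits map to one of the four nucleotides. This simple mapping is a textbook scheme, not the one Grass’s team uses; practical DNA storage adds error-correcting codes and avoids sequences (such as long homopolymer runs) that are hard to synthesize and read back.

```python
# Illustrative 2-bits-per-base mapping only; real DNA-storage codecs
# layer error correction on top of this.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: s for s, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    # Each byte becomes 8 bits, i.e. 4 nucleotides.
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

At this density a byte costs four bases, which is why DNA’s physical information density, and its demonstrated shelf life, make it attractive for archival rather than working storage.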

Article: Harnessing “Black Holes”: The Large Hadron Collider – Ultimate Weapon of Mass Destruction

Why the LHC must be shut down

CERN-Critics: LHC restart is a sad day for science and humanity!

PRESS RELEASE “LHC-KRITIK”/”LHC-CRITIQUE” www.lhc-concern.info
These days, CERN has restarted the world’s biggest particle collider, the LHC, the so-called “Big Bang Machine.” After an upgrade costing hundreds of millions of euros, CERN plans to smash particles at double the previous energies. This poses small (one would hope) but fundamentally unpredictable catastrophic risks to planet Earth.
Essentially the same group of critics, including professors and doctors, that previously filed lawsuits against CERN in the US and Europe still opposes the restart, for essentially the same reasons. The dangers of (“micro”) black holes, strangelets, vacuum bubbles, and so on remain, and may forever remain, under discussion. In the meantime, CERN has made no specific improvements to the safety assessment of the LHC. There is still no proper, genuinely independent risk assessment (the “LSAG report” was produced by CERN itself), and the science of risk research is still not involved in the issue. This is a scientific and political scandal, and that is why the restart is a sad day for science and humanity.
The scientific network “LHC-Critique” calls for an end to all public sponsorship of gigantic particle colliders.
Just to demonstrate how speculative this research is: even CERN has to admit that the so-called “Higgs boson” was discovered only “probably.” Very probably, mankind will never find any use for the Higgs boson. (We are not talking here about the use of collider technology in medicine.) Comprehending the Big Bang one day could be a minor, if very improbable, advantage for mankind. But it would surely be fatal, as the Atomic Age has already demonstrated, to learn how to handle this or other extreme phenomena in the universe.
Within the next billions of years, mankind will have enough problems without CERN.
Sources:

- A new paper by our partner “Heavy Ion Alert” will be published soon: http://www.heavyionalert.org/

- Background documents provided by our partner “LHC Safety Review”: http://www.lhcsafetyreview.org/

- Press release by our partner “Risk Evaluation Forum” emphasizing renewed particle collider risk: http://www.risk-evaluation-forum.org/newsbg.pdf

- Study concluding that “mini black holes” could be created at planned LHC energies: http://phys.org/news/2015-03-mini-black-holes-lhc-parallel.html

- New paper by Dr. Thomas B. Kerwick on the lacking safety argument by CERN: http://vixra.org/abs/1503.0066

- More info at the LHC-Kritik/LHC-Critique website: www.LHC-concern.info
Best regards:
LHC-Kritik/LHC-Critique

Quoted: “Ethereum’s developers believe their project will lead to the proliferation of programs they call “smart contracts,” in which the terms of an agreement are written in code and enforced by software. These smart contracts could carry out the instructions of a complex algorithm based on data feed—such as a stock ticker. They could facilitate practically any financial transaction, such as holding money in escrow or dispersing micropayments among autonomous machines. They could be used to create a peer-to-peer gambling network, a peer-to-peer stock trading platform, a peer-to-peer social network, a prenuptial agreement, a will, a standard agreement to split a dinner check, or a public registry for keeping track of who owns what land in a city.

Gupta predicts that these smart contracts will be so cheap and versatile that they’ll do “a lot of things that today we do informally,” and take on a lot of the “donkey work of running a society.””
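The escrow example from the quote can be sketched as a small state machine. Real Ethereum contracts are written in on-chain languages such as Solidity and enforced by the network, not by a trusted process; this Python class is only a toy illustration of the logic a smart escrow contract would encode, and all names in it are hypothetical.

```python
# Toy escrow: funds are "locked" at creation and released only once
# both parties have approved. Illustrative only; not an on-chain contract.
class Escrow:
    def __init__(self, payer: str, payee: str, amount: int):
        self.payer, self.payee, self.amount = payer, payee, amount
        self.approvals = set()
        self.released = False

    def approve(self, party: str):
        # Only the two named parties may sign off.
        if party not in (self.payer, self.payee):
            raise ValueError("unknown party")
        self.approvals.add(party)

    def release(self) -> int:
        # Pays out exactly once, and only with both approvals.
        if self.approvals == {self.payer, self.payee} and not self.released:
            self.released = True
            return self.amount
        return 0
```

The point of putting such rules on a blockchain is that neither party (nor any intermediary) can alter them after the fact, which is what lets the code itself stand in for the informal agreements Gupta describes.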

Read the article here > http://reason.com/blog/2015/03/19/here-comes-ethereum-an-information-techn

Quoted: “The decentralized Sapience AIFX project has developed a distributed artificial intelligence system running on a cryptocurrency network. In addition, the project has implemented the first distributed database platform running entirely over the bitcoin peer-to-peer protocol, built on top of a distributed hash table with redundancy, resiliency, and multi-dimensional trie-based indexing. These technologies are the first core pieces in the Sapience AIFX platform strategy to be the market leader in the consumerization of the blockchain.

The project has implemented the first in-wallet interactive Lua shell, bringing developers unprecedented capabilities to build solutions leveraging the blockchain, multi-layer perceptron networks, and distributed data storage. The possibilities span from algorithmic trading tools to bioinformatics and data mining, and the traditional applications of deep learning.”
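The release gives no implementation details of its “multi-dimensional trie-based indexing,” so the following is only a single-node, single-dimension sketch of what trie indexing means: keys share structure with their prefixes, so exact lookup and prefix scans are both cheap.

```python
# Minimal prefix trie: illustrative of trie-based indexing in general,
# not of the (distributed, multi-dimensional) Sapience implementation.
class Trie:
    def __init__(self):
        self.children = {}   # next character -> child node
        self.value = None    # payload, if a key ends at this node

    def insert(self, key: str, value):
        node = self
        for ch in key:
            node = node.children.setdefault(ch, Trie())
        node.value = value

    def get(self, key: str):
        node = self
        for ch in key:
            node = node.children.get(ch)
            if node is None:
                return None
        return node.value

    def keys_with_prefix(self, prefix: str):
        # Walk to the prefix node, then collect every key beneath it.
        node = self
        for ch in prefix:
            node = node.children.get(ch)
            if node is None:
                return []
        out, stack = [], [(node, prefix)]
        while stack:
            n, p = stack.pop()
            if n.value is not None:
                out.append(p)
            for ch, child in n.children.items():
                stack.append((child, p + ch))
        return sorted(out)
```

In a distributed setting the subtrees would be spread across peers (for instance over a distributed hash table, as the release describes), but the lookup logic per node stays this simple.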

Read more here > http://www.pressreleaserocket.net/first-cryptocurrency-to-ut…in/104609/