Blog

Archive for the ‘information science’ category: Page 83

Oct 31, 2012

FuturICT Vision for the Social Sciences, ICT & Complexity Science

Posted by in categories: futurism, information science

FuturICT has submitted its proposal to the FET Flagship Programme, an initiative that aims to facilitate breakthroughs in information technology. The vision of FuturICT is to

integrate the fields of information and communication technologies (ICT), social sciences and complexity science, to develop a new kind of participatory science and technology that will help us to understand, explore and manage the complex, global, socially interactive systems that make up our world today, while at the same time paving the way for a new paradigm of ICT systems that will leverage socio-inspired self-organisation, self-regulation, and collective awareness.

The project could provide us with profound insights into societal behaviour and improve policymaking. It echoes the Large Hadron Collider at CERN in its scope and vision, only here we are trying to understand the state of the world. FuturICT combines the creation of a ‘Planetary Nervous System’ (PNS), where Big Data will be collated and organised, a ‘Living Earth Simulator’ (LES), and a ‘Global Participatory Platform’ (GPP). The LES will simulate the data and provide models for analysis, while the GPP will make the data, models and methods available to everyone. People will be able to collaborate and research in a very different way. The availability of Big Data to participants will strengthen our ability to understand complex socio-economic systems, and it could help build a new dialogue between nations on how we solve complex global societal challenges.

FuturICT aims to develop a ‘Global Systems Science’, which will

Continue reading “FuturICT Vision for the Social Sciences, ICT & Complexity Science” »

Oct 27, 2012

Today, a Young Man on Acid Realized that all Matter is Merely Energy Condensed to a…

Posted by in categories: biological, complex systems, cosmology, engineering, existential risks, homo sapiens, human trajectories, humor, information science, particle physics, philosophy, physics


…here’s Tom with the Weather.
That right there is comedian/philosopher Bill Hicks, sadly no longer with us. One imagines he would be pleased and completely unsurprised to learn that serious scientific minds are considering and actually finding support for the theory that our reality could be a kind of simulation. That means, for example, a string of daisy-chained IBM Super-Deep-Blue Gene Quantum Watson computers from 2042 could be running a History of the Universe program, and depending on your solipsistic preferences, either you are or we are the character(s).

It’s been in the news a lot of late, but — no way, right?

Because dude, I’m totally real
Despite being utterly unable to even begin thinking about how to consider what real even means, the everyday average rational person would probably assign this to the sovereign realm of unemployable philosophy majors, or file it under the Whatever, Who Cares? or Oh, That’s Interesting, I Gotta Go Now! categories. Okay fine, but on the other side of the intellectual coin, vis-à-vis recent technological advancement, it’s actually being seriously considered of late by serious people using big words they’ve learned at endless college whilst collecting letters after their names and doin’ research and writin’ and gettin’ association memberships and such.

So… why now?

Continue reading “Today, a Young Man on Acid Realized that all Matter is Merely Energy Condensed to a...” »

Oct 23, 2012

The Witch-Hunt of Geophysicists: Society returns to the Dark Ages

Posted by in categories: education, ethics, events, geopolitics, information science, physics

I cannot let the day pass without contributing a comment on the incredible ruling of multiple manslaughter against six top Italian geophysicists for not predicting an earthquake that left 309 people dead in 2009. When those who are entrusted with safeguarding humanity (albeit at a local level in this case) are subjected to persecution when they fail to do so, despite acting to the best of their abilities in an inexact science, we have surely returned to the dark ages, where those who practice science are demonized by those who misunderstand it.

http://www.aljazeera.com/news/europe/2012/10/20121022151851442575.html

I hope I do not misrepresent other members of staff here at the Lifeboat Foundation in speaking on its behalf to wish these scientists a successful appeal against a court ruling that has shocked the scientific community. I stand behind the 5,000 members of the scientific community who sent an open letter to Italy’s President Giorgio Napolitano denouncing the trial. This court ruling was ape-mentality at its worst.

Oct 6, 2012

The decaying web and our disappearing history

Posted by in categories: information science, media & arts, philosophy

On January 28, 2011, three days into the fierce protests that would eventually oust the Egyptian president Hosni Mubarak, a Twitter user called Farrah posted a link to a picture that supposedly showed an armed man as he ran on a “rooftop during clashes between police and protesters in Suez”. I say supposedly, because both the tweet and the picture it linked to no longer exist. Instead they have been replaced with error messages that claim the message – and its contents – “doesn’t exist”.

Few things are more explicitly ephemeral than a Tweet. Yet it’s precisely this kind of ephemeral communication – a comment, a status update, sharing or disseminating a piece of media – that lies at the heart of much of modern history as it unfolds. It’s also a vital contemporary historical record that, unless we’re careful, we risk losing almost before we’ve been able to gauge its importance.

Consider a study published this September by Hany M. SalahEldeen and Michael L. Nelson, two computer scientists at Old Dominion University. Snappily titled “Losing My Revolution: How Many Resources Shared on Social Media Have Been Lost?”, the paper took six seminal news events from the last few years – the H1N1 virus outbreak, Michael Jackson’s death, the Iranian elections and protests, Barack Obama’s Nobel Peace Prize, the Egyptian revolution, and the Syrian uprising – and established a representative sample of tweets from Twitter’s entire corpus discussing each event specifically.

It then analysed the resources these tweets linked to, and whether those resources were still accessible, had been preserved in a digital archive, or had ceased to exist. The findings were striking: one year after an event, on average, about 11% of the online content referenced by social media had been lost and just 20% archived. Equally striking is the steady continuation of this trend over time: after two and a half years, 27% had been lost and 41% archived.
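The two reported samples (11% lost after one year, 27% after two and a half) suggest a roughly linear decay of about 10–11 percentage points per year. A back-of-the-envelope sketch of that trend, assuming the rate holds and using function names of my own invention, not the paper's:

```python
def linear_loss_model(p1, p2):
    """Fit pct_lost = a * years + b through two (years, pct_lost) samples."""
    (x1, y1), (x2, y2) = p1, p2
    a = (y2 - y1) / (x2 - x1)          # loss rate, percentage points per year
    b = y1 - a * x1
    return lambda years: a * years + b

# The study's reported figures: ~11% lost after 1 year, ~27% after 2.5 years.
pct_lost = linear_loss_model((1.0, 11.0), (2.5, 27.0))
print(round(pct_lost(2.0), 1))   # roughly 21.7% lost after two years
```

Extrapolating a two-point linear fit is crude, of course; the real curve need not stay linear, and the paper itself only reports the sampled intervals.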

Continue reading “The decaying web and our disappearing history”

Oct 5, 2012

Want to Get 70 Billion Copies of Your Book In Print? Print It In DNA

Posted by in categories: biological, biotech/medical, chemistry, futurism, information science, media & arts

I have been meaning to read a book coming out soon called Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves. It’s written by Harvard biologist George Church and science writer Ed Regis. Church is doing stunning work on a number of fronts, from creating synthetic microbes to sequencing human genomes, so I definitely am interested in what he has to say. I don’t know how many other people will be, so I have no idea how well the book will do. But in a tour de force of biochemical publishing, he has created 70 billion copies. Instead of paper and ink, or PDFs and pixels, he’s used DNA.

Much as PDFs are built on a digital system of 1s and 0s, DNA is a string of nucleotides, each of which can be one of four types. Church and his colleagues turned his whole book, including illustrations, into a 5.27-megabit file, which they then translated into a sequence of DNA. They stored the DNA on a chip and then sequenced it to read the text back. The book is broken up into little chunks of DNA, each of which carries a portion of the book itself as well as an address indicating where it belongs. They recovered the book with only 10 wrong bits out of 5.27 million. Using standard DNA-copying methods, they duplicated the DNA into 70 billion copies.
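The encoding is simple in principle: one bit per base. Here is a minimal sketch of the idea, not Church's actual pipeline (his scheme maps 0 to A or C and 1 to G or T, alternating to avoid long homopolymer runs; the two-letter mapping and the chunk size below are my simplifications):

```python
def encode_bits(data: bytes) -> str:
    """One bit per base: 0 -> A, 1 -> G (a simplification of the real scheme,
    which alternates A/C for 0 and G/T for 1 to avoid homopolymer runs)."""
    bases = []
    for byte in data:
        for i in range(7, -1, -1):
            bases.append('G' if (byte >> i) & 1 else 'A')
    return ''.join(bases)

def decode_bits(strand: str) -> bytes:
    """Reverse mapping: read the strand back eight bases (one byte) at a time."""
    out = bytearray()
    for i in range(0, len(strand), 8):
        byte = 0
        for base in strand[i:i + 8]:
            byte = (byte << 1) | (1 if base in 'GT' else 0)
        out.append(byte)
    return bytes(out)

def chunk_with_addresses(strand: str, payload: int = 96):
    """Split into addressed fragments: each carries data plus an index saying
    where it belongs, so the whole can be reassembled after sequencing."""
    return [(i // payload, strand[i:i + payload])
            for i in range(0, len(strand), payload)]

strand = encode_bits(b"Regenesis")
fragments = chunk_with_addresses(strand, payload=24)
reassembled = ''.join(chunk for _, chunk in sorted(fragments))
print(decode_bits(reassembled))   # b'Regenesis'
```

The addressing is what makes the medium practical: sequencers read short fragments in no particular order, and the index on each chunk lets you sort them back into a book.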

Scientists have stored little pieces of information in DNA before, but Church’s book is about 1,000 times bigger. I doubt anyone would buy a DNA edition of Regenesis on Amazon, since they’d need some expensive equipment and a lot of time to translate it into a format our brains can comprehend. But the costs are crashing, and DNA is a far more stable medium than that hard drive on your desk that you’re waiting to die. In fact, Regenesis could endure for centuries in its genetic form. Perhaps librarians of the future will need to get a degree in biology…

Link to Church’s paper

Source

Oct 2, 2012

In Conversation with Albert-László Barabási on Thinking in Network Terms

Posted by in categories: complex systems, information science

One question that fascinated me in the last two years is, can we ever use data to control systems? Could we go as far as, not only describe and quantify and mathematically formulate and perhaps predict the behavior of a system, but could you use this knowledge to be able to control a complex system, to control a social system, to control an economic system?

We always lived in a connected world, except we were not so much aware of it. We were aware of it down the line, that we’re not independent from our environment, that we’re not independent of the people around us. We are not independent of the many economic and other forces. But for decades we never perceived connectedness as being quantifiable, as being something that we can describe, that we can measure, that we have ways of quantifying the process. That has changed drastically in the last decade, at many, many different levels.

Continue reading “Thinking in Network Terms” and watch the hour-long video interview

Aug 19, 2012

Artilects Soon to Come

Posted by in categories: complex systems, counterterrorism, cybercrime/malcode, defense, engineering, ethics, events, evolution, existential risks, futurism, information science, military, neuroscience, supercomputing

Whether via spintronics or some quantum breakthrough, artificial intelligence and the bizarre idea of intellects far greater than ours will soon have to be faced.

http://www.sciencedaily.com/releases/2012/08/120819153743.htm

Aug 13, 2012

The Electric Septic Spintronic Artilect

Posted by in categories: biological, biotech/medical, business, chemistry, climatology, complex systems, counterterrorism, defense, economics, education, engineering, ethics, events, evolution, existential risks, futurism, geopolitics, homo sapiens, human trajectories, information science, military, neuroscience, nuclear weapons, policy, robotics/AI, scientific freedom, singularity, space, supercomputing, sustainability, transparency

AI scientist Hugo de Garis has prophesied that the next great historical conflict will be between those who would build gods and those who would stop them.

It seems to be happening before our eyes as the incredible pace of scientific discovery leaves our imaginations behind.

We need only flush the toilet to power the artificial mega mind coming into existence within the next few decades. I am actually not intentionally trying to write anything bizarre; it is just this strange planet we are living on.

http://www.sciencedaily.com/releases/2012/08/120813155525.htm

http://www.sciencedaily.com/releases/2012/08/120813123034.htm

May 25, 2012

OpenOffice / LibreOffice & A Warning For Futurists

Posted by in categories: complex systems, futurism, human trajectories, information science, open access, open source

I spend most of my time thinking about software, and occasionally I come across issues that are relevant to futurists. I wrote my book about the future of software in OpenOffice, and needed many of its features. It might not be the only writing, spreadsheet, diagramming, or presentation tool in your toolbox, but it is a worthy one. OpenDocument Format (ODF) is the best open standard for these sorts of scenarios, and LibreOffice is currently the premier tool for handling that format. I suspect many of the readers of Lifeboat have a variant installed, but don’t know much of the detail of what is going on.

The OpenOffice situation has been a mess for many years. Sun didn’t foster a community of developers around their work. In fact, they didn’t listen to the community when it told them what to do. So about 18 months ago, after Oracle purchased Sun and made the situation worse, the LibreOffice fork was created with most of the best outside developers. LibreOffice quickly became the version embraced by the Linux community as many of the outside developers were funded by the Linux distros themselves. After realizing their mess and watching LibreOffice take off within the free software community, Oracle decided to fire all their engineers (50) and hand the trademark and a copy of the code over to IBM / Apache.

Now it would be natural to imagine that this should be handed over to LibreOffice, and have all interested parties join up with this effort. But that is not what is happening. There are employees out there whose job it is to help Linux, but they are actually hurting it. You can read more details on a Linux blog article I wrote here. I also post this message as a reminder about how working together efficiently is critical to have faster progress on complicated things.

Apr 15, 2012

Risk Assessment is Hard (computationally and otherwise)

Posted by in categories: existential risks, information science, policy

How hard is it to assess which risks to mitigate? It turns out to be pretty hard.

Let’s start with a model of risk so simplified as to be completely unrealistic, yet one that still retains a key feature. Suppose that we managed to translate every risk into some single normalized unit of “cost of expected harm”. Let us also suppose that we could pool all of the payments that could be made to avoid risks. Under these simplifications, a mitigation policy seems easy: just buy the risks that give the biggest reduction for your dollar.

Not so fast.

The problem with this is that many risk mitigation measures are discrete. Either you buy the air filter or you don’t. Either your town filters its water a certain way or it doesn’t. Either we have the infrastructure to divert the asteroid or we don’t. When risk mitigation measures become discrete, allocating the costs becomes trickier. Given a budget of 80 “harms” to reduce, and discrete mitigations costing 50, 40, and 35, greedily buying the 50 first leaves the remaining 30 of your budget unusable; buying the 40 and the 35 instead would have avoided 75 harms, 25 more than the greedy choice.
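This is exactly the 0/1 knapsack problem, which is NP-hard in general. A small sketch of the budget example above (function names are mine; cost and harm avoided are assumed equal, as in the post's single normalized unit):

```python
def greedy_mitigation(budget, costs):
    """Buy the biggest affordable mitigation first (cost == harm avoided here)."""
    avoided = 0
    for c in sorted(costs, reverse=True):
        if avoided + c <= budget:
            avoided += c
    return avoided

def best_mitigation(budget, costs):
    """0/1 knapsack by dynamic programming over reachable spending levels:
    track every total that some subset of mitigations can sum to."""
    reachable = {0}
    for c in costs:
        reachable |= {r + c for r in reachable if r + c <= budget}
    return max(reachable)

print(greedy_mitigation(80, [50, 40, 35]))   # 50: buys the 50, then nothing fits
print(best_mitigation(80, [50, 40, 35]))     # 75: the 40 and 35 together do better
```

With three risks the exhaustive answer is trivial to compute; with thousands of interdependent mitigation measures, even this toy model becomes computationally serious, which is the point.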

Continue reading “Risk Assessment is Hard (computationally and otherwise)” »

Page 83 of 84