The recent Wikileaks episode has highlighted the immense control national governments and private companies have over what content can be hosted. Within days of being identified by the U.S. government as a problem, the private companies hosting Wikileaks and providing it with banking services withdrew support, largely neutering the organization’s ability to raise funds and host content.
The successful attempts to cut off the Internet in Egypt and Libya raise questions of a similar nature.
Two questions follow: should anything be done about it? And if so, what? The answer to the first is not clear-cut, but on balance, such effectively absolute discretionary control over the fate of ‘hostile’ information or technology probably should not be allowed. As to the second: since many of the hosting, banking, and other companies essential to disseminating content are privately held, and susceptible to both government and market pressure, the dissemination engine ought to be as independent of them as possible (bottlenecks remain: most pipes are owned by governments or corporations). Here are three ideas –
1) Create an international server farm on which anyone can host content, and from which content is removed only after due process, set internationally. (NGO-supported farms may work as well.)
2) We already have ways to disseminate content without centralized hosting – P2P – but these systems lack a browser that collates torrents and builds a webpage in real time. Such a ‘torrent-based’ browser could vastly improve the ability of P2P networks to host content.
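The core of the idea can be sketched in a few lines. Everything below is hypothetical: the in-memory `swarm` dictionary stands in for a real DHT and torrent swarm, and the `publish`/`assemble_page` names are mine, not any existing library’s. The point is only that a page can be split into content-addressed chunks, fetched from any peer, verified against its hash, and reassembled by the browser.

```python
import hashlib

# Hypothetical in-memory "swarm": content-hash -> chunk bytes.
# A real implementation would use a DHT and the BitTorrent wire protocol.
swarm = {}

def publish(chunk: bytes) -> str:
    """Store a chunk under its SHA-256 hash and return that hash."""
    digest = hashlib.sha256(chunk).hexdigest()
    swarm[digest] = chunk
    return digest

def assemble_page(manifest: list) -> bytes:
    """Rebuild a page from a manifest of chunk hashes, verifying each chunk."""
    parts = []
    for digest in manifest:
        chunk = swarm[digest]  # in reality: fetched from whichever peer has it
        if hashlib.sha256(chunk).hexdigest() != digest:
            raise ValueError("tampered chunk")
        parts.append(chunk)
    return b"".join(parts)

# A page is published as chunks; the "browser" only needs the small manifest.
manifest = [publish(b"<html><body>"),
            publish(b"Uncensorable text"),
            publish(b"</body></html>")]
page = assemble_page(manifest)
```

Because each chunk is addressed by its own hash, no single host owns the page, and a tampered copy is rejected automatically – which is exactly the property a censorship-resistant browser would need.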
3) For Libya, Egypt, etc., the problem is of a different nature. We need applications like ‘Twitter’ to continue to function even if the artery to the central servers goes down. This can be handled by building applications so that they can run on edge servers with local data. This kind of redundancy can also be useful for businesses.
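A minimal sketch of that store-and-forward pattern, with all names (`EdgeNode`, `send_upstream`, the `central_server` list) invented for illustration: the edge server serves local users immediately, queues outgoing posts, and drains the queue only while the link to the center is up.

```python
from collections import deque

central_server = []  # stand-in for the real central service

def send_upstream(message: str):
    central_server.append(message)

class EdgeNode:
    """Hypothetical edge server: accepts posts locally and syncs upstream
    whenever the link to the central server is available."""

    def __init__(self):
        self.local_timeline = []   # what local users see immediately
        self.outbox = deque()      # posts awaiting upstream sync
        self.link_up = False

    def post(self, message: str):
        self.local_timeline.append(message)
        self.outbox.append(message)
        self.flush()

    def flush(self):
        # Drain the outbox only while the artery to the center is up.
        while self.link_up and self.outbox:
            send_upstream(self.outbox.popleft())

node = EdgeNode()
node.post("Protest at the square, 5pm")  # link is down: visible locally only
node.link_up = True                      # artery restored
node.flush()                             # queued posts now sync upstream
```

Local users never notice the outage; the center simply catches up later. The same pattern gives a business continuity during a connectivity loss between a branch office and headquarters.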