Open Source: The Antidote for “Too Big to Fail”

Michael Tiemann | October 20, 2011

If you look at the evolution of the IT landscape over the past 30 years, you see two distinct trends: the continued growth of the IT dinosaurs (mainframe computing and mainframe wannabes like Sun) and the emergence of highly modular, adaptable systems, which, by their very process of evolution, not only best suit current needs but plant the seeds for the next computing revolution. In the 1980s, modular UNIX systems sowed the seeds for Linux, which in the 1990s sowed the seeds for the rapid spread and adoption of the World Wide Web, which in the 2000s sowed the seeds for companies like Google, Facebook, and Twitter to aggregate and disseminate content as never before.

In the old days, when missions were presumed to be fixed, one could perform a fixed evaluation of a system and deem it fit or unfit for service. Today, when any single idea can, overnight, undermine critical infrastructure (Stuxnet), rewrite fundamental security assumptions (Anonymous), or overthrow governments (WikiLeaks and the Arab Spring), today's "mission critical" systems are tomorrow's failures of imagination. There are now far too many IT systems that are, for all intents and purposes, "too big to fail," and that in and of itself represents a systemic risk that must be addressed...