Hadoop: the de facto standard for 'big data' processing

Dave Rosenberg | CNET News | June 13, 2012

Hadoop, the open-source software that has emerged as the de facto standard for big data processing, may be what tips the enterprise in favor of open source, according to some high-level execs.

This is a big week for big data in Silicon Valley, kicking off last night with a Churchill Club event here called "The Elephant in the Enterprise: What Role Will Hadoop Play?" featuring a high-powered group of big-data executives.

The desire to get more data and find value in it has become a business priority, and Hadoop is playing a major role in making sense of data.