Been there, forked that: What the Unix-Linux schism can teach us about Hadoop’s future

Hadoop is fast becoming the preferred way to store and process big data. By T-Systems' estimates, within five years 80 percent of all new data will first land in the Hadoop Distributed File System (HDFS) or in alternative object storage architectures.

Yet amid the excitement around this open source framework, enterprise users risk overlooking the fact that not all Hadoop flavors are created equal. Choosing one implementation over another can mean veering off the path of genuine open source software and heading down the dead-end street of expensive vendor lock-in and stunted innovation.
