
Hadoop: How Open Source Can Whittle Big Data to Size

“The challenge essentially comes down to this: How do you store
the massive amounts of often-unstructured data generated by end
users and then transform it into meaningful, useful
information?

“One tool that enterprises have turned to for help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.

“Hadoop lets organisations ‘analyse much greater amounts of information than they could previously,’ says its creator, Doug Cutting. ‘Hadoop was developed out of the technologies that search engines use to analyse the entire Web. Now it’s being used in lots of other places.’

“In January this year Hadoop finally hit version 1.0. The
software is now developed under the aegis of the Apache Software
Foundation.”
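
The “distributed processing” Cutting describes follows Hadoop’s MapReduce model: a map step turns raw records into key/value pairs, and a reduce step aggregates the values for each key across the cluster. As a minimal, illustrative sketch (not taken from the article), here is the canonical word-count job written against the Hadoop 1.0 Java MapReduce API; the class names WordCount, TokenizerMapper and IntSumReducer are our own:

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: emits (word, 1) for every word in its input split.
        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sums the per-word counts produced by all the mappers.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values,
                               Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class); // pre-aggregate on each node
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Packaged as a jar, the job would be launched with something like “hadoop jar wordcount.jar WordCount /input /output”: the framework splits the input stored in HDFS across the cluster, runs mappers close to the data, and writes the reducers’ summed counts back to HDFS.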


