Linux Today: Linux News On Internet Time.


Hadoop: How Open Source can Whittle Big Data to Size

Mar 06, 2012, 23:00
(Other stories by Rohan Pearce)



"The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users and then transform it into meaningful, useful information?

"One tool that enterprises have turned to for help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.

"Hadoop lets organisations 'analyse much greater amounts of information than they could previously,' says its creator, Doug Cutting. 'Hadoop was developed out of the technologies that search engines use to analyse the entire Web. Now it's being used in lots of other places.'
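Hadoop's canonical introductory example is word count. As an illustration of the map/shuffle/reduce programming model that Hadoop distributes across a cluster, here is a minimal single-process Python sketch — this demonstrates the model only, not Hadoop's actual Java API:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["the quick brown fox", "the lazy dog"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["the"])  # "the" appears twice in the input
```

In a real Hadoop job, the map and reduce functions run in parallel on machines holding different slices of the data, and the framework handles the shuffle, fault tolerance, and data locality — which is what lets it scale to the Web-sized datasets Cutting describes.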

"In January this year Hadoop finally hit version 1.0. The software is now developed under the aegis of the Apache Software Foundation."
