Hadoop: How Open Source can Whittle Big Data to Size

Mar 06, 2012, 23:00
(Other stories by Rohan Pearce)

"The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users and then transform it into meaningful, useful information?

"One tool that enterprises have turned to to help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.

"Hadoop lets organisations "analyse much greater amounts of information than they could previously," says its creator, Doug Cutting. 'Hadoop was developed out of the technologies that search engines use to analyse the entire Web. Now it's being used in lots of other places.'

"In January this year Hadoop finally hit version 1.0. The software is now developed under the aegis of the Apache Software Foundation."

Complete Story

Related Stories: