“Low quality code is not cheaper; it is vastly more expensive, even in the short term. Bad code slows everyone down from the minute that it is written. It creates a continuous and copious drag on further progress. It requires armies of coders to overcome that drag; and those armies must grow exponentially to maintain constant velocity against that drag.” — Robert C. Martin
Wednesday, April 6, 2011
Notes From Scandev 2011 - Apache Hadoop
If you have large chunks of data and need to do some manipulation or analysis on them, you probably want to use Hadoop. You don't have to, but life is easier with it. At SDC 2011, Josh Devins from Nokia explained how to do log analysis with Hadoop and Pig. (If you are interested in working at Nokia on this cool stuff, you may contact them.) As you know, log analysis can be a really resource-consuming process. By using HDFS, Hadoop, and MapReduce, you can save your time for your kids.
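To give a flavor of the kind of log analysis described in the talk, here is a minimal sketch of the map and reduce steps as plain Python, so it runs without a cluster. The log layout, field positions, and helper names here are my own assumptions for illustration, not Josh Devins' actual pipeline; with Hadoop Streaming the same two functions would read lines from stdin instead of a list.

```python
# A toy map/reduce pass that counts HTTP status codes in access logs.
# Assumes a combined-log-style line layout (hypothetical example data).
from collections import defaultdict

def mapper(line):
    """Emit (status_code, 1) pairs for one access-log line."""
    parts = line.split()
    if len(parts) >= 9:       # enough fields for our assumed layout
        yield parts[8], 1     # token 9 holds the HTTP status code here

def reducer(pairs):
    """Sum counts per key, as Hadoop does after the shuffle phase."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = [
    '127.0.0.1 - - [06/Apr/2011:10:00:00 +0000] "GET / HTTP/1.1" 200 512',
    '127.0.0.1 - - [06/Apr/2011:10:00:01 +0000] "GET /x HTTP/1.1" 404 0',
    '127.0.0.1 - - [06/Apr/2011:10:00:02 +0000] "GET / HTTP/1.1" 200 512',
]
counts = reducer(pair for line in logs for pair in mapper(line))
print(counts)
```

The same split between a stateless mapper and a summing reducer is what lets Hadoop spread the work across machines; Pig generates jobs of roughly this shape for you from a higher-level script.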
I don't want to explain every detail of these technologies, but if you are in trouble they may save your life. Before you need to use them, you have to learn them first. If you are aware of the tools you have, then you can notice the problems arising in front of your eyes before they smack you in the face. (Google, Facebook, Yahoo, Microsoft, and many others are using them... so they know something, I guess ;) )