All the questions that once scared me, now I am trying to scare them... so that they can't scare others :)
Wednesday, July 25, 2012
CloudFront: HOW TO RUN MAPREDUCE PROGRAMS USING ECLIPSE
CloudFront: HOW TO RUN MAPREDUCE PROGRAMS USING ECLIPSE: Hadoop provides a plugin for Eclipse that helps us connect our Hadoop cluster to Eclipse. We can then run MapReduce jobs and browse Hd...
Monday, July 23, 2012
SharePoint Orange: How to : working with HBase Coprocessor
SharePoint Orange: How to : working with HBase Coprocessor: HBase Coprocessor: It allows user code to be executed at each region (for a table) in the region server. Clients only get the final responses...
Monday, July 16, 2012
Highly available Hadoop cluster
One of the widely discussed problems with a Hadoop cluster is that it is not highly available: the NameNode is a single point of failure, so a NameNode crash takes the whole cluster down. Newer versions of Hadoop ship with a high-availability option that runs two NameNodes, one active and one standby. When the active NameNode goes down, the standby takes charge automatically, and in the meantime we can investigate the problem with the failed NameNode.
You can think of the proposed design as two clusters interconnected using switches and a high-speed communication line. The failover mainly depends on a heartbeat-style service that checks the aliveness of each NameNode; if the active one goes down, clients are redirected to the standby automatically.
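As a rough sketch of how such a two-NameNode setup is wired up in practice, HDFS high availability is configured in hdfs-site.xml. The nameservice id "mycluster", the NameNode ids "nn1"/"nn2", and the host names below are illustrative placeholders, not values from this post:

```xml
<!-- Minimal hdfs-site.xml sketch for NameNode HA.
     "mycluster", "nn1", "nn2", and the hosts are assumed example values. -->
<configuration>
  <!-- Logical name for the HA nameservice. -->
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <!-- The two NameNodes backing the nameservice. -->
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC address of each NameNode. -->
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>master1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>master2.example.com:8020</value>
  </property>
  <!-- Automatic failover: the aliveness check described above is performed
       by ZKFailoverController daemons via ZooKeeper. -->
  <property>
    <name>dfs.ha.automatic-failover.enabled</name>
    <value>true</value>
  </property>
</configuration>
```

With this in place you can check which NameNode is currently active with `hdfs haadmin -getServiceState nn1`.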