What are the main configuration parameters in a "MapReduce" program?
- The main configuration parameters in the "MapReduce" framework are:
- Input location of the job in the distributed file system (HDFS)
- Output location of the job in the distributed file system (HDFS)
- The input format of the data
- The output format of the data
- The class which contains the map function
- The class which contains the reduce function
- The JAR file containing the mapper, reducer and driver classes
Sample Code
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public static void main(String[] args) throws Exception {
    Configuration c = new Configuration();
    // GenericOptionsParser strips Hadoop-specific options and returns the remaining arguments
    String[] files = new GenericOptionsParser(c, args).getRemainingArgs();
    Path input = new Path(files[0]);   // input location in HDFS
    Path output = new Path(files[1]);  // output location in HDFS
    Job j = Job.getInstance(c, "wordcount");
    j.setJarByClass(WordCount.class);            // JAR containing the driver, mapper and reducer
    j.setMapperClass(MapForWordCount.class);     // class with the map function
    j.setReducerClass(ReduceForWordCount.class); // class with the reduce function
    j.setOutputKeyClass(Text.class);             // output key type
    j.setOutputValueClass(IntWritable.class);    // output value type
    FileInputFormat.addInputPath(j, input);      // input format/location of the data
    FileOutputFormat.setOutputPath(j, output);   // output format/location of the data
    System.exit(j.waitForCompletion(true) ? 0 : 1);
}
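The driver above references MapForWordCount and ReduceForWordCount but does not show them. Below is a minimal sketch of what those two classes might look like, following the standard Hadoop word-count pattern; the field names and method bodies here are assumptions for illustration, not part of the original snippet.

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCount {

    // Mapper: emits (word, 1) for every token in each input line.
    public static class MapForWordCount
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class ReduceForWordCount
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    // The main(...) driver shown in the Sample Code above goes here.
}

All three classes are then packaged into a single JAR (the last configuration item listed above), and the job is typically submitted with the hadoop jar command, passing the input and output directories as the remaining arguments.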