Question

Hi, I am very new to Hadoop. I am trying to execute a simple chain of MapReduce jobs, but in my code the reducer is not getting executed. Here is the simple code I have written.

This is my mapper code:

@Override
public void map(LongWritable arg0, Text arg1,
        OutputCollector<Text, IntWritable> arg2, Reporter arg3)
        throws IOException {
    // TODO Auto-generated method stub
    System.out.println("in first mapper");
}

This is my simple reducer code:

@Override
public void reduce(Text arg0, Iterator<IntWritable> arg1,
        OutputCollector<Text, IntWritable> arg2, Reporter arg3)
        throws IOException {
    // TODO Auto-generated method stub
    System.out.println("in reducer");

}

This is the main class which runs the job:

JobConf jobConf = new JobConf(jobrunner.class);
jobConf.setJobName("Chaining");

FileInputFormat.setInputPaths(jobConf, new Path("hdfs://localhost:9000/employee_data.txt"));
FileOutputFormat.setOutputPath(jobConf,new Path("hdfs://localhost:9000/chain3.txt"));

JobConf conf1 = new JobConf(false);

ChainMapper.addMapper(jobConf,chainmap.class,LongWritable.class,Text.class,Text.class,IntWritable.class,true,conf1);

JobConf conf2 = new JobConf(false);

ChainReducer.setReducer(jobConf, chainreduce.class,Text.class,IntWritable.class,Text.class,IntWritable.class,true,conf2);

JobClient.runJob(jobConf);

I don't know where I'm going wrong. The sysout in the reducer is not getting printed. Any help on this?


Solution

Probable reason: you are not outputting anything from the mapper, so the reducer has no input to run on in the first place. Call outputCollector.collect(key, value); from the mapper to write something for the reducer to actually process.
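For example, here is a minimal sketch of what the mapper could look like with a collect() call added. It assumes Hadoop's old `org.apache.hadoop.mapred` API is on the classpath (matching the JobConf/ChainMapper code in the question); the word-count-style tokenizing is just an illustration, not something from the original code:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Sketch of a mapper that emits output so the reducer has input to process.
public class chainmap extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);

    @Override
    public void map(LongWritable key, Text value,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        System.out.println("in first mapper");
        // Emit one (word, 1) pair per token. Without at least one
        // collect() call here, the reduce() method is never invoked.
        StringTokenizer tokens = new StringTokenizer(value.toString());
        while (tokens.hasMoreTokens()) {
            output.collect(new Text(tokens.nextToken()), ONE);
        }
    }
}
```

With records being emitted, the framework will group them by key and invoke reduce() for each key, so the "in reducer" print should then appear in the reducer's task logs.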

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow