Question

I am new to Hadoop and I am following Hadoop: The Definitive Guide to learn it. I was writing unit tests with MRUnit, but while testing the reduce task I am getting a compilation error.

Below is my reducer, MaxTemperatureReducer.java:

package org.priya.mapred.mapred;

import java.io.IOException;
import java.util.Iterator;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
//import org.apache.hadoop.mapred.MRBench.Reduce;
import org.apache.hadoop.mapreduce.Reducer;

public class MaxTemperatureReducer extends Reducer<Text, IntWritable , Text, IntWritable> {

    public void reduce(Text key,Iterator<IntWritable> values, Context context) throws InterruptedException ,IOException
    {
        int maxValue = Integer.MIN_VALUE;
        while(values.hasNext())
        {
            IntWritable value =values.next();
            if(maxValue >= value.get())
            {
                maxValue= value.get();
            }
        }

        context.write(key, new IntWritable(maxValue));

    }

}

Below is my JUnit test file, MaxTemperatureReducerTest.java:

package org.priya.mapred.mapred;

import static org.junit.Assert.*;
import java.util.ArrayList;
import org.junit.Test;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
//import org.apache.hadoop.mrunit.ReduceDriver;
import org.apache.hadoop.mrunit.ReduceDriver;
//import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;

public class MaxTemperatureReducerTest {

    @Test
    public void reducerTestValid()
    {   
        ArrayList<IntWritable> listOfValues = new ArrayList<IntWritable>();
        listOfValues.add(new IntWritable(20));
        listOfValues.add(new IntWritable(30));
        listOfValues.add(new IntWritable(40));
        listOfValues.add(new IntWritable(60));
        new ReduceDriver<Text ,IntWritable , Text,  IntWritable>()
                        .withReducer(new MaxTemperatureReducer())
                        .withInput(new Text("1950"),listOfValues )
                        .withOutput(new Text("1950"), new IntWritable(60));



    }

}

When I pass an instance of the reducer class, i.e. new MaxTemperatureReducer(), to my ReduceDriver via the driver's withReducer() method, I get the compilation error below.

The method withReducer(Reducer<Text,IntWritable,Text,IntWritable>) in the type ReduceDriver<Text,IntWritable,Text,IntWritable> is not applicable for the arguments (MaxTemperatureReducer)

Please help me out: as far as I can see, the MaxTemperatureReducer class extends the Reducer class, and I cannot understand why the withReducer() method does not accept the MaxTemperatureReducer instance.

Thanks, Priyaranjan


Solution

Your reducer has to implement the old-API interface: http://hadoop.apache.org/docs/current2/api/org/apache/hadoop/mapred/Reducer.html

You are extending org.apache.hadoop.mapreduce.Reducer instead. The ReduceDriver you imported lives in org.apache.hadoop.mrunit, which is MRUnit's driver for the old org.apache.hadoop.mapred API, so its withReducer() method only accepts old-API reducers; that is why it rejects your MaxTemperatureReducer.
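
If you would rather keep the reducer on the new API, the mismatch can also be resolved from the other side by switching the test to MRUnit's new-API driver, org.apache.hadoop.mrunit.mapreduce.ReduceDriver. The test below is only a minimal sketch of that approach; it assumes a recent MRUnit (0.9.x or later, where the new-API drivers sit in the mapreduce sub-package) and calls runTest() so the driver actually executes the reducer rather than just being configured:

package org.priya.mapred.mapred;

import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
// Note the extra ".mapreduce" segment: this driver expects new-API reducers.
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.junit.Test;

public class MaxTemperatureReducerTest {

    @Test
    public void reducerTestValid() throws IOException {
        new ReduceDriver<Text, IntWritable, Text, IntWritable>()
                .withReducer(new MaxTemperatureReducer())
                .withInput(new Text("1950"),
                           Arrays.asList(new IntWritable(20), new IntWritable(30),
                                         new IntWritable(40), new IntWritable(60)))
                .withOutput(new Text("1950"), new IntWritable(60))
                .runTest(); // runs the reducer and verifies the expected output
    }
}

For runTest() to see the expected (1950, 60) pair, the reducer itself must also follow the new API: its reduce() method needs an Iterable<IntWritable> parameter (with an Iterator parameter it is merely an overload the framework never calls), and the running maximum should be updated when value.get() is greater than maxValue, for example with maxValue = Math.max(maxValue, value.get()).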

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow