Hello,

I am new to Hadoop and I think I'm doing something silly. I first sent this
e-mail from another account that isn't subscribed to the Hadoop user group.

I am getting the following error in my reducer.

10/11/15 15:29:11 WARN mapred.LocalJobRunner: job_local_0001
java.io.IOException: wrong value class: class
org.apache.hadoop.io.Text is not class org.apache.hadoop.io.IntWritable

Here is my reduce class:

public static class BFIDAReducer
        extends Reducer<Text, IntWritable, Text, Text> {
    private Text result = new Text();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        GameFunctions gf = GameFunctions.getInstance();

        // Join the values into a comma-separated list.
        StringBuilder line = new StringBuilder();
        for (IntWritable val : values) {
            line.append(val.get()).append(',');
        }
        if (line.length() > 0) {
            line.setLength(line.length() - 1); // drop the trailing comma
        }

        // 'size' and 'solved' are fields of the enclosing class (not shown).
        if (gf.isSolved(key.toString(), size)) {
            solved = true;
        }

        result.set(line.toString());
        context.write(key, result);
    }
}

And here is my partial code from job configuration:

job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);

Can anyone help me?



I know I'll have more questions in the near future.

Thanks in advance.

Arindam


  • Arindam Khaled at Nov 16, 2010 at 8:51 am
    (Resend of the message above from a subscribed account.)
  • Alex Baranau at Nov 17, 2010 at 7:43 am
    The message refers to the value not being an IntWritable, which is the
    *input* value type of your reducer (and the output value type of your
    mapper). It looks like the problem is with your mapper, not your reducer.
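In Reducer<Text, IntWritable, Text, Text>, the second type parameter is what the reduce phase *receives* and the last is what it *emits*. A minimal plain-Java sketch of that contract, using hypothetical stand-in types (no Hadoop dependency); note that Hadoop erases these generics and checks the concrete classes only at runtime, which is why the mismatch surfaces as an IOException rather than a compile error:

```java
import java.util.*;

class TypeContract {
    // Stand-in for IntWritable (the map-side output value type in this thread).
    static class IntVal { final int v; IntVal(int v) { this.v = v; } }
    // Stand-in for Text (the reduce-side output value type in this thread).
    static class TextVal { final String s; TextVal(String s) { this.s = s; } }

    // Mirrors Reducer<Text, IntWritable, Text, Text>: the input values must be
    // IntVals, even though the output value is a TextVal.
    static TextVal reduce(String key, Iterable<IntVal> values) {
        StringBuilder line = new StringBuilder();
        for (IntVal val : values) {
            if (line.length() > 0) line.append(',');
            line.append(val.v);
        }
        return new TextVal(line.toString());
    }

    public static void main(String[] args) {
        // The map stage emitted IntVals, so the reducer accepts them:
        System.out.println(reduce("k", List.of(new IntVal(3), new IntVal(7))).s);
        // prints "3,7" -- feeding TextVals back in (what reusing this class as
        // a combiner effectively does) would not even compile here.
    }
}
```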

    Alex Baranau
    ----
    Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch - Hadoop - HBase
    On Mon, Nov 15, 2010 at 11:50 PM, Arindam Khaled wrote:

  • Arindam Khaled at Nov 16, 2010 at 4:15 pm
    This post answered my question, at least in part:

    http://blog.pfa-labs.com/2010/01/first-stab-at-hadoop-and-map-reduce.html

    When I comment out the combiner class, it seems to work fine. Thanks.

    ----- Original Message -----
    From: "Arindam Khaled" <akhaled@utdallas.edu>
    To: common-user@hadoop.apache.org
    Sent: Monday, November 15, 2010 6:05:58 PM
    Subject: wrong value class error

  • Harsh J at Nov 16, 2010 at 4:39 pm
    Hi,
    On Tue, Nov 16, 2010 at 9:39 PM, Arindam Khaled wrote:
    > When I comment out the combiner class, it seems to work fine. Thanks.
    That isn't a solution, but it does avoid the error. You need to
    implement a proper Combiner class that emits the same key and value
    types as your Mapper does. Your Reducer logic emits <Text, Text>,
    which was the issue if you used the same class as the Combiner too.

    But do know that the Combiner may be called 0...N times per Mapper.
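For a job whose reduce operation is itself associative (say, summing counts), the combiner contract described above can be sketched in plain Java with hypothetical stand-in methods (no Hadoop dependency): the combiner consumes and emits the mapper's value type, and because the framework may run it 0, 1, or many times, applying it must not change the final result:

```java
import java.util.*;

class CombinerContract {
    // Combine step: the mapper's value type (ints here) both in and out.
    static List<Integer> combine(List<Integer> values) {
        int sum = 0;
        for (int v : values) sum += v;
        return List.of(sum);
    }

    // Reduce step for a counting-style job: also a sum.
    static int reduce(List<Integer> values) {
        int sum = 0;
        for (int v : values) sum += v;
        return sum;
    }

    public static void main(String[] args) {
        List<Integer> mapOutput = List.of(1, 2, 3, 4);
        int direct = reduce(mapOutput);                      // combiner ran 0 times
        int once = reduce(combine(mapOutput));               // combiner ran once
        int twice = reduce(combine(combine(mapOutput)));     // combiner ran twice
        System.out.println(direct == once && once == twice); // prints "true"
        // The posted reducer concatenates values into a Text, so it cannot
        // serve as its own combiner: both the types and this invariant break.
    }
}
```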

    --
    Harsh J
    www.harshj.com


Discussion Overview
group: common-user@hadoop.apache.org
categories: hadoop
posted: Nov 16, '10 at 12:07a
active: Nov 17, '10 at 7:43a
posts: 5
users: 4
website: hadoop.apache.org...
irc: #hadoop
