FW: type mismatch error
Hi, all.
I am new to Hadoop. I tried to construct a map and reduce task to run, but I encountered the following exception when running the job.
Exception:
java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved org.apache.hadoop.io.LongWritable
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:845)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:541)
at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
Mapper:
public class TaskMapper extends Mapper<LongWritable, Text, Text, Text> {
    public void Map(LongWritable key, Text value, Context context) {
        try {
            System.out.println(value.toString());
            context.write(new Text(Calendar.getInstance().getTime().toString()), value);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Reducer:
public class TaskReducer extends Reducer<Text, Text, Text, Text> {
    public void reduce(Text key, Iterable<Text> value, Context context) {
        try {
            context.write(key, new Text(value.iterator().hashCode() + ""));
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Main (entry point):
public class TaskEntry extends Configured implements Tool {

    @Override
    public int run(String[] arg0) throws Exception {
        Job job = new Job(getConf());
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        //job.setInputFormatClass(KeyValueTextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(arg0[0]));
        FileOutputFormat.setOutputPath(job, new Path(arg0[1]));
        job.setMapperClass(TaskMapper.class);
        job.setReducerClass(TaskReducer.class);
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String args[]) throws Exception {
        int exitCode = ToolRunner.run(new TaskEntry(), args);
        System.exit(exitCode);
    }
}


Any help is greatly appreciated!
Thanks in advance.

James, Teng (Teng Linxiao)
eRL, CDC, eBay, Shanghai
Extension: 86-21-28913530
MSN: tenglinxiao@hotmail.com
Skype: James,Teng
Email: xteng@ebay.com


  • Devaraj K at Jul 12, 2011 at 11:40 am
    Hi Teng,



    As per the exception stack trace, your TaskMapper.map() method is not
    being invoked; the default Mapper.map() method is running instead. Can you
    recheck the configuration and the job code to confirm they are set up
    correctly?
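
    For reference, the default Mapper.map() is (roughly) an identity method
    along the lines of the sketch below; because it forwards the LongWritable
    input key straight to the output, it trips the check against the declared
    Text map output key.

        // Rough sketch of the default (identity) Mapper.map(); see the Hadoop
        // source for the exact implementation.
        protected void map(KEYIN key, VALUEIN value, Context context)
                throws IOException, InterruptedException {
            // Emits the input (LongWritable, Text) pair unchanged, which
            // conflicts with the Text map output key configured in the job.
            context.write((KEYOUT) key, (VALUEOUT) value);
        }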



    Devaraj K

  • Harsh J at Jul 12, 2011 at 1:00 pm
    Yep, best to annotate your code with the @Override annotation so that
    you can detect troubles like these easily.
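
    For example, putting @Override on the misnamed method turns the mistake
    into a compile-time error instead of a silent fallback to the default
    mapper:

        @Override
        public void Map(LongWritable key, Text value, Context context) {
            // javac rejects this with:
            // "method does not override or implement a method from a supertype"
        }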
    --
    Harsh J
  • Joey Echeverria at Jul 12, 2011 at 1:16 pm
    Your map method is misnamed. It should be in all lower case.
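
    A corrected mapper would look roughly like this; with @Override in place
    the compiler verifies that the signature really matches the base class:

        public class TaskMapper extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Emit a Text key (the current time) and the input line as the value.
                context.write(new Text(Calendar.getInstance().getTime().toString()), value);
            }
        }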

    -Joey
