Hi,
I guess I am not the first one to see the following exception when
trying to initialize a LineRecordReader. However, so far I couldn't
figure out a workaround for this problem.
I saw that this problem was fixed in SVN, but when I checked out one
of the 0.23.0 revisions (I can't remember which), I couldn't get the
servers started. (Not sure whether that was due to the revision or to
me messing up the setup.)
So could someone point me to a known workaround for this problem, or
at least hint at a revision that other people have gotten to work?
Kind regards,
Claus
ps: Essentially I am just doing this:
public static class MyInputFormat
        extends FileInputFormat<Text, ByteWritable>
{
    @Override
    public RecordReader<Text, ByteWritable> createRecordReader(
            InputSplit inputSplit, TaskAttemptContext taskAttemptContext)
            throws IOException, InterruptedException
    {
        MyRecordReader result = new MyRecordReader();
        result.initialize(inputSplit, taskAttemptContext);
        return result;
    }
}
public static class MyRecordReader
        extends RecordReader<Text, ByteWritable>
{
    LineRecordReader myReader = new LineRecordReader();

    // ...

    @Override
    public void initialize(InputSplit inputSplit,
            TaskAttemptContext taskAttemptContext)
            throws IOException, InterruptedException
    {
        myReader.initialize(inputSplit, taskAttemptContext); // EXCEPTION THROWN HERE
    }

    // ...
}
job.setInputFormatClass(MyInputFormat.class);
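That call sits in the usual driver boilerplate, roughly like this
(everything except MyInputFormat is just a placeholder for my actual
classes and paths):

Configuration conf = new Configuration();
Job job = new Job(conf, "my job");                      // placeholder job name
job.setJarByClass(MyDriver.class);                      // placeholder driver class
job.setInputFormatClass(MyInputFormat.class);
job.setMapperClass(MyMapper.class);                     // a Mapper<Text, ByteWritable, ...>, placeholder
FileInputFormat.addInputPath(job, new Path("in"));      // placeholder input path
FileOutputFormat.setOutputPath(job, new Path("out"));   // placeholder output path
job.waitForCompletion(true);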
The exception is:
java.lang.Exception: java.lang.ClassCastException: org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl cannot be cast to org.apache.hadoop.mapreduce.MapContext
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:371) ~[hadoop-mapred-0.21.0.jar:na]
java.lang.ClassCastException: org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl cannot be cast to org.apache.hadoop.mapreduce.MapContext
        at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.initialize(LineRecordReader.java:75) ~[hadoop-mapred-0.21.0.jar:na]
(rest of the output is within my code)