FAQ
Hi, having got a few of the examples working on a small cluster of
machines, I tried writing my own MapReduce task to run. It's basically
similar to the PiEstimator (in fact I copied much of the configuration
code over with small modifications). I've packaged it all up in a jar
file and tried the following, but get an exception:

ray10% bin/hadoop jar euler.jar euler 0.1 3
T = 3, step size = 0.1
Wrote input for T = 0.1

[etc etc...]

Wrote input for T = 2.9999993
Starting Job
Done first bit //This was just a debugging println to check it could see
the JobClient class
java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobClient.runJob(Lorg/apache/hadoop/mapred/JobConf;)Lorg/apache/hadoop/mapred/RunningJob;
        at imperial.oliverhaggarty.EulerMapRed.launch(EulerMapRed.java:199)
        at imperial.oliverhaggarty.EulerMapRed.main(EulerMapRed.java:220)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:143)
        at imperial.oliverhaggarty.ExampleDriver.main(ExampleDriver.java:18)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)

The PiEstimator works fine, and uses the same method call that is
causing an exception in my own code (see below for code example). If
anyone has any ideas as to what the problem is I would be very grateful.

Thanks,
Ollie


The relevant part of the code is:

private static void launch(float step, int Trange) throws IOException {
    Configuration conf = new Configuration();
    JobConf jobConf = new JobConf(conf, EulerMapRed.class);

    jobConf.setJobName("euler");

    // Turn off speculative execution, because DFS doesn't handle
    // multiple writers to the same file.
    jobConf.setSpeculativeExecution(false);
    jobConf.setInputFormat(SequenceFileInputFormat.class);

    jobConf.setOutputKeyClass(FloatWritable.class);
    jobConf.setOutputValueClass(FloatWritable.class);
    jobConf.setOutputFormat(SequenceFileOutputFormat.class);

    jobConf.setMapperClass(EulerMapper.class);
    jobConf.setReducerClass(EulerReducer.class);

    Path tmpDir = new Path("test-mini-mr");
    Path inDir = new Path(tmpDir, "in");
    Path outDir = new Path(tmpDir, "out");
    FileSystem fileSys = FileSystem.get(jobConf);
    fileSys.delete(tmpDir);
    if (!fileSys.mkdirs(inDir)) {
        throw new IOException("Mkdirs failed to create " + inDir.toString());
    }

    jobConf.setInputPath(inDir);
    jobConf.setOutputPath(outDir);

    jobConf.setNumMapTasks(Trange);

    for (float T = step; T < Trange; T += step) {
        Path file = new Path(inDir, "part" + T);
        SequenceFile.Writer writer = SequenceFile.createWriter(fileSys, jobConf,
                file, FloatWritable.class, FloatWritable.class, CompressionType.NONE);
        writer.append(new FloatWritable(T), new FloatWritable(0));
        writer.close();
        System.out.println("Wrote input for T = " + T);
    }

    try {
        System.out.println("Starting Job");
        long startTime = System.currentTimeMillis();

        // Just put in to check it can see the JobClient class
        JobClient.getTaskOutputFilter(jobConf);
        System.out.println("Done first bit");
        JobClient.runJob(jobConf); // This line is causing the problem
        System.out.println("Job Finished in "
                + (float) (System.currentTimeMillis() - startTime) / 1000.0
                + " seconds");
    } finally {
        fileSys.delete(tmpDir);
    }
}


  • Tom White at Jun 26, 2007 at 7:04 pm
    I've seen this exception when compiling against one version of hadoop
    (e.g. 0.12.3) and running against another (e.g. 0.13.0). Could this be
    the case?

    Hope this helps,

    Tom
    On 26/06/07, Oliver Haggarty wrote:
    [original message quoted in full; snipped]
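    A NoSuchMethodError at a call site that compiles cleanly almost always
    means the class resolved at runtime came from a different jar than the
    one on the build classpath, as Tom suggests. One generic way to see
    which jar actually supplied a class at runtime is to ask its
    ProtectionDomain. This is only a sketch (the class name `WhichJar` is
    made up for illustration; `java.lang.String` is the default argument so
    it runs without Hadoop installed):

    ```java
    import java.security.CodeSource;

    public class WhichJar {
        public static void main(String[] args) throws Exception {
            // In the thread's scenario you would pass
            // org.apache.hadoop.mapred.JobClient as the argument;
            // java.lang.String is the default so this runs anywhere.
            Class<?> c = Class.forName(args.length > 0 ? args[0] : "java.lang.String");
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Classes from the bootstrap classpath report no code source.
            System.out.println(c.getName() + " loaded from: "
                    + (src == null ? "<bootstrap classpath>" : src.getLocation()));
        }
    }
    ```

    Running it with org.apache.hadoop.mapred.JobClient on the cluster's
    classpath would print the hadoop jar the runtime is really using, which
    can then be compared with the jar the code was compiled against.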
  • Oliver Haggarty at Jun 27, 2007 at 12:55 pm
    Thanks Tom, that was exactly what I was doing - I had version 0.12.3
    installed, but downloaded the latest version from svn into Eclipse for
    development. I've now installed version 0.13.0 and the program runs.
    Just need to fix the bugs now! Thanks very much for your help.

    Ollie

    Tom White wrote:
    [reply quoted in full; snipped]

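    Oliver's fix (matching the installed Hadoop version to the one on the
    build classpath) can also be double-checked programmatically: many jars
    record their version in the manifest, readable through the Package API.
    A minimal sketch (the class name `PrintVersion` is made up; whether a
    given hadoop jar populates Implementation-Version is an assumption, and
    the call returns null when the manifest omits it):

    ```java
    public class PrintVersion {
        public static void main(String[] args) {
            // Substitute a Hadoop class such as
            // org.apache.hadoop.mapred.JobClient to inspect the hadoop jar;
            // String is used here so the sketch runs anywhere.
            Package p = String.class.getPackage();
            System.out.println(p.getName() + " implementation version: "
                    + p.getImplementationVersion());
        }
    }
    ```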
Discussion Overview
group: common-user @ hadoop
posted: Jun 26, '07 at 1:26p
active: Jun 27, '07 at 12:55p
posts: 3
users: 2 (Oliver Haggarty: 2 posts, Tom White: 1 post)
website: hadoop.apache.org...
irc: #hadoop
