Hi guys!

I was trying to generate a job trace and a topology trace.
I have Hadoop set up for hduser at /usr/local/hadoop and ran the wordcount
program as hduser.
I have the MapReduce component set up in Eclipse for user "arun".
I set up a run configuration:
Class: org.apache.hadoop.tools.rumen.TraceBuilder
Args: /home/arun/Documents/jh.json /home/arun/Documents/top.json /usr/local/hadoop/logs/history

When I run it I get the following error, even though I have run:
chmod -R 777 ${HADOOP_HOME}/logs

11/12/08 11:55:17 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
11/12/08 11:55:18 WARN fs.FSInputChecker: Problem opening checksum file:
file:/usr/local/hadoop/logs/history/job_201109191122_0002_1316411924369_hduser_word+count.
Ignoring exception: java.io.EOFException
at java.io.DataInputStream.readFully(DataInputStream.java:180)
at java.io.DataInputStream.readFully(DataInputStream.java:152)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:145)
at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:311)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:534)
at org.apache.hadoop.tools.rumen.PossiblyDecompressedInputStream.<init>(PossiblyDecompressedInputStream.java:42)
at org.apache.hadoop.tools.rumen.DefaultInputDemuxer.bindTo(DefaultInputDemuxer.java:42)
at org.apache.hadoop.tools.rumen.TraceBuilder.run(TraceBuilder.java:225)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:69)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:83)
at org.apache.hadoop.tools.rumen.TraceBuilder.main(TraceBuilder.java:142)

Any help?

Arun


  • Amar Kamat at Dec 8, 2011 at 6:43 am
    Arun,
    Did you modify the job history file manually? Looks like HDFS is not able to match the job history file contents to its checksum. Try deleting the checksum file. Note that the checksum file is a hidden file.
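    As a concrete sketch (assuming the history files sit on the local filesystem at the path from this thread), the hidden checksum files can be listed and removed with find:

```shell
# Checksum files are hidden siblings named ".<filename>.crc" that
# ChecksumFileSystem keeps next to each data file. Deleting them makes
# the next read skip checksum verification instead of failing on a
# mismatched or truncated .crc file.
delete_crc_files() {
  # print and remove every hidden .crc file under the given directory
  find "$1" -type f -name '.*.crc' -print -delete
}

# Usage with the history directory from the thread:
# delete_crc_files /usr/local/hadoop/logs/history
```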
    Amar


  • Arun k at Dec 8, 2011 at 2:14 pm
    Amar,

    I didn't modify the job history file, but I deleted the hidden checksum
    files and it worked.
    I have a 3-node cluster set up.
    To generate a single job I am running, say, wordcount. Is there any way
    I can generate a good number of jobs (say 50, 100, ...) at a time?
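    (One crude approach, sketched here with an assumed examples-jar name and made-up HDFS paths, is a shell loop over submissions; the `echo` only prints the commands, so drop it to actually submit:)

```shell
# Print (or, without the echo, submit) N back-to-back wordcount runs so
# the job history directory accumulates many jobs for TraceBuilder to read.
# Jar name and input/output paths are assumptions, not from this thread.
submit_jobs() {
  n="$1"
  for i in $(seq 1 "$n"); do
    echo hadoop jar "${HADOOP_HOME:-/usr/local/hadoop}/hadoop-examples.jar" \
        wordcount /user/hduser/input "/user/hduser/output-$i"
  done
}

submit_jobs 50
```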
    How can I generate job traces for 50, 100, and so on, keeping two nodes
    under one rack and the third in a second rack in my cluster?
    How do I control the placement of nodes in different racks and generate
    the job and topology traces?
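    (On rack placement: Hadoop resolves a node's rack through a user-supplied rack-awareness script configured via `topology.script.file.name` in Hadoop 0.20/1.x; the script receives host names or IPs as arguments and must print one rack path per argument. A hypothetical mapping for a 3-node cluster, shown as a shell function with made-up host names:)

```shell
# Hypothetical rack-awareness mapping: two nodes under /rack1, one under
# /rack2. Hadoop invokes the configured script with one or more hosts and
# reads one rack name per host from stdout.
rack_of() {
  for host in "$@"; do
    case "$host" in
      node1|node2) echo /rack1 ;;        # first two nodes share a rack
      node3)       echo /rack2 ;;        # third node in its own rack
      *)           echo /default-rack ;; # fallback for unknown hosts
    esac
  done
}
```

    (With a script like this in place when the jobs run, the topology trace TraceBuilder emits should reflect the two-rack layout.)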


    Arun


  • Arun k at Dec 8, 2011 at 2:25 pm
    Amar,

    The job trace and topology trace are generated, but the TraceBuilder
    program terminates after displaying

    11/12/08 11:55:17 WARN util.NativeCodeLoader: Unable to load native-hadoop
    library for your platform... using builtin-java classes where applicable

    It doesn't indicate whether it was a success and doesn't give any other
    info. Is that OK?
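    (TraceBuilder is apparently silent on success, and the NativeCodeLoader line is only a warning, so a minimal sanity check is just that both outputs exist and are non-empty; a sketch using the paths from the thread:)

```shell
# Report whether a generated trace file exists and is non-empty.
check_trace() {
  if [ -s "$1" ]; then
    echo "$1: OK"
  else
    echo "$1: missing or empty"
  fi
}

# Output paths as given earlier in the thread:
check_trace /home/arun/Documents/jh.json
check_trace /home/arun/Documents/top.json
```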
    I have used the generated trace with Mumak and it works fine.

    Arun



Discussion Overview
group: mapreduce-user @
categories: hadoop
posted: Dec 8, '11 at 6:34a
active: Dec 8, '11 at 2:25p
posts: 4
users: 2 (Arun k: 3 posts, Amar Kamat: 1 post)
website: hadoop.apache.org...
irc: #hadoop
