FAQ
There is a proper decommissioning process to remove dead nodes. See the FAQ
link here:
http://wiki.apache.org/hadoop/FAQ#I_want_to_make_a_large_cluster_smaller_by_
taking_out_a_bunch_of_nodes_simultaneously._How_can_this_be_done.3F
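
For reference, the steps in that FAQ entry boil down to roughly the following
(the exclude-file path and hostname below are only placeholders; adjust them
to your setup). In conf/hdfs-site.xml on the namenode, point
dfs.hosts.exclude at an exclude file:

<property>
  <name>dfs.hosts.exclude</name>
  <value>/etc/hadoop/conf/dfs.exclude</value>
</property>

Then list the nodes to retire in that file, one hostname per line, and tell
the namenode to re-read it:

echo "deadnode01.example.com" >> /etc/hadoop/conf/dfs.exclude
$HADOOP_HOME/bin/hadoop dfsadmin -refreshNodes

Once "hadoop dfsadmin -report" shows the node as decommissioned you can take
it out of the slaves file and shut it down.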

In fact, $HADOOP_HOME/conf/slaves is not used by the namenode to keep
track of datanodes/tasktrackers. It is merely used by the start/stop hadoop
scripts to know on which nodes to start the datanode/tasktracker services.
Similarly, there is often confusion about the $HADOOP_HOME/conf/masters
file. That file contains the details of the machine where the secondary
namenode is running, not the namenode/jobtracker.
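
To illustrate, both files are just newline-separated host lists read by the
shell scripts; the hostnames below are made-up placeholders:

# $HADOOP_HOME/conf/slaves -- hosts the start/stop scripts ssh into to run
# datanode/tasktracker daemons
worker01.example.com
worker02.example.com

# $HADOOP_HOME/conf/masters -- host where the secondary namenode runs
snn01.example.com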

With regards to not all java/hadoop processes getting killed, this may be
happening because hadoop is losing track of its pid files. By default the
pid files are created in the /tmp directory. If these pid files get deleted,
the stop/start scripts cannot detect the running hadoop processes. I suggest
changing the location of the pid files to a persistent location like
/var/hadoop/. The $HADOOP_HOME/conf/hadoop-env.sh file has details on
configuring the pid file location.
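
For example, in $HADOOP_HOME/conf/hadoop-env.sh something along these lines
should do it (the directory is just a suggestion; any persistent path the
hadoop user can write to will work):

# The directory where pid files are stored. /tmp by default.
export HADOOP_PID_DIR=/var/hadoop/pids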

- Sudhir


On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
wrote:
From: Tali K <ncherryus@hotmail.com>
Date: Tue, 7 Dec 2010 10:40:16 -0800
To: <core-user@hadoop.apache.org>
Subject: Help: 1) Hadoop processes still are running after we stopped
hadoop.2) How to exclude a dead node?


1) When I stopped hadoop, we checked all the nodes and found that 2 or 3
java/hadoop processes were still running on each node. So we went to each
node and did a 'killall java' - in some cases I had to do 'killall -9 java'.
My question: why is this happening, and what would you recommend to make sure
that no hadoop processes are left running after I stop hadoop with
stop-all.sh?

2) Also we have a dead node. We removed this node from
$HADOOP_HOME/conf/slaves. This file is supposed to tell the namenode
which machines are supposed to be datanodes/tasktrackers.
We started hadoop again, and were surprised to see the dead node in the hadoop
'report' ("$HADOOP_HOME/bin/hadoop dfsadmin -report|less").
Only after blocking the dead node and restarting hadoop did it stop showing up
in the report.
Any recommendations on how to deal with dead nodes?



  • Sudhir Vallamkondu at Dec 8, 2010 at 3:59 am
    I second Ed's answer. Try uninstalling whatever you installed and start
    fresh. Whenever I see this error while trying to install a native bridge,
    this solution has always worked for me.


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Edward Capriolo <edlinuxguru@gmail.com>
    Date: Tue, 7 Dec 2010 17:22:03 -0500
    To: <common-user@hadoop.apache.org>
    Subject: Re: HDFS and libhfds

    2010/12/7 Petrucci Andreas <petrucci_2005@hotmail.com>:
    hello there, I'm trying to compile libhdfs but there are some
    problems. According to http://wiki.apache.org/hadoop/MountableHDFS I have
    already installed fuse. With ant compile-c++-libhdfs -Dlibhdfs=1 the build is
    successful.

    However, when I try ant package -Djava5.home=... -Dforrest.home=..., the build
    fails and the output is below:

    [exec]
    [exec] Exception in thread "main" java.lang.UnsupportedClassVersionError:
    Bad version number in .class file
    [exec]     at java.lang.ClassLoader.defineClass1(Native Method)
    [exec]     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    [exec]     at
    java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    [exec]     at
    java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    [exec]     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    [exec]     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    [exec]     at java.security.AccessController.doPrivileged(Native Method)
    [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    [exec]     at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
    [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    [exec]     at
    org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(D
    efaultLogTargetFactoryManager.java:113)
    [exec]     at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec]     at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryMana
    ger(LogKitLoggerManager.java:436)
    [exec]     at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLogger
    Manager.java:400)
    [exec]     at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec]     at
    org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
    [exec]     at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
    [exec]     at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
    [exec]     at
    org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
    [exec]     at
    org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
    [exec]     at org.apache.cocoon.Main.main(Main.java:310)
    [exec] Java Result: 1
    [exec]
    [exec]   Copying broken links file to site root.
    [exec]
    [exec]
    [exec] BUILD FAILED
    [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not
    find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
    [exec]
    [exec] Total time: 4 seconds

    BUILD FAILED
    /hadoop-0.20.2/build.xml:867: exec returned: 1


    any ideas what's wrong???
    I never saw this usage:
    -Djava5.home
    Try
    export JAVA_HOME=/usr/java

    " Bad version number in .class file " means you are mixing and
    matching java versions somehow.
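
    A quick way to double-check which JVM ant is actually picking up is
    something along these lines (the exact versions and paths will vary):

    java -version
    echo $JAVA_HOME
    ant -diagnostics | grep java.version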

  • Petrucci Andreas at Dec 8, 2010 at 6:59 am
    thanks for the replies, this solved my problems

    http://mail-archives.apache.org/mod_mbox/hadoop-common-user/200909.mbox/%3C6F5C1D715B2DA5498A628E6B9C124F0401452213AA@hasmsx504.ger.corp.intel.com%3E

    ...I think I should write a post on my blog about this night with hdfs, libhdfs and fuse...
    Date: Tue, 7 Dec 2010 22:44:39 -0700
    Subject: Re: HDFS and libhfds
    From: Sudhir.Vallamkondu@icrossing.com
    To: common-user@hadoop.apache.org

    I second Ed's answer. Try uninstalling whatever you installed and start
    fresh. Whenever I see this error while trying to install a native bridge,
    this solution has always worked for me.


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Edward Capriolo <edlinuxguru@gmail.com>
    Date: Tue, 7 Dec 2010 17:22:03 -0500
    To: <common-user@hadoop.apache.org>
    Subject: Re: HDFS and libhfds

    2010/12/7 Petrucci Andreas <petrucci_2005@hotmail.com>:
    hello there, I'm trying to compile libhdfs but there are some
    problems. According to http://wiki.apache.org/hadoop/MountableHDFS I have
    already installed fuse. With ant compile-c++-libhdfs -Dlibhdfs=1 the build is
    successful.

    However, when I try ant package -Djava5.home=... -Dforrest.home=..., the build
    fails and the output is below:

    [exec]
    [exec] Exception in thread "main" java.lang.UnsupportedClassVersionError:
    Bad version number in .class file
    [exec] at java.lang.ClassLoader.defineClass1(Native Method)
    [exec] at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    [exec] at
    java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    [exec] at
    java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    [exec] at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    [exec] at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    [exec] at java.security.AccessController.doPrivileged(Native Method)
    [exec] at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    [exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    [exec] at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
    [exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    [exec] at
    org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(D
    efaultLogTargetFactoryManager.java:113)
    [exec] at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec] at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryMana
    ger(LogKitLoggerManager.java:436)
    [exec] at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLogger
    Manager.java:400)
    [exec] at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec] at
    org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
    [exec] at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
    [exec] at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
    [exec] at
    org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
    [exec] at
    org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
    [exec] at org.apache.cocoon.Main.main(Main.java:310)
    [exec] Java Result: 1
    [exec]
    [exec] Copying broken links file to site root.
    [exec]
    [exec]
    [exec] BUILD FAILED
    [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not
    find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
    [exec]
    [exec] Total time: 4 seconds

    BUILD FAILED
    /hadoop-0.20.2/build.xml:867: exec returned: 1


    any ideas what's wrong???
    I never saw this usage:
    -Djava5.home
    Try
    export JAVA_HOME=/usr/java

    " Bad version number in .class file " means you are mixing and
    matching java versions somehow.

  • Konstantin Boudnik at Dec 8, 2010 at 7:06 am
    Feel free to update https://issues.apache.org/jira/browse/HDFS-1519 if
    you find it suitable.


    2010/12/7 Petrucci Andreas <petrucci_2005@hotmail.com>:
    thanks for the replies, this solved my problems

    http://mail-archives.apache.org/mod_mbox/hadoop-common-user/200909.mbox/%3C6F5C1D715B2DA5498A628E6B9C124F0401452213AA@hasmsx504.ger.corp.intel.com%3E

    ...I think I should write a post on my blog about this night with hdfs, libhdfs and fuse...
    Date: Tue, 7 Dec 2010 22:44:39 -0700
    Subject: Re: HDFS and libhfds
    From: Sudhir.Vallamkondu@icrossing.com
    To: common-user@hadoop.apache.org

    I second Ed's answer. Try uninstalling whatever you installed and start
    fresh. Whenever I see this error while trying to install a native bridge,
    this solution has always worked for me.


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Edward Capriolo <edlinuxguru@gmail.com>
    Date: Tue, 7 Dec 2010 17:22:03 -0500
    To: <common-user@hadoop.apache.org>
    Subject: Re: HDFS and libhfds

    2010/12/7 Petrucci Andreas <petrucci_2005@hotmail.com>:
    hello there, I'm trying to compile libhdfs but there are some
    problems. According to http://wiki.apache.org/hadoop/MountableHDFS I have
    already installed fuse. With ant compile-c++-libhdfs -Dlibhdfs=1 the build is
    successful.

    However, when I try ant package -Djava5.home=... -Dforrest.home=..., the build
    fails and the output is below:

    [exec]
    [exec] Exception in thread "main" java.lang.UnsupportedClassVersionError:
    Bad version number in .class file
    [exec]     at java.lang.ClassLoader.defineClass1(Native Method)
    [exec]     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    [exec]     at
    java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    [exec]     at
    java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    [exec]     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    [exec]     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    [exec]     at java.security.AccessController.doPrivileged(Native Method)
    [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    [exec]     at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
    [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    [exec]     at
    org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(D
    efaultLogTargetFactoryManager.java:113)
    [exec]     at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec]     at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryMana
    ger(LogKitLoggerManager.java:436)
    [exec]     at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLogger
    Manager.java:400)
    [exec]     at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.j
    ava:201)
    [exec]     at
    org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
    [exec]     at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
    [exec]     at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
    [exec]     at
    org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
    [exec]     at
    org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
    [exec]     at org.apache.cocoon.Main.main(Main.java:310)
    [exec] Java Result: 1
    [exec]
    [exec]   Copying broken links file to site root.
    [exec]
    [exec]
    [exec] BUILD FAILED
    [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not
    find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
    [exec]
    [exec] Total time: 4 seconds

    BUILD FAILED
    /hadoop-0.20.2/build.xml:867: exec returned: 1


    any ideas what's wrong???
    I never saw this usage:
    -Djava5.home
    Try
    export JAVA_HOME=/usr/java

    " Bad version number in .class file " means you are mixing and
    matching java versions somehow.

  • Sudhir Vallamkondu at Dec 8, 2010 at 4:05 am
    Try this and see if it works.

    Open the build.xml file and add an env JAVA_HOME entry to the
    compile-core-native target. After adding it, the section should look like
    this:

    <exec dir="${build.native}" executable="sh" failonerror="true">
      <env key="OS_NAME" value="${os.name}"/>
      <env key="JAVA_HOME" value="/usr/java"/>
      <env key="OS_ARCH" value="${os.arch}"/>
      <env key="JVM_DATA_MODEL" value="${sun.arch.data.model}"/>
      <env key="HADOOP_NATIVE_SRCDIR" value="${native.src.dir}"/>
      <arg line="${native.src.dir}/configure"/>
    </exec>


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Petrucci Andreas <petrucci_2005@hotmail.com>
    Date: Wed, 8 Dec 2010 02:06:26 +0200
    To: <common-user@hadoop.apache.org>
    Subject: RE: HDFS and libhfds


    Yes, my JAVA_HOME is properly set. However, with the hadoop 0.20.2 that I'm
    using, when I run ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1 from
    HADOOP_HOME, the tail of the output is the following:

    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsUtime':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1490:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1497:
    error: expected ';' before 'jPath'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1498:
    error: 'jPath' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503:
    error: 'jlong' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503:
    error: expected ';' before 'jmtime'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1504:
    error: expected ';' before 'jatime'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510:
    error: 'jmtime' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510:
    error: 'jatime' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsGetHosts':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1535:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1542:
    error: expected ';' before 'jPath'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1543:
    error: 'jPath' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547:
    error: 'jvalue' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547:
    error: expected ';' before 'jFSVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548:
    error: expected ';' before 'jFSExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549:
    error: 'jFSVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549:
    error: 'jFSExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1559:
    error: expected ';' before 'jFileStatus'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1563:
    error: 'jobjectArray' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1563:
    error: expected ';' before 'jBlockLocations'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1564:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1565:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1566:
    error: 'jVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1566:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1570:
    error: 'jFileStatus' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1577:
    error: 'jBlockLocations' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1581:
    error: 'jsize' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1581:
    error: expected ';' before 'jNumFileBlocks'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1583:
    error: 'jNumFileBlocks' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1597:
    error: expected ';' before 'jFileBlock'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1600:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1601:
    error: expected ';' before 'jFileBlockHosts'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1602:
    error: 'jFileBlock' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1613:
    error: 'jFileBlockHosts' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1616:
    error: expected ';' before 'jNumBlockHosts'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1617:
    error: 'jNumBlockHosts' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1633:
    error: 'jstring' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1633:
    error: expected ';' before 'jHost'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1637:
    error: 'jHost' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1638:
    warning: implicit declaration of function 'strdup'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1638:
    warning: incompatible implicit declaration of built-in function 'strdup'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsGetDefaultBlockSize':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1677:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1677:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1679:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1683:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1683:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1687:
    error: 'jvalue' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1687:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1688:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1688:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689:
    error: 'jVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsGetCapacity':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1708:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1708:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1710:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1714:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1714:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1716:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1717:
    warning: implicit declaration of function 'globalClassReference'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1724:
    error: 'jvalue' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1724:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1725:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1725:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1726:
    error: 'jVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1726:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsGetUsed':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1744:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1744:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1746:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1750:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1750:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1752:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1760:
    error: 'jvalue' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1760:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1761:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1761:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1762:
    error: 'jVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1762:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: At
    top level:
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1775:
    error: expected ')' before '*' token
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1909:
    error: expected ')' before '*' token
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsListDirectory':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1962:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1962:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1964:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1968:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1968:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1971:
    error: expected ';' before 'jPath'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1972:
    error: 'jPath' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978:
    error: 'jobjectArray' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978:
    error: expected ';' before 'jPathList'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979:
    error: 'jvalue' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979:
    error: expected ';' before 'jVal'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980:
    error: 'jthrowable' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980:
    error: expected ';' before 'jExc'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981:
    error: 'jVal' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981:
    error: 'jExc' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1989:
    error: 'jPathList' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992:
    error: 'jsize' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992:
    error: expected ';' before 'jPathListSize'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1993:
    error: 'jPathListSize' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2007:
    error: expected ';' before 'i'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2008:
    error: expected ';' before 'tmpStat'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2009:
    error: 'i' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2010:
    error: 'tmpStat' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2011:
    warning: implicit declaration of function 'getFileInfoFromStat'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsGetPathInfo':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041:
    error: 'JNIEnv' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041:
    error: 'env' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2043:
    error: 'errno' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047:
    error: 'jobject' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047:
    error: expected ';' before 'jFS'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2050:
    error: expected ';' before 'jPath'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2051:
    error: 'jPath' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056:
    warning: implicit declaration of function 'getFileInfo'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056:
    error: 'jFS' undeclared (first use in this function)
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In
    function 'hdfsFreeFileInfo':
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2080:
    error: 'hdfsFileInfo' has no member named 'mOwner'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2081:
    error: 'hdfsFileInfo' has no member named 'mOwner'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2083:
    error: 'hdfsFileInfo' has no member named 'mGroup'
    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2084:
    error: 'hdfsFileInfo' has no member named 'mGroup'
    [exec] make: *** [hdfs.lo] Error 1

    BUILD FAILED
    /home/hy59045/sfakiana/hadoop-0.20.2/build.xml:1478: exec returned: 2


    any ideas???
    thanks in advance
    From: cos@apache.org
    Date: Tue, 7 Dec 2010 14:29:03 -0800
    Subject: Re: HDFS and libhfds
    To: common-user@hadoop.apache.org

    It seems that you're trying to run ant with java5. Make sure your
    JAVA_HOME is set properly.
    --
    Take care,
    Konstantin (Cos) Boudnik



    2010/12/7 Petrucci Andreas <petrucci_2005@hotmail.com>:
    hello there, I'm trying to compile libhdfs but there are some
    problems. According to http://wiki.apache.org/hadoop/MountableHDFS I have
    already installed fuse. With ant compile-c++-libhdfs -Dlibhdfs=1 the build
    is successful.

    However, when I try ant package -Djava5.home=... -Dforrest.home=..., the build
    fails and the output is below:

    [exec]
    [exec] Exception in thread "main"
    java.lang.UnsupportedClassVersionError: Bad version number in .class file
    [exec] at java.lang.ClassLoader.defineClass1(Native Method)
    [exec] at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    [exec] at
    java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    [exec] at
    java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    [exec] at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    [exec] at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    [exec] at java.security.AccessController.doPrivileged(Native Method)
    [exec] at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    [exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    [exec] at
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
    [exec] at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    [exec] at
    org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(
    DefaultLogTargetFactoryManager.java:113)
    [exec] at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.
    java:201)
    [exec] at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryMan
    ager(LogKitLoggerManager.java:436)
    [exec] at
    org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLogge
    rManager.java:400)
    [exec] at
    org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.
    java:201)
    [exec] at
    org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
    [exec] at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
    [exec] at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
    [exec] at
    org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
    [exec] at
    org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
    [exec] at org.apache.cocoon.Main.main(Main.java:310)
    [exec] Java Result: 1
    [exec]
    [exec] Copying broken links file to site root.
    [exec]
    [exec]
    [exec] BUILD FAILED
    [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not
    find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
    [exec]
    [exec] Total time: 4 seconds

    BUILD FAILED
    /hadoop-0.20.2/build.xml:867: exec returned: 1


    any ideas what's wrong???

  • Sudhir Vallamkondu at Dec 8, 2010 at 5:44 am
    There is a proper decommissioning process to remove dead nodes. See the FAQ
    link here:
    http://wiki.apache.org/hadoop/FAQ#I_want_to_make_a_large_cluster_smaller_by_
    taking_out_a_bunch_of_nodes_simultaneously._How_can_this_be_done.3F

    In fact, $HADOOP_HOME/conf/slaves is not used by the namenode to keep
    track of datanodes/tasktrackers. It is merely used by the start/stop hadoop
    scripts to know on which nodes to start the datanode/tasktracker services.
    Similarly, there is often confusion about the $HADOOP_HOME/conf/masters
    file. That file contains the details of the machine where the secondary
    namenode is running, not the namenode/jobtracker.

    With regards to not all java/hadoop processes getting killed, this may be
    happening because hadoop is losing track of its pid files. By default the
    pid files are created in the /tmp directory. If these pid files get deleted,
    the stop/start scripts cannot detect the running hadoop processes. I suggest
    changing the location of the pid files to a persistent location like
    /var/hadoop/. The $HADOOP_HOME/conf/hadoop-env.sh file has details on
    configuring the pid file location.

    - Sudhir


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Tali K <ncherryus@hotmail.com>
    Date: Tue, 7 Dec 2010 10:40:16 -0800
    To: <core-user@hadoop.apache.org>
    Subject: Help: 1) Hadoop processes still are running after we stopped
    hadoop.2) How to exclude a dead node?


    1) When I stopped hadoop, we checked all the nodes and found that 2 or 3
    java/hadoop processes were still running on each node. So we went to each
    node and did a 'killall java' - in some cases I had to do 'killall -9 java'.
    My question: why is this happening, and what would you recommend to make sure
    that no hadoop processes are left running after I stop hadoop with
    stop-all.sh?

    2) Also we have a dead node. We removed this node from
    $HADOOP_HOME/conf/slaves. This file is supposed to tell the namenode
    which machines are supposed to be datanodes/tasktrackers.
    We started hadoop again, and were surprised to see the dead node in the hadoop
    'report' ("$HADOOP_HOME/bin/hadoop dfsadmin -report|less").
    Only after blocking the dead node and restarting hadoop did it stop showing up
    in the report.
    Any recommendations on how to deal with dead nodes?

  • Li ping at Dec 8, 2010 at 6:18 am
    I am not sure I have fully understood your post.
    You mean conf/slaves is only used by the stop/start scripts to start or stop
    the datanode/tasktracker?
    And conf/masters only contains the information about the secondary
    namenode?

    Thanks
    On Wed, Dec 8, 2010 at 1:44 PM, Sudhir Vallamkondu wrote:

    There is a proper decommissioning process to remove dead nodes. See the FAQ
    link here:

    http://wiki.apache.org/hadoop/FAQ#I_want_to_make_a_large_cluster_smaller_by_
    taking_out_a_bunch_of_nodes_simultaneously._How_can_this_be_done.3F

    In fact, $HADOOP_HOME/conf/slaves is not used by the namenode to keep
    track of datanodes/tasktrackers. It is merely used by the start/stop hadoop
    scripts to know on which nodes to start the datanode/tasktracker services.
    Similarly, there is often confusion about the $HADOOP_HOME/conf/masters
    file. That file contains the details of the machine where the secondary
    namenode is running, not the namenode/jobtracker.

    With regards to not all java/hadoop processes getting killed, this may be
    happening because hadoop is losing track of its pid files. By default the
    pid files are created in the /tmp directory. If these pid files get deleted,
    the stop/start scripts cannot detect the running hadoop processes. I suggest
    changing the location of the pid files to a persistent location like
    /var/hadoop/. The $HADOOP_HOME/conf/hadoop-env.sh file has details on
    configuring the pid file location.

    - Sudhir


    On 12/7/10 5:07 PM, "common-user-digest-help@hadoop.apache.org"
    wrote:
    From: Tali K <ncherryus@hotmail.com>
    Date: Tue, 7 Dec 2010 10:40:16 -0800
    To: <core-user@hadoop.apache.org>
    Subject: Help: 1) Hadoop processes still are running after we stopped
    hadoop.2) How to exclude a dead node?


    1) When I stopped hadoop, we checked all the nodes and found that 2 or 3
    java/hadoop processes were still running on each node. So we went to each
    node and did a 'killall java' - in some cases I had to do 'killall -9 java'.
    My question: why is this happening, and what would you recommend to make sure
    that no hadoop processes are left running after I stop hadoop with
    stop-all.sh?

    2) Also we have a dead node. We removed this node from
    $HADOOP_HOME/conf/slaves. This file is supposed to tell the namenode
    which machines are supposed to be datanodes/tasktrackers.
    We started hadoop again, and were surprised to see the dead node in the hadoop
    'report' ("$HADOOP_HOME/bin/hadoop dfsadmin -report|less").
    Only after blocking the dead node and restarting hadoop did it stop showing up
    in the report.
    Any recommendations on how to deal with dead nodes?



    --
    -----李平

Discussion Overview
group: common-user
categories: hadoop
posted: Dec 8, '10 at 3:55a
active: Dec 8, '10 at 7:06a
posts: 8
users: 4
website: hadoop.apache.org...
irc: #hadoop
