Sending this to general to attract urgent attention.
Both HDFS and MapReduce have not been compiling since
HADOOP-6904 and its HDFS and MR counterparts were committed.
The problem is not with that patch itself, as described below, but I think
those commits should be reverted if the Common integration build cannot be
restored promptly.

Thanks,
--Konstantin


On Fri, Jan 28, 2011 at 5:53 PM, Konstantin Shvachko wrote:
I see Hadoop-common-trunk-Commit is failing and not sending any emails.
It times out on native compilation and aborts.
Therefore changes are not being integrated, and this has now led to both
HDFS and MapReduce failing to compile.
Can somebody please take a look at this?
The last few lines of the build are below.

Thanks
--Konstantin

[javah] [Loaded /grid/0/hudson/hudson-slave/workspace/Hadoop-Common-trunk-Commit/trunk/build/classes/org/apache/hadoop/security/JniBasedUnixGroupsMapping.class]
[javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file /grid/0/hudson/hudson-slave/workspace/Hadoop-Common-trunk-Commit/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsNetgroupMapping.h]
[exec] checking for gcc... gcc
[exec] checking whether the C compiler works... yes
[exec] checking for C compiler default output file name... a.out
[exec] checking for suffix of executables...

Build timed out. Aborting
Build was aborted
[FINDBUGS] Skipping publisher since build result is ABORTED
Publishing Javadoc
Archiving artifacts
Recording test results
No test report files were found. Configuration error?

Recording fingerprints
[exec] Terminated
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
No emails were triggered.
Finished: ABORTED



  • Eli Collins at Jan 31, 2011 at 9:31 pm
    Hey Konstantin,

    The only build breakage I saw from HADOOP-6904 is MAPREDUCE-2290,
    which was fixed. Trees from trunk are compiling against each other
    for me (e.g. each installed to a local maven repo); perhaps the upstream
    maven repo hasn't been updated with the latest bits yet.

    Thanks,
    Eli
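
    For reference, a minimal sketch of the cross-build workflow described
    above, assuming the ant targets of that era (the mvn-install target and
    the resolvers property are assumptions, not confirmed in this thread):

        # Build Common and install its snapshot artifact into the local
        # ~/.m2 repository (target name assumed).
        $ cd common/trunk
        $ ant mvn-install

        # Build HDFS against the locally installed Common instead of the
        # upstream snapshot repo (resolvers property assumed).
        $ cd ../../hdfs/trunk
        $ ant -Dresolvers=internal compile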


  • Ted Dunning at Jan 31, 2011 at 9:37 pm
    There has been a problem with more than one build failing (Mahout is the one
    that I saw first) due to a change in the Maven version, which meant that the
    Clover license isn't being found properly. At least, that is the tale I
    heard from infra.

  • Konstantin Shvachko at Jan 31, 2011 at 9:57 pm
    The current HDFS and MapReduce trunks are not compiling at the moment; try
    building trunk.
    This is because the changes to the Common API introduced by HADOOP-6904
    have not been promoted to the HDFS and MR trunks.
    HDFS-1335 and MAPREDUCE-2263 depend on these changes.

    Common is not promoted to HDFS and MR because the Hadoop-Common-trunk-Commit
    build is broken. See here:
    https://hudson.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-Common-trunk-Commit/

    As far as I can see, the last successful build was on 01/19, which integrated
    HADOOP-6864.
    I think that is when the JNI changes were introduced, which Hudson has been
    unable to digest since.

    Could anybody with gcc handy please verify whether the problem is caused by
    HADOOP-6864.

    Thanks,
    --Konstantin
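
    One way to run the verification requested above, sketched under the
    assumption that the standard compile.native property triggers the JNI
    build (REV_BEFORE is a placeholder, to be looked up in the HADOOP-6864
    JIRA, not a real revision number):

        # Check out Common as of just before the suspect commit.
        $ svn checkout -r REV_BEFORE \
            http://svn.apache.org/repos/asf/hadoop/common/trunk common-before
        $ cd common-before
        $ ant -Dcompile.native=true compile

        # Repeat at trunk HEAD; if only the HEAD build hangs in the native
        # configure step, HADOOP-6864 is implicated.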

  • Todd Lipcon at Jan 31, 2011 at 11:11 pm

    On Mon, Jan 31, 2011 at 1:57 PM, Konstantin Shvachko wrote:


    Could anybody with gcc handy please verify whether the problem is caused by
    HADOOP-6864.
    I can build common trunk just fine on CentOS 5.5, including native.

    I think the issue is somehow isolated to the build machines. Anyone know
    what OS they've got? Or can I swing an account on the box where the failures
    are happening?

    -Todd

    --
    Todd Lipcon
    Software Engineer, Cloudera
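
    A minimal sketch of the local check Todd describes, assuming the standard
    compile.native knob in the ant build of that era:

        # Build common trunk including the JNI/native code.
        $ cd common/trunk
        $ ant -Dcompile.native=true clean compile
        # If autoconf hangs at "checking for suffix of executables...",
        # the machine reproduces the Hudson slave's failure mode.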
  • Jakob Homan at Jan 31, 2011 at 11:12 pm
    By manually installing a new core jar into the cache, I can compile
    trunk. Looks like we just need to kick a new Core into maven. Are
    there instructions somewhere for committers to do this? I know Nigel
    and Owen know how, but I don't know if the knowledge is diffused past
    them.
    -Jakob
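
    A hedged sketch of the manual workaround mentioned above, using Maven's
    standard install:install-file goal (the coordinates and version below
    are assumptions, not taken from the thread):

        # Put a freshly built Common jar into the local ~/.m2 cache so the
        # HDFS/MR builds resolve against it (X.Y.Z is a placeholder version).
        $ mvn install:install-file \
            -DgroupId=org.apache.hadoop \
            -DartifactId=hadoop-common \
            -Dversion=X.Y.Z-SNAPSHOT \
            -Dpackaging=jar \
            -Dfile=build/hadoop-common-X.Y.Z-SNAPSHOT.jar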



  • Giridharan Kesavan at Jan 31, 2011 at 11:34 pm
    ant mvn-deploy will publish a snapshot artifact to the Apache Maven repository, as long as you have the right credentials in ~/.m2/settings.xml.

    For a settings.xml template, please look at http://wiki.apache.org/hadoop/HowToRelease

    I'm pushing the latest common artifacts now.

    -Giri
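
    A minimal sketch of the credentials block referred to above (the server
    id is an assumption based on common Apache conventions; the authoritative
    template is on the HowToRelease wiki page):

        # ~/.m2/settings.xml (minimal sketch; server id assumed):
        #
        #   <settings>
        #     <servers>
        #       <server>
        #         <id>apache.snapshots.https</id>
        #         <username>your-asf-id</username>
        #         <password>your-asf-password</password>
        #       </server>
        #     </servers>
        #   </settings>
        #
        # With that in place, publish the snapshot:
        $ ant mvn-deploy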




  • Konstantin Shvachko at Feb 1, 2011 at 12:28 am
    Giri,
    it looks like the last run you started failed the same way as the previous ones.
    Any thoughts on what's going on?
    Thanks,
    --Konstantin


  • Giridharan Kesavan at Feb 1, 2011 at 12:41 am
    Konstantin,

    I think I need to restart the slave which is running the commit build. For now I have published the common artifact manually from the command line.

    Thanks,
    Giri

  • Konstantin Shvachko at Feb 1, 2011 at 12:58 am
    Thanks, Giri.
    --Konst


  • Konstantin Shvachko at Feb 1, 2011 at 8:41 am
    Giri,

    Looking at the configuration of Hadoop-Common-trunk-Commit,
    there seem to be errors in the Post-build Actions.
    It is complaining that
    'trunk' exists but not 'trunk/artifacts/...'.
    Is it possible that this misconfiguration is the reason for the failures?

    --Konstantin



  • Giridharan Kesavan at Feb 1, 2011 at 7:28 pm
    Konstantin,

    trunk/artifacts gets populated when the jar and tar ant targets succeed.

    The main reason for the build failures so far was the build-abort-time configuration: it was set to 30 minutes.
    I have increased the build abort time, and the builds are now running fine:
    https://hudson.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-Common-trunk-Commit


    Thanks,
    Giri
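
    For anyone wanting to confirm the setting, a hedged sketch of inspecting
    the job's abort timeout on the Hudson master (the element name assumes
    the build-timeout plugin and may differ between plugin versions):

        # Inspect the configured abort timeout for the commit build.
        $ grep -A 2 'BuildTimeoutWrapper' \
            $HUDSON_HOME/jobs/Hadoop-Common-trunk-Commit/config.xml
        #   <timeoutMinutes>30</timeoutMinutes>   <-- the old 30-minute limit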

  • Konstantin Shvachko at Feb 1, 2011 at 7:45 pm
    Giri,
    Thanks a lot for fixing this.
    I see it is working now.
    --Konstantin


  • Nigel Daley at Feb 8, 2011 at 4:33 pm
    Hmm, haven't seen Hudson post build failures to the common-dev list lately.

    Ian, can you check that hudson@hudson.apache.org is still subscribed to common-dev@? If not, please add it.

    Thx,
    Nige

  • Ian Holsman at Feb 8, 2011 at 7:31 pm
    Hi Nige.
    I've subscribed it now, as it wasn't on the subscriber list. We do have hudson@lucene.zones.apache.org.

    But I haven't seen any moderation notices for it either, so I'm not sure it is generating emails.
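
    For reference, Apache lists run on ezmlm, which takes subscription commands by mail; a moderator can subscribe a third-party address by encoding it into the command address as user=host. A rough sketch of what subscribing the Hudson account looks like (illustrative only; the confirmation request still goes to the subscribed address):

        # ezmlm command address form: <list>-subscribe-<user>=<host>@<domain>
        echo "" | mail -s "subscribe" common-dev-subscribe-hudson=hudson.apache.org@hadoop.apache.org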

  • Nigel Daley at Feb 8, 2011 at 7:36 pm
    Ah, that's the old address. I wonder how many of our lists have the old one...

    nige
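
    One way to answer that, assuming shell access to the list server and the usual ezmlm-idx setup (the lists path below is an assumption about the ASF layout):

        # Print every Hadoop list that still carries the old Hudson address.
        for d in /home/apmail/lists/hadoop.apache.org/*; do
          ezmlm-list "$d" | grep -q 'hudson@lucene.zones.apache.org' && echo "$d"
        done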

  • Ian Holsman at Feb 8, 2011 at 11:21 pm
    I'll check it out later today.
    Even if it is sending from the wrong email address, I should be able to moderate the messages through regardless. (I haven't seen any recently.)

    Can you see if Hudson is working on common-dev first?

    regards
    Ian
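
    Whether Hudson is even configured to mail the list can be checked from the job definition rather than the mailbox. A minimal sketch, assuming shell access to the master and the stock Mailer publisher (hudson.tasks.Mailer); the job path matches the links above:

        # Confirm the job still lists common-dev as a recipient.
        grep -A 3 'hudson.tasks.Mailer' \
          "$HUDSON_HOME/jobs/Hadoop-Common-trunk-Commit/config.xml"
        # Expect something like: <recipients>common-dev@hadoop.apache.org</recipients>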

  • Nigel Daley at Feb 11, 2011 at 6:50 am
    > Can you see if Hudson is working on common-dev first?
    Confirmed.

    nige

Discussion Overview
group: general@hadoop.apache.org
categories: hadoop
posted: Jan 31, '11 at 8:15p
active: Feb 11, '11 at 6:50a
posts: 18
users: 8
website: hadoop.apache.org
irc: #hadoop
