FAQ
Hi all!

I have downloaded hadoop-0.21. I am behind my college proxy.
Installed:
ivy version: 2.1.0~rc2-3ubuntu1
Ant version: 1.7.1-4ubuntu1.1

I get the following error while building mumak:

$ cd /home/arun/Documents/hadoop-0.21.0/mapred
$ ant package
Buildfile: build.xml

clover.setup:

clover.info:
[echo]
[echo] Clover not found. Code coverage reports disabled.
[echo]

clover:

ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To: /home/arun/Documents/hadoop-0.21.0/mapred/ivy/ivy-2.1.0.jar
[get] Error getting http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar to /home/arun/Documents/hadoop-0.21.0/mapred/ivy/ivy-2.1.0.jar
The link http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
is not accessible from a browser either.

Any help?

Thanks,
Arun K


  • Harsh J at Jul 29, 2011 at 4:25 pm
    So set a proxy?

    http://ant.apache.org/manual/proxy.html
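    For example, Ant 1.7 and later can pick up the system proxy
    configuration via the -autoproxy switch; a minimal sketch, assuming
    the proxy is already configured at the OS/desktop level:

        # let the build JVM use the system proxy settings (Ant >= 1.7)
        ant -autoproxy package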

    --
    Harsh J
  • Arun k at Jul 30, 2011 at 5:31 am
    Hi all!
    I have added the following code to build.xml and tried to build with
    "ant package". I have also tried to remove the entire ivy2 cache
    (~/.ivy2/*) and rebuild, but couldn't succeed.

    <setproxy proxyhost="192.168.0.90" proxyport="8080"
              proxyuser="ranam" proxypassword="passwd"
              nonproxyhosts="xyz.svn.com"/>

    I get the error UNRESOLVED DEPENDENCIES.
    I have attached the log file.
    Any idea?

    Thanks,
    Arun K

  • Harsh J at Jul 30, 2011 at 7:35 am
    Arun,

    Looks like the 0.21.0 pieces are still not present in the online Maven
    repositories. There was a fix slated for 0.21.1, but that release
    didn't get cut yet (perhaps abandoned).

    If you're looking to get into development, I'd suggest either using
    the stable 0.20 branch or the trunk, depending on what you're going to
    be working on.

    If you still want to persist with 0.21, try to get it directly off the
    Apache SVN 0.21 branch of each component (common, mapreduce, hdfs),
    e.g. as sketched below.
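    A minimal sketch of those checkouts; the branch URLs below reflect the
    split-repository layout of that era and are assumptions worth
    verifying against svn.apache.org:

        # check out the 0.21 branch of each component (paths assumed)
        svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.21/ common-0.21
        svn checkout http://svn.apache.org/repos/asf/hadoop/hdfs/branches/branch-0.21/ hdfs-0.21
        svn checkout http://svn.apache.org/repos/asf/hadoop/mapreduce/branches/branch-0.21/ mapreduce-0.21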

    --
    Harsh J
  • Steve Loughran at Aug 2, 2011 at 10:30 am

    The artifact is there, so it's a proxy problem.

    export ANT_OPTS="-Dhttp.proxyHost=proxy -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy -Dhttps.proxyPort=8080"

    These don't set Ant properties, they set JVM options, and they do work
    for Hadoop builds.
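    A quick usage sketch, plugging in the host and port from the earlier
    <setproxy> snippet (those values are assumptions for this particular
    network):

        # point the build JVM at the proxy, then rebuild
        export ANT_OPTS="-Dhttp.proxyHost=192.168.0.90 -Dhttp.proxyPort=8080 -Dhttps.proxyHost=192.168.0.90 -Dhttps.proxyPort=8080"
        ant package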
  • Dhodapkar, Chinmay at Sep 9, 2011 at 9:55 pm
    Hello,
    I have a setup where a bunch of clients store 'events' in an HBase table. Also, periodically (once a day), I run a mapreduce job that goes over the table and computes some reports.

    Now my issue is that the next time around I don't want the mapreduce job to process the 'events' that it has already processed previously. I know that I can mark processed events in the HBase table and the mapper can filter them out during the next run. But what I would really like is for previously processed events to not even hit the mapper.

    One solution I can think of is to back up the HBase table after running the job and then clear the table. But this has a lot of problems:
    1) Clients may have inserted events while the job was running.
    2) I could disable and drop the table and then create it again...but then the clients would complain about this short window of unavailability.

    What do people using HBase (live) + mapreduce typically do?

    Thanks!
    Chinmay
  • Eugene Kirpichov at Sep 10, 2011 at 9:24 am
    I believe HBase has some kind of TTL (timeout-based expiry) for
    records and it can clean them up on its own.
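    A minimal sketch of setting a TTL from the HBase shell; the table and
    column-family names are hypothetical, and older HBase releases require
    the table to be disabled before altering it:

        disable 'events'
        # expire cells automatically after 7 days (TTL is in seconds)
        alter 'events', {NAME => 'cf', TTL => 604800}
        enable 'events'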


    --
    Eugene Kirpichov
    Principal Engineer, Mirantis Inc. http://www.mirantis.com/
    Editor, http://fprog.ru/
  • Sonal Goyal at Sep 10, 2011 at 5:17 pm
    Chinmay, how are you configuring your job? Have you tried using
    setScan to select only the rows you care to run MR over? See the
    sketch below, and

    http://ofps.oreilly.com/titles/9781449396107/mapreduce.html
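    A minimal Java sketch of that idea, restricting the scan to cells
    written since the previous run; lastRunTs, the "events" table name,
    and EventMapper are hypothetical, and it assumes event insertion time
    is the HBase cell timestamp:

        import org.apache.hadoop.hbase.client.Result;
        import org.apache.hadoop.hbase.client.Scan;
        import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
        import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;

        // scan only cells written after the previous run
        // (lastRunTs is tracked externally, e.g. in a marker file)
        Scan scan = new Scan();
        scan.setTimeRange(lastRunTs, System.currentTimeMillis());
        scan.setCaching(500);          // fetch rows in batches
        scan.setCacheBlocks(false);    // keep MR scans out of the block cache
        TableMapReduceUtil.initTableMapperJob(
            "events",                  // hypothetical table name
            scan,
            EventMapper.class,         // hypothetical TableMapper subclass
            ImmutableBytesWritable.class,
            Result.class,
            job);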

    As a shameless plug - For your reports, see if you want to leverage Crux:
    https://github.com/sonalgoyal/crux

    Best Regards,
    Sonal
    Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
    Nube Technologies <http://www.nubetech.co>

    <http://in.linkedin.com/in/sonalgoyal>




  • Giridharan Kesavan at Jul 29, 2011 at 7:13 pm
    If you already have an ivy2 cache with all the artifacts, you can
    manually download/copy the ivy jar, place it in the ivy folder, and do
    an ant build with -Doffline=true, e.g.:
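    A minimal sketch of that offline build; the source location of the ivy
    jar is hypothetical, but the destination path matches the one in the
    build log above:

        # place a previously downloaded ivy jar where the build expects it
        cp /media/usb/ivy-2.1.0.jar /home/arun/Documents/hadoop-0.21.0/mapred/ivy/
        # resolve from the local ~/.ivy2 cache without hitting the network
        ant -Doffline=true package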

    -Giri
