Hi -

Is there a way I can start HDFS (the namenode) from a Java main and run unit tests against that? I need to integrate my Java/HDFS program into unit tests, and the unit test machine might not have Hadoop installed. I’m currently running the unit tests by hand with hadoop jar ... My unit tests create a bunch of (small) files in HDFS and manipulate them. I use the fs API for that. I don’t have map/reduce jobs (yet!).

Thanks!

Frank


  • GOEKE, MATTHEW (AG/1000) at Aug 26, 2011 at 7:00 pm
    It depends on the scope at which you want your unit tests to operate. If you are set on having tests that go as deep as possible, look into a class called MiniMRCluster; but you can still cover quite a bit with MRUnit and JUnit 4/Mockito.
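    For the MRUnit route, a minimal sketch might look like the following. The mapper and test class names (WordMapper, WordMapperTest) are illustrative, and the MapDriver shown assumes MRUnit's new-API package (org.apache.hadoop.mrunit.mapreduce); adjust to your Hadoop/MRUnit versions:

    ```java
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Test;

    public class WordMapperTest {

      // A trivial word-count mapper, used only for illustration.
      static class WordMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws java.io.IOException, InterruptedException {
          for (String w : value.toString().split("\\s+")) {
            ctx.write(new Text(w), ONE);
          }
        }
      }

      @Test
      public void mapsOneLine() throws Exception {
        // MRUnit runs the mapper in-process: no daemons, no cluster.
        new MapDriver<LongWritable, Text, Text, IntWritable>()
            .withMapper(new WordMapper())
            .withInput(new LongWritable(0), new Text("hello hello"))
            .withOutput(new Text("hello"), new IntWritable(1))
            .withOutput(new Text("hello"), new IntWritable(1))
            .runTest();
      }
    }
    ```

    MiniMRCluster, by contrast, spins up real in-JVM daemons and is much slower, so it is worth reaching for only when you need to exercise the full job-submission path.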

    Matt

    -----Original Message-----
    From: Frank Astier
    Sent: Friday, August 26, 2011 1:30 PM
    To: common-user@hadoop.apache.org
    Subject: Hadoop in process?

  • Sonal Goyal at Aug 26, 2011 at 7:07 pm
    Hi Frank,

    You can use the ClusterMapReduceTestCase class from org.apache.hadoop.mapred.

    Here is an example of adapting it to JUnit 4 and running a test DFS and
    MapReduce cluster:

    https://github.com/sonalgoyal/hiho/blob/master/test/co/nubetech/hiho/common/HihoTestCase.java

    And here is a blog post that discusses this in detail:
    http://nubetech.co/testing-hadoop-map-reduce-jobs
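    In outline, a test based on it looks roughly like this (the class name HdfsRoundTripTest and the paths are illustrative; in its stock JUnit 3 form the base class's setUp()/tearDown() start and stop the mini DFS and MapReduce clusters for you):

    ```java
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.ClusterMapReduceTestCase;

    public class HdfsRoundTripTest extends ClusterMapReduceTestCase {
      // setUp()/tearDown() in the base class start and stop the mini clusters.
      public void testCreateAndCheck() throws Exception {
        FileSystem fs = getFileSystem(); // HDFS of the in-process mini cluster
        Path p = new Path("input/part-0");
        fs.create(p).close();
        assertTrue(fs.exists(p));
      }
    }
    ```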

    Best Regards,
    Sonal
    Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
    Nube Technologies <http://www.nubetech.co>

    <http://in.linkedin.com/in/sonalgoyal>




  • Harsh J at Aug 29, 2011 at 4:58 am
    Frank,

    Firing up a MiniDFSCluster instance with the hadoop-test and
    hadoop-core jars available lets you run a single-JVM HDFS service.
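    A minimal sketch, assuming the 0.20-era four-argument constructor (newer releases expose a MiniDFSCluster.Builder instead); the class name MiniDfsExample and the paths are illustrative:

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.MiniDFSCluster;

    public class MiniDfsExample {
      /** Starts an in-JVM HDFS, writes a file, and reports whether it exists. */
      static boolean roundTrip() throws Exception {
        Configuration conf = new Configuration();
        // One datanode; format the throwaway local storage; default rack config.
        MiniDFSCluster cluster = new MiniDFSCluster(conf, 1, true, null);
        try {
          FileSystem fs = cluster.getFileSystem();
          Path p = new Path("/test/hello.txt");
          FSDataOutputStream out = fs.create(p);
          out.writeUTF("hello");
          out.close();
          return fs.exists(p);
        } finally {
          cluster.shutdown();
        }
      }

      public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
      }
    }
    ```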

    Also note that Hadoop works well against the local file system, which is
    accessed through the same FileSystem interface. So where possible, it's
    best to avoid running services that would slow your tests down and use
    the local FileSystem instead.
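    A sketch of that approach, using FileSystem.getLocal(conf) (the class name LocalFsExample and the temp-directory layout are illustrative):

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LocalFsExample {
      /** Same fs-style code as against HDFS, but no daemons are needed. */
      static boolean roundTrip() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.getLocal(conf);
        Path dir = new Path(System.getProperty("java.io.tmpdir"), "fs-test");
        fs.mkdirs(dir);
        Path p = new Path(dir, "data.txt");
        fs.create(p).close();
        boolean exists = fs.exists(p);
        fs.delete(dir, true); // recursive cleanup
        return exists;
      }

      public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
      }
    }
    ```

    Because the test code only talks to the FileSystem interface, the same assertions can later be pointed at a MiniDFSCluster (or a real cluster) without changes.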


    --
    Harsh J

Discussion Overview
group: common-user
categories: hadoop
posted: Aug 26, '11 at 6:34p
active: Aug 29, '11 at 4:58a
posts: 4
users: 4
website: hadoop.apache.org...
irc: #hadoop
