How to get another hdfs's file directory list and meta data? (through api)
Dear all,

I want to know if there is any class or way to access the file list and
metadata of a remote HDFS namenode.
For example, there are two Hadoop instances, which means two namenodes (nn1
and nn2).
If I am a superuser on both Hadoop instances, and I am currently on nn1 and
want to get nn2's file list and metadata, is there any way to do that?

For now I can only try the most traditional way, which is
simon@nn1:~$ ssh nn2 "/hadoop/bin/hadoop fs -lsr "
and then parse the result to get each file's metadata.
Is there any class or API I could use instead?

Ideally I would like to build my own jar on nn1 so that, through bin/hadoop jar remote.jar
remote
I can get information about a given directory on the remote HDFS.

Thanks a lot.

Best Regards,
Simon


  • Harsh J at Jan 11, 2011 at 7:43 pm
    You can create multiple FileSystem objects for different URIs and use
    them to query specific NNs in code.
    From the command line, a simple trick like `user@nn1 $ hadoop dfs -ls
    hdfs://nn2/dir` should work (i.e. pass the entire URI of the path you're
    looking for).

    See this exact method for the code question:
    http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileSystem.html#get(java.net.URI, org.apache.hadoop.conf.Configuration, java.lang.String)
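
    A minimal sketch of how that method could be used to list a remote
    directory (hdfs://nn2:9000, /user/simon and the user name here are only
    placeholders; use whatever nn2's fs.default.name actually points to):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RemoteLs {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // hdfs://nn2:9000 and "simon" are placeholders; point this at the
        // remote namenode's actual URI and the user you want to act as.
        FileSystem remoteFs = FileSystem.get(new URI("hdfs://nn2:9000"), conf, "simon");
        for (FileStatus status : remoteFs.listStatus(new Path("/user/simon"))) {
          // Print each entry's path, length and owner as reported by the remote NN.
          System.out.println(status.getPath() + "\t" + status.getLen() + "\t" + status.getOwner());
        }
        remoteFs.close();
      }
    }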


    --
    Harsh J
    www.harshj.com
  • Simon at Jan 11, 2011 at 8:04 pm
    Dear Harsh,

    Thanks for your prompt reply.
    I just tried the command-line way, but it doesn't seem to work:
    simon@nn1 $ hadoop dfs -ls "hdfs://nn2:/user/simon"
    ls: For input string: ""
    Usage: java FsShell [-ls <path>]
    Even if I remove the quotes, the error is the same.

    I will try the multiple FileSystem objects way later.

    Anyway, a million thank-yous, Harsh.


    Best Regards,
    瑞興


  • Gushengchang at Jan 12, 2011 at 1:45 am
    When I use the command, it's OK. Like this:
    ~/hadoop/bin/hadoop dfs -ls hdfs://192.168.68.101:9000/user/hadoop/gusc

    Make sure the path is correct.

    Good luck, 瑞興.


    2011-01-12
    gushengchang

  • Simon at Jan 12, 2011 at 2:18 am
    Dear Chang,

    Thank you so much!!
    I just figured out that I had missed the port that I changed in my conf,
    so I need to key in the entire URI including the port.
    It works; thanks for your testing.


    Best Regards,
    Simon


  • Harsh J at Jan 12, 2011 at 4:17 am
    Hello,

    2011/1/12 simon <randyhaha@gmail.com>:
    simon@nn1 $ hadoop dfs -ls "hdfs://nn2:/user/simon"
    You're forgetting the port number after nn2 perhaps?
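    With the port included, the same listing should work, for example (9000
    here is only a guess; use whatever port nn2's fs.default.name is
    configured with):
    simon@nn1 $ hadoop dfs -ls hdfs://nn2:9000/user/simon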

    --
    Harsh J
    www.harshj.com
  • Simon at Jan 12, 2011 at 4:42 am
    Dear Harsh,

    Yeah, you're right, I forgot to put in my port number.
    It works now. Thank you so much, Harsh.


    Best Regards,
    Simon Hsu


