Is there a good way to see how full hdfs is
We have a small cluster with HDFS running on only 8 nodes. I believe that
the partition assigned to HDFS might be getting full, and I wonder whether
the web tools or Java API have a way to look at free space on HDFS.

--
Steven M. Lewis PhD
4221 105th Ave NE
Kirkland, WA 98033
206-384-1340 (cell)
Skype lordjoe_com


  • Harsh J at Oct 15, 2011 at 3:45 am
    Steve,

    The http://host:50070/dfsnodelist.jsp?whatNodes=LIVE and
    http://host:50070/dfshealth.jsp pages have per-DN disk usage and
    overall disk usage summaries for your HDFS.

    If you want a drill-down at the per-partition level within a single
    DataNode, I do not think there's an interface or an API for it
    yet. Your best bet for that would be to use the coreutils "df" manually
    on each node for now.
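    If shelling out to df is inconvenient, the same per-partition numbers can
    be read with plain java.io.File; a minimal local sketch (not an HDFS API;
    the class name and the idea of passing your dfs.data.dir entries as
    arguments are only illustrative), to be run on each DataNode host:

    import java.io.File;

    /**
     * Prints total and usable space of the partition backing each local
     * directory passed on the command line (e.g. your dfs.data.dir entries).
     */
    public class LocalDataDirUsage {
      public static void main(String[] args) {
        for (String path : args) {
          File dir = new File(path);
          System.out.println(path
              + " total=" + dir.getTotalSpace()
              + " free=" + dir.getUsableSpace());
        }
      }
    }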

    P.S. Moved this discussion to the hdfs-user@ list, as it's not related to
    MR. Bcc'd mapreduce-user@.

    --
    Harsh J
  • Wd at Oct 15, 2011 at 10:46 am
    hadoop dfsadmin -report

  • Uma Maheswara Rao G 72686 at Oct 15, 2011 at 11:52 am
    /** Return the disk usage of the filesystem, including total capacity,
     * used space, and remaining space */
    public DiskStatus getDiskStatus() throws IOException {
      return dfs.getDiskStatus();
    }

    DistributedFileSystem has the above API on the Java side.
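
    A minimal client sketch that calls it (assuming Hadoop 0.20/1.x, where
    getDiskStatus() returns DFSClient.DiskStatus; later releases expose the
    same totals through FileSystem#getStatus()):

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DFSClient.DiskStatus;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    /** Prints overall HDFS capacity, used and remaining space in bytes. */
    public class HdfsUsage {
      public static void main(String[] args) throws IOException {
        // Picks up fs.default.name from the core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        if (!(fs instanceof DistributedFileSystem)) {
          System.err.println("Default filesystem is not HDFS: " + fs.getUri());
          return;
        }
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        DiskStatus ds = dfs.getDiskStatus();
        System.out.println("Capacity : " + ds.getCapacity());
        System.out.println("DFS Used : " + ds.getDfsUsed());
        System.out.println("Remaining: " + ds.getRemaining());
      }
    }

    Run it with the Hadoop jars and your cluster's conf directory on the
    classpath so it can find the NameNode.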

    Regards,
    Uma


  • Uma Maheswara Rao G 72686 at Oct 17, 2011 at 3:55 pm
    We can write a simple program and call this API.

    Make sure the Hadoop jars are present in your classpath.
    Just for more clarification: DataNodes send their stats to the NameNode as part of their heartbeats, so the NameNode maintains the statistics about disk space usage for the complete filesystem. This API will give you those stats.
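
    If you want the per-DataNode breakdown as well (instead of collecting it
    with a MapReduce job), a hedged sketch using
    DistributedFileSystem#getDataNodeStats(), which returns one DatanodeInfo
    per node carrying the capacity, DFS used and remaining space the NameNode
    has aggregated from those heartbeats (class names as in Hadoop 0.20/1.x):

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;
    import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

    /** Prints capacity, DFS used and remaining space for every DataNode. */
    public class PerDataNodeUsage {
      public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        DistributedFileSystem dfs =
            (DistributedFileSystem) FileSystem.get(conf);
        for (DatanodeInfo node : dfs.getDataNodeStats()) {
          System.out.println(node.getHostName()
              + " capacity=" + node.getCapacity()
              + " used=" + node.getDfsUsed()
              + " remaining=" + node.getRemaining());
        }
      }
    }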

    Regards,
    Uma

    ----- Original Message -----
    From: Ivan.Novick@emc.com
    Date: Monday, October 17, 2011 9:07 pm
    Subject: Re: Is there a good way to see how full hdfs is
    To: common-user@hadoop.apache.org, mapreduce-user@hadoop.apache.org
    Cc: common-dev@hadoop.apache.org
    So is there a client program to call this?

    Can one write their own simple client to call this method from all
    disks on the cluster?

    How about a map reduce job to collect from all disks on the cluster?

