Getting free space percentage on DFS
Hi.

How can I get the free / used space on DFS, via Java?

What are the functions that can be used for that?

Note: I'm using a regular (non-super) user, so I need to do it in a way similar to
dfshealth.jsp, which AFAIK doesn't require any permissions.

Thanks in advance.

  • Edward Capriolo at Aug 23, 2009 at 3:00 pm

    One way you can do this is through JMX.

    http://www.jointhegrid.com/svn/hadoop-cacti-jtg/trunk/src/com/jointhegrid/hadoopjmx/NameNodeStatistics.java

    That is part of a Cacti graphing application for Hadoop that I wrote
    (http://www.jointhegrid.com/hadoop/).

    Enjoy,
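
    For reference, here is a minimal sketch of reading those figures over JMX with the
    standard javax.management client API. The host, port, MBean name and attribute names
    below are assumptions based on the 0.20-era FSNamesystemMBean
    (hadoop:service=NameNode,name=FSNamesystemState) and should be checked against the
    NameNode's actual JMX registry, e.g. with jconsole:

    import javax.management.MBeanServerConnection;
    import javax.management.ObjectName;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;

    public class DfsSpaceViaJmx {
        public static void main(String[] args) throws Exception {
            // Hypothetical host/port: assumes the NameNode JVM was started with
            // remote JMX enabled (e.g. -Dcom.sun.management.jmxremote.port=8004).
            JMXServiceURL url = new JMXServiceURL(
                    "service:jmx:rmi:///jndi/rmi://namenode-host:8004/jmxrmi");
            JMXConnector connector = JMXConnectorFactory.connect(url);
            try {
                MBeanServerConnection mbsc = connector.getMBeanServerConnection();
                // Assumed MBean/attribute names; adjust if your NameNode registers them differently.
                ObjectName fsState = new ObjectName("hadoop:service=NameNode,name=FSNamesystemState");
                long capacity = (Long) mbsc.getAttribute(fsState, "CapacityTotal");
                long used = (Long) mbsc.getAttribute(fsState, "CapacityUsed");
                long remaining = (Long) mbsc.getAttribute(fsState, "CapacityRemaining");
                System.out.printf("DFS used=%d remaining=%d capacity=%d (%.1f%% used)%n",
                        used, remaining, capacity, (100.0 * used) / capacity);
            } finally {
                connector.close();
            }
        }
    }

    Reading the MBean attributes goes through the JMX agent rather than the HDFS RPC layer,
    so it should not require HDFS superuser privileges (though the JMX port itself may
    require authentication, depending on how remote JMX was enabled).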
  • Stas Oskin at Aug 23, 2009 at 8:33 pm
    Hi.

    Thank you both for the advice - any idea if these approaches work for a
    non-super user?

    Regards.
  • Arvind Sharma at Aug 23, 2009 at 8:54 pm
    The APIs work for the user that started Hadoop. Moreover, I don't think user-level authentication for these APIs is in Hadoop yet (not sure here, though)...




  • Stas Oskin at Aug 24, 2009 at 8:27 am
  • Boris Shkolnik at Aug 25, 2009 at 11:33 pm
    For JMX you can also look at the JMXGet.java class. You can use this object to
    get the data through JMX.

    Boris
    On 8/24/09 1:27 AM, "Stas Oskin" wrote:

    I'm going to try this next - you probably meant these links?

    http://www.jointhegrid.com/svn/hadoop-cacti-jtg/trunk/src/com/jointhegrid/hadoopjmx/FSNamesystemStatus.java

    http://www.jointhegrid.com/svn/hadoop-cacti-jtg/trunk/src/com/jointhegrid/hadoopjmx/FSDatasetStatus.java
    (inherits http://www.jointhegrid.com/svn/hadoop-cacti-jtg/trunk/src/com/jointhegrid/hadoopjmx/JMXBase.java)
  • Arvind Sharma at Aug 23, 2009 at 6:56 pm
    You can try something like this:


    // Requires the HDFS client classes (org.apache.hadoop.hdfs.DistributedFileSystem
    // and the DiskStatus type returned by getDiskStatus()).
    if (_FileSystem instanceof DistributedFileSystem)
    {
        DistributedFileSystem dfs = (DistributedFileSystem) _FileSystem;
        DiskStatus ds = dfs.getDiskStatus();    // cluster-wide disk usage, as reported by the NameNode
        long capacity = ds.getCapacity();       // configured capacity, in bytes
        long used = ds.getDfsUsed();            // bytes currently used by DFS
        long remaining = ds.getRemaining();     // bytes still available to DFS
        long presentCapacity = used + remaining;

        long hdfsPercentDiskUsed = Math.round(((1.0 * used) / presentCapacity) * 100);
    }



    Arvind
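
    A minimal sketch of how the _FileSystem handle used in the snippet above might be
    obtained, assuming the Hadoop client configuration (core-site.xml) is on the
    classpath; the variable name simply mirrors the snippet:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    // Reads fs.default.name (the HDFS URI) from the configuration on the classpath,
    // so the returned FileSystem is a DistributedFileSystem when it points at HDFS.
    Configuration conf = new Configuration();
    FileSystem _FileSystem = FileSystem.get(conf);   // throws IOException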



  • Stas Oskin at Aug 24, 2009 at 8:20 am
    Hi.

    I just tested it, and I'm getting an exception saying I'm not the superuser:
    "Superuser privilege is required".

    Any idea how to bypass this?

    Thanks in advance!

Discussion Overview
group: common-user
categories: hadoop
posted: Aug 23, 2009 at 11:22 AM
active: Aug 25, 2009 at 11:33 PM
posts: 8
users: 4
website: hadoop.apache.org...
irc: #hadoop
