FAQ
hi, all:

As we know, Hadoop currently can't run on Windows because of the Unix shell commands used by Ant and the df utility invoked by DF.java. I think the Unix shell steps could easily be replaced with Java code or Ant script, and in Java 6 the df utility can be replaced with getUsableSpace() and getTotalSpace() of java.io.File. Between the two, you can show how much space each file system has available and has in total (see the code listed below). Maybe the DF class should be made smarter, to support different OSes and JDKs.

---------------------------
import java.io.File;

public class Space {
    public static void main(String[] args) {
        // List all filesystem roots (drive letters on Windows, "/" on Unix).
        for (File root : File.listRoots()) {
            // getUsableSpace() and getTotalSpace() were added in Java 6.
            // System.out is used instead of System.console(), which can
            // return null when no terminal is attached.
            System.out.printf("%s has %,d of %,d free%n",
                    root.getPath(), root.getUsableSpace(), root.getTotalSpace());
        }
    }
}
---------------------------
Output:
---------------------------
A:\ has 30,720 of 730,112 free
C:\ has 5,825,671,680 of 39,974,860,288 free
D:\ has 51,128,320 of 100,431,872 free
---------------------------
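The "smarter DF" idea above could be sketched as a class that probes for the Java 6 API via reflection, so the same class still loads on older JVMs and can fall back to the external df utility there. This is only an illustrative sketch, not Hadoop's actual DF class; the class name SmartDF and the unimplemented df fallback are assumptions.

```java
import java.io.File;
import java.lang.reflect.Method;

public class SmartDF {
    // Returns the usable bytes on the partition containing 'path'.
    public static long usableSpace(File path) {
        try {
            // Probe for the Java 6 method via reflection instead of
            // calling it directly, so this class also links on Java 5.
            Method m = File.class.getMethod("getUsableSpace");
            return (Long) m.invoke(path);
        } catch (NoSuchMethodException e) {
            // Pre-Java-6 JVM: this is where a fallback would shell out
            // to the df utility (parsing omitted in this sketch).
            throw new UnsupportedOperationException("df fallback not implemented");
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        for (File root : File.listRoots()) {
            System.out.printf("%s: %,d bytes usable%n",
                    root.getPath(), usableSpace(root));
        }
    }
}
```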



youlq
you.leiqing@gmail.com
2006-12-28


  • Owen O'Malley at Dec 28, 2006 at 3:49 am

    On Dec 27, 2006, at 6:48 PM, youlq wrote:

    [quoted text snipped]

    Actually, Hadoop runs fine on Windows, except when someone makes a
    mistake. *smile* Under Windows, it requires cygwin. I just updated
    the FAQ to include the platform information:
    http://wiki.apache.org/lucene-hadoop/FAQ

    Java 6 has just been released and I think it is too early to make
    Hadoop require Java 6 in order to run. Of course in the long run,
    having a pure Java implementation of all of the required
    functionality is a very good thing.

    -- Owen
  • Konstantin Shvachko at Dec 28, 2006 at 10:54 pm
    DF is the only Windows incompatibility if you can run your cluster
    without the scripts.
    There is a version of the DF.java source file that is compatible with
    Windows:
    http://issues.apache.org/jira/secure/attachment/12337494/DF.java
    Just replace your DF.java with this and recompile, and you don't need
    to run Cygwin.
    See also HADOOP-33.

    It is good to know that in Java 6 the df problem is solved.

    --Konstantin

    Owen O'Malley wrote:
    [quoted text snipped]

Discussion Overview
group: common-dev
categories: hadoop
posted: Dec 28, '06 at 2:48a
active: Dec 28, '06 at 10:54p
posts: 3
users: 3
website: hadoop.apache.org...
irc: #hadoop
