Hello, I'm trying to compile HDFS/FUSE for mountable HDFS.


When I go through the documented process @
http://wiki.apache.org/hadoop/MountableHDFS I get:

BUILD FAILED
/home/hadoop/hadoop-0.20.203.0/build.xml:614: The following error occurred
while executing this line:
/home/hadoop/hadoop-0.20.203.0/src/contrib/build.xml:30: The following error
occurred while executing this line:
/home/hadoop/hadoop-0.20.203.0/src/contrib/fuse-dfs/build.xml:37: libhdfs.so
does not exist: /home/hadoop/hadoop-0.20.203.0/build/libhdfs/libhdfs.so.
Please check flags -Dlibhdfs=1 -Dfusedfs=1 are set or first try ant
compile-libhdfs -Dlibhdfs=1

The .so files seem to be elsewhere, and there is no
$HADOOP_HOME/build/libhdfs directory:

~/hadoop/build$ find . -name '*hdfs*so' -print
./c++-build/Linux-amd64-64/libhdfs/.libs/libhdfs.so
./c++/Linux-amd64-64/lib/libhdfs.so

So I created a little script:

$ cat build_hdfs_fuse.sh
#!/bin/bash
set -e  # stop at the first failed step

ant compile-c++-libhdfs -Dlibhdfs=1 -Dislibhdfs=1
ant package
cd build
# point build/libhdfs at the directory the .so actually lands in
if [ ! -e libhdfs ] ; then
    ln -s c++/Linux-amd64-64/lib libhdfs
fi
cd ..
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1


This gets further, but it still errors during compilation:

[exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\"
-DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\"
-DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1
-DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1
-DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1
-DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t
-I. -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/home/hadoop/jdk1.6.0_25/include
-I/home/hadoop/hadoop-0.20.203.0/src/c++/libhdfs/
-I/home/hadoop/jdk1.6.0_25/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\"
-DPROTECTED_PATHS=\"\" -I/include -Wall -O3 -MT fuse_connect.o -MD -MP -MF
.deps/fuse_connect.Tpo -c -o fuse_connect.o fuse_connect.c
[exec] make[1]: Leaving directory
`/home/hadoop/hadoop-0.20.203.0/src/contrib/fuse-dfs/src'
[exec] fuse_connect.c: In function ‘doConnectAsUser’:
[exec] fuse_connect.c:40: error: too many arguments to function
‘hdfsConnectAsUser’
[exec] make[1]: *** [fuse_connect.o] Error 1
[exec] make: *** [all-recursive] Error 1

BUILD FAILED
/home/hadoop/hadoop-0.20.203.0/build.xml:614: The following error occurred
while executing this line:
/home/hadoop/hadoop-0.20.203.0/src/contrib/build.xml:30: The following error
occurred while executing this line:
/home/hadoop/hadoop-0.20.203.0/src/contrib/fuse-dfs/build.xml:57: exec
returned: 2

I'm kind of stuck. I've had no luck on any of the 0.20.x.x builds. Any
suggestions?

Thanks
-JR


  • Ccxixicc at Jun 3, 2011 at 9:09 am
    Hi,


    After
    $ ant compile -Dcompile.c++=true -Dlibhdfs=true


    you should create the link and export LD_LIBRARY_PATH:
    $ ln -s $HADOOP_HOME/build/c++/Linux-amd64-64/lib/ $HADOOP_HOME/build/libhdfs
    $ export LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:$HADOOP_HOME/build/libhdfs


    then
    $ ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1



    ------------------ Original ------------------
    From: "J. Ryan Earl"<oss@jryanearl.us>;
    Date: Fri, Jun 3, 2011 05:33 AM
    To: "hdfs-user"<hdfs-user@hadoop.apache.org>;

    Subject: problems compiling HDFS FUSE



Discussion Overview
group: hdfs-user
categories: hadoop
posted: Jun 2, '11 at 9:34p
active: Jun 3, '11 at 9:09a
posts: 2
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion: J. Ryan Earl (1 post), Ccxixicc (1 post)

site design / logo © 2022 Grokbase