Dear Huy Phan and others,

Thanks a lot for your efforts in customizing the WebDAV
server <http://github.com/huyphan/HDFS-over-Webdav> and making it work
with Hadoop 0.20.1.

After setting up the WebDAV server, I could access it with the cadaver client
on Ubuntu without any username or password. Operations such as deleting
files worked. The command is: *cadaver http://server:9800*

However, when I try to mount the WebDAV server using davfs2 on
Ubuntu, I always get the following error:
"mount.davfs: mounting failed; the server does not support WebDAV".

I was prompted to enter a username and password, as below:
hadoop@hdfs2:/mnt$ sudo mount.davfs http://192.168.0.131:9800/test hdfs-webdav/
Please enter the username to authenticate with server
http://192.168.0.131:9800/test or hit enter for none.
Username: hadoop
Please enter the password to authenticate user hadoop with server
http://192.168.0.131:9800/test or hit enter for none.
Password:
mount.davfs: mounting failed; the server does not support WebDAV

Even though I have tried all possible usernames and passwords, both from
the WebDAV accounts.properties file and from the Ubuntu system running the
WebDAV server, I still get this error message.

Could anyone give me some hints on this problem? How can I solve
it? I very much appreciate your help!
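A quick way to see what davfs2 is objecting to: it decides whether a server "supports WebDAV" from the `DAV:` capability header in the response to an HTTP OPTIONS request. A sketch of that check, using the server address from this thread:

```shell
# Ask the server which WebDAV capability classes it advertises; davfs2
# refuses to mount when it cannot find a usable DAV: header here.
curl -s -i -X OPTIONS http://192.168.0.131:9800/test | grep -i '^DAV:'
```

A class 1 server answers with something like `DAV: 1`; if the line is missing entirely, mount.davfs prints exactly the "does not support WebDAV" error above.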

Best regards,
Zhang Bingjun (Eddy)

E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
Tel No: +65-96188110 (M)


  • Huy Phan at Oct 27, 2009 at 10:55 am
    Hi Zhang,

    Here is the patch for davfs2 to solve the "server does not support
    WebDAV" issue:

    diff --git a/src/webdav.c b/src/webdav.c
    index 8ec7a2d..4bdaece 100644
    --- a/src/webdav.c
    +++ b/src/webdav.c
    @@ -472,7 +472,7 @@ dav_init_connection(const char *path)

         if (!ret) {
             initialized = 1;
    -        if (!caps.dav_class1 && !caps.dav_class2 && !ignore_dav_header) {
    +        if (!caps.dav_class1 && !ignore_dav_header) {
                 if (have_terminal) {
                     error(EXIT_FAILURE, 0,
                           _("mounting failed; the server does not support WebDAV"));


    davfs2 and WebDAV are not a good mix, actually. I tried mixing them,
    and the performance was really bad: under a load test of 10
    requests/s, the load average on my namenode was always > 15, and `ls`
    on the root directory of HDFS took about 5 minutes during the test.

    Since you're using Hadoop 0.20.1, it's better to use the fuse-dfs
    library provided in the Hadoop package. You have to do some tricks to
    compile fuse-dfs with Hadoop; otherwise it will take you a lot of time
    compiling redundant things.

    Best,
    Huy Phan

  • Zhang Bingjun (Eddy) at Oct 27, 2009 at 11:07 am
    Dear Huy Phan,

    Thanks for your quick reply.

    I was using fuse-dfs before, but I found a serious memory leak in it:
    about 10 MB leaked per 10k file reads/writes. When the occupied memory
    reached about 150 MB, the read/write performance dropped dramatically.
    Did you encounter these problems?

    What I am trying to do is to mount HDFS as a local directory in Ubuntu. Do
    you think fuse-dfs is the best option so far?

    Thank you so much for your input!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)

  • Huy Phan at Oct 27, 2009 at 11:19 am
    Hi Zhang,
    I didn't play much with fuse-dfs. In my opinion, the memory leak is
    something solvable, and I can see Apache has made some fixes for this
    issue in libhdfs. If you encountered these problems with an older
    version of Hadoop, I think you should give the latest stable version a
    try. Since I haven't had much fun with fuse-dfs so far, I cannot say
    whether it's the best option, but it's definitely better than mixing
    davfs2 and WebDAV together.

    Best,
    Huy Phan


  • Zhang Bingjun (Eddy) at Oct 27, 2009 at 11:35 am
    Dear Huy Phan,

    I downloaded davfs2-1.4.3, and in this version the patch you sent me
    seems to be applied already. I compiled and installed this version.
    However, the error message is still there, as below...

    hadoop@hdfs2:/mnt$ sudo mount.davfs http://192.168.0.131:9800 hdfs-webdav/
    Please enter the username to authenticate with server
    http://192.168.0.131:9800 or hit enter for none.
    Username: hadoop
    Please enter the password to authenticate user hadoop with server
    http://192.168.0.131:9800 or hit enter for none.
    Password:
    mount.davfs: mounting failed; the server does not support WebDAV

    Which username and password should I input? A user from the
    account.properties file, or the user on the WebDAV server's OS?

    Regarding the memory leak in fuse-dfs and libhdfs, I posted one patch
    in the Apache JIRA. However, when used in a production environment,
    the memory leak still exists and makes the mount point unusable after
    a number of read/write operations. The memory leak there is really
    annoying...

    I hope I can set up the davfs2 + WebDAV mix to try out its
    performance. Any ideas on getting around the error "mounting failed;
    the server does not support WebDAV"?
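One possible workaround, judging from the `!ignore_dav_header` test in the patched code: davfs2 has a configuration option that skips the DAV capability check entirely. A sketch (the exact option syntax may vary between davfs2 versions):

```
# /etc/davfs2/davfs2.conf (or ~/.davfs2/davfs2.conf)
# Skip the DAV: header check for servers that do not advertise one.
ignore_dav_header 1
```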

    Thank you so much for your help!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)

    On Tue, Oct 27, 2009 at 7:19 PM, Huy Phan wrote:

    Hi Zhang,
    I didn't play much with fuse-dfs, in my opinion, memory leak is something
    solvable and I can see Apache had made some fixes for this issue on libhdfs.
    If you encounter these problems with older version of Hadoop, I think you
    should give a try on the latest stable version.
    Since I didn't have much fun so far with fuse-dfs, i cannot say it's the
    best or not, but it's definitely better than mixing davfs2 and webdav
    together.


    Best,
    Huy Phan

    Zhang Bingjun (Eddy) wrote:
    Dear Huy Phan,


    Thanks for your quick reply.
    I was using fuse-dfs before. But I found serious memory leak with fuse-dfs
    about 10MB leakage per 10k file read/write. When the occupied memory size
    reached about 150MB, the read/write performance dropped dramatically. Did
    you encounter these problems?

    What I am trying to do is to mount HDFS as a local directory in Ubuntu. Do
    you think fuse-dfs is the best option so far?

    Thank you so much for your input!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com > bingjun@nus.edu.sg , bingjun@comp.nus.edu.sg<mailto:
    bingjun@comp.nus.edu.sg>
    Tel No: +65-96188110 (M)


    On Tue, Oct 27, 2009 at 6:55 PM, Huy Phan <dachuy@gmail.com <mailto:
    dachuy@gmail.com>> wrote:

    Hi Zhang,

    Here is the patch for davfs2 to solve "server does not support
    WebDAV" issue:

    diff --git a/src/webdav.c b/src/webdav.c
    index 8ec7a2d..4bdaece 100644
    --- a/src/webdav.c
    +++ b/src/webdav.c
    @@ -472,7 +472,7 @@ dav_init_connection(const char *path)

    if (!ret) {
    initialized = 1;
    - if (!caps.dav_class1 && !caps.dav_class2 &&
    !ignore_dav_header) {
    + if (!caps.dav_class1 && !ignore_dav_header) {
    if (have_terminal) {
    error(EXIT_FAILURE, 0,
    _("mounting failed; the server does not
    support WebDAV"));


    davfs2 and webdav is not a good mix actually, I had tried to mix
    them together and the performance were really bad. With the load
    test of 10 requests/s, load average on my namenode were always >
    15 and it took me about 5 mins for `ls` the root directory of HDFS
    during the test.

    Since you're using Hadoop 0.20.1, it's better to use fusedfs
    library provided in Hadoop package. You have to do some tricks to
    compile fusedfs with Hadoop, otherwise it would take you a lot of
    time for compiling redundant things.

    Best,
    Huy Phan

    Zhang Bingjun (Eddy) wrote:

    Dear Huy Phan and others,

    Thanks a lot for your efforts in customizing the WebDav server
    <http://github.com/huyphan/HDFS-over-Webdav> and make it work
    for Hadoop-0.20.1.
    After setting up the WebDav server, I could access it using
    Cadaver client in Ubuntu without using any username password.
    Operations like deleting files, etc, were working. The command
    is: *cadaver http://server:9800*

    However, when I was trying to mount the WebDav server using
    davfs2 in Ubuntu, I always get the following error:
    "mount.davfs: mounting failed; the server does not support
    WebDAV".

    I was promoted to input username and password like below:
    hadoop@hdfs2:/mnt$ sudo mount.davfs
    http://192.168.0.131:9800/test hdfs-webdav/
    Please enter the username to authenticate with server
    http://192.168.0.131:9800/test or hit enter for none.
    Username: hadoop
    Please enter the password to authenticate user hadoop with server
    http://192.168.0.131:9800/test or hit enter for none.
    Password:
    mount.davfs: mounting failed; the server does not support WebDAV

    Even though I have tried all possible usernames and passwords
    either from the WebDAV accounts.properties file or from the
    Ubuntu system of the WebDAV server, I still got this error
    message.
    Could you and anyone give me some hints on this problem? How
    could I solve it? Very much appreciate your help!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com > <mailto:eddymier@gmail.com >
    bingjun@nus.edu.sg > <mailto:bingjun@nus.edu.sg >
    bingjun@comp.nus.edu.sg > <mailto:bingjun@comp.nus.edu.sg >

    Tel No: +65-96188110 (M)


  • Huy Phan at Oct 28, 2009 at 1:22 am
    Hi Zhang,
    I applied my patch to davfs2-1.4.0, and it's working fine with Hadoop
    0.20.1. If you didn't define any access restrictions in the
    account.properties file, you can skip authentication when mounting
    with davfs2.
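If you do want to supply credentials instead of hitting enter at the prompts, davfs2 can also read them from its secrets file. A sketch, assuming the server URL and user from this thread (the password shown is a placeholder):

```
# /etc/davfs2/secrets -- must not be readable by other users
# <mount URL or mount point>      <username>  <password>
http://192.168.0.131:9800/test    hadoop      hadoop-password
```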

    Best,
    Huy Phan


  • Zhang Bingjun (Eddy) at Oct 28, 2009 at 5:51 am
    Dear Huy Phan,

    Thanks a lot!

    It seems the diff in the patch you sent me should be the other way
    around, like the following:

    diff --git b/src/webdav.c a/src/webdav.c
    index 8ec7a2d..4bdaece 100644
    --- b/src/webdav.c
    +++ a/src/webdav.c
    @@ -472,7 +472,7 @@ dav_init_connection(const char *path)

         if (!ret) {
             initialized = 1;
    -        if (!caps.dav_class1 && !ignore_dav_header) {
    +        if (!caps.dav_class1 && !caps.dav_class2 && !ignore_dav_header) {
                 if (have_terminal) {
                     error(EXIT_FAILURE, 0,
                           _("mounting failed; the server does not support WebDAV"));


    After applying this patch, the "the server does not support WebDAV"
    error is gone. After a simple test of the WebDAV + davfs2 mix, though,
    I also experienced very poor performance. I think I have to go back to
    fuse-dfs, performance-wise.
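For anyone else whose davfs2 source already contains the change: instead of hand-reversing the diff, `patch -R` applies it in reverse. A sketch, assuming Huy's earlier diff is saved as `dav-class.diff` next to the davfs2-1.4.3 source tree:

```shell
cd davfs2-1.4.3
# Dry run first: confirm the diff reverses cleanly before touching files.
patch -p1 -R --dry-run < ../dav-class.diff
# Apply in reverse; this re-adds the caps.dav_class2 part of the check.
patch -p1 -R < ../dav-class.diff
./configure && make && sudo make install
```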

    Thanks a lot for your quick help!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)

  • Zhang Bingjun (Eddy) at Oct 28, 2009 at 7:40 am
    Dear Huy Phan,

    To follow up: even though the performance of WebDAV + davfs2 for
    accessing HDFS is worse than fuse-dfs, it has been much more stable
    than fuse-dfs so far. As you said, the memory leak in fuse-dfs is
    solvable, but it is hard to find all the leaks, especially when plain
    C/C++ code is mixed with JNI code.

    I have heard that other organizations (Facebook, etc.) have FUSE
    implementations for HDFS, but I haven't seen any open source code from
    them yet. I really hope someone will contribute their code base to
    offer the public a nice (at least stable) FUSE implementation for
    HDFS.

    Thanks to all!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)

    On Wed, Oct 28, 2009 at 1:51 PM, Zhang Bingjun (Eddy) wrote:

    Dear Huy Phan,

    Thanks a lot!

    It seems like the diff in the patch you sent me should be the other
    way around, i.e. like the following:

    diff --git b/src/webdav.c a/src/webdav.c
    index 8ec7a2d..4bdaece 100644
    --- b/src/webdav.c
    +++ a/src/webdav.c
    @@ -472,7 +472,7 @@ dav_init_connection(const char *path)

     if (!ret) {
         initialized = 1;
    -        if (!caps.dav_class1 && !ignore_dav_header) {
    +        if (!caps.dav_class1 && !caps.dav_class2 && !ignore_dav_header) {
             if (have_terminal) {
                 error(EXIT_FAILURE, 0,
                       _("mounting failed; the server does not support WebDAV"));


    After applying this patch, the error "the server does not support
    WebDAV" is gone. After a simple test of the WebDAV + davfs2 mix,
    though, I also experienced very poor performance, so performance-wise
    I think I have to go back to fuse-dfs.
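    For readers following along, the condition being toggled in the
    patch can be sketched in a few lines of Python. The helper names
    below are hypothetical illustrations, not davfs2's actual API:

    ```python
    # Sketch of the DAV-capability check davfs2 performs after an
    # OPTIONS request (hypothetical helpers, not davfs2 code).

    def parse_dav_classes(dav_header):
        """Split a DAV response header value (e.g. "1,2") into class tokens."""
        return {tok.strip() for tok in dav_header.split(",") if tok.strip()}

    def mount_allowed(dav_header, relaxed=False):
        """Mirror the if-condition in dav_init_connection().

        Strict (davfs2 as shipped): require compliance class 1.
        Relaxed (patched):          accept class 1 or class 2, which is
        what lets a server that only advertises "DAV: 2" mount.
        """
        classes = parse_dav_classes(dav_header)
        if relaxed:
            return "1" in classes or "2" in classes
        return "1" in classes

    # A server answering "DAV: 2" fails the strict check but passes the
    # relaxed one -- matching the behaviour described in this thread.
    print(mount_allowed("2"))                 # False
    print(mount_allowed("2", relaxed=True))   # True
    ```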

    Thanks a lot for your quick help!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg

    Tel No: +65-96188110 (M)

    On Wed, Oct 28, 2009 at 9:22 AM, Huy Phan wrote:

    Hi Zhang,
    I applied my patch to davfs2-1.4.0 and it's working fine with
    Hadoop 0.20.1.
    If you didn't define any access restrictions in the
    accounts.properties file, you can skip authentication when mounting
    with davfs2.
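    For repeat mounts, davfs2 can also read credentials from its
    secrets file instead of prompting, and the mount can go in fstab.
    A sketch assuming the server URL from this thread, with a made-up
    mount point and password; see the davfs2 man pages for your
    version's exact file locations:

    ```
    # /etc/davfs2/secrets -- one "URL username password" entry per line
    http://192.168.0.131:9800/ hadoop hadooppass

    # /etc/fstab -- then mount with: mount /mnt/hdfs
    http://192.168.0.131:9800/ /mnt/hdfs davfs rw,user,noauto 0 0
    ```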


    Best,
    Huy Phan


    Zhang Bingjun (Eddy) wrote:
    Dear Huy Phan,

    I downloaded davfs2-1.4.3, and in this version the patch you sent
    me seems to have been applied already. I compiled and installed
    this version. However, the error message still appears, as below...

    hadoop@hdfs2:/mnt$ sudo mount.davfs http://192.168.0.131:9800 hdfs-webdav/
    Please enter the username to authenticate with server
    http://192.168.0.131:9800 or hit enter for none.
    Username: hadoop
    Please enter the password to authenticate user hadoop with server
    http://192.168.0.131:9800 or hit enter for none.
    Password:
    mount.davfs: mounting failed; the server does not support WebDAV

    Which username and password should I input? Any user from the
    accounts.properties file, or a user on the OS running the WebDAV
    server?

    Regarding the memory leak in fuse-dfs and libhdfs, I posted one
    patch on the Apache JIRA. However, when used in a production
    environment, the memory leak still exists and causes the mount
    point to become unusable after a number of write/read operations.
    The memory leak there is really annoying...

    I hope I can set up the mix of davfs2 and WebDAV to try out its
    performance. Any ideas for getting around the error "mounting
    failed; the server does not support WebDAV"?
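    One way to see why davfs2 reports this error (an illustrative
    sketch, not part of the original thread) is to issue an OPTIONS
    request yourself and inspect the DAV header, which is the field
    davfs2 bases its decision on. The toy server below is a made-up
    stand-in for the HDFS WebDAV servlet, advertising class 2 only:

    ```python
    # Query a server's advertised WebDAV compliance classes via OPTIONS.
    import http.client
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ToyDavHandler(BaseHTTPRequestHandler):
        def do_OPTIONS(self):
            self.send_response(200)
            self.send_header("DAV", "2")   # class 2 only, no class 1
            self.send_header("Allow", "OPTIONS, GET, PUT, PROPFIND")
            self.end_headers()

        def log_message(self, *args):      # silence request logging
            pass

    server = HTTPServer(("127.0.0.1", 0), ToyDavHandler)  # 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()

    conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
    conn.request("OPTIONS", "/")
    dav = conn.getresponse().getheader("DAV")
    print("DAV header:", dav)              # -> DAV header: 2
    server.shutdown()
    ```

    Against a real deployment the same check amounts to sending OPTIONS
    to the server URL and reading the DAV header of the reply.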

    Thank you so much for your help!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)


    On Tue, Oct 27, 2009 at 7:19 PM, Huy Phan <dachuy@gmail.com> wrote:

    Hi Zhang,
    I didn't play much with fuse-dfs, in my opinion, memory leak is
    something solvable and I can see Apache had made some fixes for
    this issue on libhdfs.
    If you encounter these problems with older version of Hadoop, I
    think you should give a try on the latest stable version.
    Since I didn't have much fun so far with fuse-dfs, i cannot say
    it's the best or not, but it's definitely better than mixing
    davfs2 and webdav together.


    Best,
    Huy Phan

    Zhang Bingjun (Eddy) wrote:

    Dear Huy Phan,


    Thanks for your quick reply.
    I was using fuse-dfs before, but I found a serious memory leak in
    it: about 10MB leaked per 10k file reads/writes. When the occupied
    memory size reached about 150MB, the read/write performance dropped
    dramatically. Did you encounter these problems?

    What I am trying to do is to mount HDFS as a local directory
    in Ubuntu. Do you think fuse-dfs is the best option so far?

    Thank you so much for your input!

    Best regards,
    Zhang Bingjun (Eddy)

    E-mail: eddymier@gmail.com, bingjun@nus.edu.sg, bingjun@comp.nus.edu.sg
    Tel No: +65-96188110 (M)


    On Tue, Oct 27, 2009 at 6:55 PM, Huy Phan <dachuy@gmail.com> wrote:

    Hi Zhang,

    Here is the patch for davfs2 to solve "server does not support
    WebDAV" issue:

    diff --git a/src/webdav.c b/src/webdav.c
    index 8ec7a2d..4bdaece 100644
    --- a/src/webdav.c
    +++ b/src/webdav.c
    @@ -472,7 +472,7 @@ dav_init_connection(const char *path)

     if (!ret) {
         initialized = 1;
    -        if (!caps.dav_class1 && !caps.dav_class2 && !ignore_dav_header) {
    +        if (!caps.dav_class1 && !ignore_dav_header) {
             if (have_terminal) {
                 error(EXIT_FAILURE, 0,
                       _("mounting failed; the server does not support WebDAV"));


    davfs2 and webdav are not a good mix actually; I had tried mixing
    them together and the performance was really bad. With a load test
    of 10 requests/s, the load average on my namenode was always > 15,
    and it took me about 5 mins to `ls` the root directory of HDFS
    during the test.

    Since you're using Hadoop 0.20.1, it's better to use the fusedfs
    library provided in the Hadoop package. You have to do some tricks
    to compile fusedfs with Hadoop, otherwise it would take you a lot
    of time compiling redundant things.

    Best,
    Huy Phan





Discussion Overview
group: common-user
categories: hadoop
posted: Oct 27, '09 at 10:28a
active: Oct 28, '09 at 7:40a
posts: 8
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion: Zhang Bingjun (Eddy): 5 posts; Huy Phan: 3 posts
