Hi:

After starting HDFS, I am able to create directories via the Hadoop API
and the shell app.

However, I am not able to create a new file: I keep getting problems
connecting to the data node (on localhost:50010).

In the admin UI, the Live Datanodes list looks correct: localhost:50010
is listed.

And in the data node log, I see the line:
2006-11-24 10:05:07,261 INFO org.apache.hadoop.dfs.DataNode: Opened server at 50010

So it looks like the data node is alive.

Also, by clicking on the Browse the filesystem link in the admin UI, I
am taken to the address:

http://192.168.1.3:65535/browseDirectory.jsp?namenodeInfoPort=50070&dir=%2F

This address does not resolve.

Any suggestions would be greatly appreciated.

Thanks

-John

  • Howard chen at Nov 24, 2006 at 4:56 am

    Can you show the commands used?

    howa
  • John Wang at Nov 24, 2006 at 6:13 am
    I started HDFS by running start-dfs.sh (I also tried start-all.sh, but it
    makes no difference).

    I was able to create directories by running: hadoop dfs -mkdir <dir>,
    and by using the Hadoop Java API, e.g. getting a FileSystem instance and
    calling fs.mkdirs().

    While trying to do FileSystem.createNewFile, I get "Waiting to find target
    node:" messages. By stepping into the source code, I can see that the client
    could not connect to the data node.
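
    Roughly, the kind of code I mean is below (the class name and paths are
    made up for illustration, not the exact code):

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsCreateTest {
            public static void main(String[] args) throws Exception {
                // Reads fs.default.name from the hadoop-site.xml on the classpath
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);

                // This works: creating a directory only involves the namenode
                fs.mkdirs(new Path("/test/dir"));

                // This is where I see the "Waiting to find target node:"
                // messages, while the client tries to reach the datanode
                fs.createNewFile(new Path("/test/dir/file.txt"));
            }
        }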

    Thanks for your help.

    -John
  • Raghu Angadi at Nov 27, 2006 at 7:06 pm

    John Wang wrote:
    And in the data node log, I see the line:
    2006-11-24 10:05:07,261 INFO org.apache.hadoop.dfs.DataNode: Opened server at 50010

    So it looks like the data node is alive.

    Also, by clicking on the Browse the filesystem link in the admin UI, I
    am taken to the address:

    http://192.168.1.3:65535/browseDirectory.jsp?namenodeInfoPort=50070&dir=%2F
    This is fixed in the latest trunk. For now try replacing the port 65535
    with 50075.
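
    For example, the link above would then become:
    http://192.168.1.3:50075/browseDirectory.jsp?namenodeInfoPort=50070&dir=%2F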

    Regarding the original problem, also look for any error messages in the
    namenode log.

    Raghu.
  • John Wang at Nov 28, 2006 at 3:01 pm
    In the namenode log, it does show that it has lost the heartbeat to the
    datanode. But the datanode log looks fine; there are no errors there.

    Any ideas?

    thanks

    -John


Discussion Overview
group: common-user
categories: hadoop
posted: Nov 24, '06 at 2:11a
active: Nov 28, '06 at 3:01p
posts: 5
users: 3
website: hadoop.apache.org...
irc: #hadoop
