FAQ
I know that normally one datanode runs on one computer.
I am wondering: can I run multiple datanodes on one PC?


  • Lohit at Sep 4, 2008 at 4:24 am
    Yes, each datanode should point to a different config.
    So, if you have conf/hadoop-site.xml, make another conf2/hadoop-site.xml with different ports for the datanode-specific settings, and you should be able to start multiple datanodes on the same node.
    -Lohit



  • 叶双明 at Sep 4, 2008 at 7:02 am
    Thanks, Lohit.

    I tried to start a datanode with the command:

    bin/hadoop datanode -conf conf/hadoop-site.xml

    but it doesn't work; the plain command bin/hadoop datanode does work.

    Have I done something wrong?


  • Lohit at Sep 4, 2008 at 5:17 pm
    A good way is to have different 'conf' dirs.
    So you would end up with dirs conf1 and conf2, and the datanode startup would be:
    ./bin/hadoop-daemons.sh --config conf1 start datanode
    ./bin/hadoop-daemons.sh --config conf2 start datanode

    Make sure you have a different hadoop-site.xml in the conf1 and conf2 dirs.

    -Lohit
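
    For example, the whole setup might look roughly like this (a sketch only, assuming a 0.17-era tarball layout where the shipped config lives in conf/):

    # create two independent config dirs from the shipped one
    cp -r conf conf1
    cp -r conf conf2
    # edit conf2/hadoop-site.xml so the datanode ports and data dirs differ
    # from conf1 (see the properties discussed below), then start one
    # datanode per config dir
    ./bin/hadoop-daemons.sh --config conf1 start datanode
    ./bin/hadoop-daemons.sh --config conf2 start datanode
    # on a single machine the per-host script bin/hadoop-daemon.sh
    # (no trailing "s") can be used the same way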

  • 叶双明 at Sep 5, 2008 at 2:08 am
    I can start the first datanode with:
    bin/hadoop-daemons.sh --config conf1 start datanode

    Then I copy conf1 to conf2 and modify the values of the following properties (port = port + 1):

    <property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:50011</value>
    <description>
    The address where the datanode server will listen to.
    If the port is 0 then the server will start on a free port.
    </description>
    </property>

    <property>
    <name>dfs.datanode.http.address</name>
    <value>0.0.0.0:50076</value>
    <description>
    The datanode http server address and port.
    If the port is 0 then the server will start on a free port.
    </description>
    </property>

    <property>
    <name>dfs.http.address</name>
    <value>0.0.0.0:50071</value>
    <description>
    The address and the base port where the dfs namenode web ui will listen
    on.
    If the port is 0 then the server will start on a free port.
    </description>
    </property>

    <property>
    <name>dfs.datanode.https.address</name>
    <value>0.0.0.0:50476</value>
    </property>

    And then I run bin/hadoop-daemons.sh --config conf2 start datanode,
    but I get the message: datanode running as process 4137. Stop it first.

    What other properties should I modify between the two config files?

    Thanks.
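
    The "Stop it first" message most likely comes from the daemon script's pid file rather than from the ports: both config dirs default to the same HADOOP_PID_DIR (usually /tmp), so the second start sees the first datanode's pid file and refuses to run. A quick way to confirm, assuming the default pid location:

    # the pid file hadoop-daemon.sh consults before starting a datanode;
    # if it exists and the pid is alive, the script reports
    # "datanode running as process <pid>. Stop it first."
    ls -l /tmp/hadoop-*-datanode.pid
    cat /tmp/hadoop-*-datanode.pid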

  • 叶双明 at Sep 5, 2008 at 2:25 am
    In addition, I can't start a datanode on another computer with the command
    bin/hadoop-daemons.sh --config conf start datanode, using the default config.
    The log message is:

    2008-09-05 10:11:19,208 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG:
    /************************************************************
    STARTUP_MSG: Starting DataNode
    STARTUP_MSG: host = testcenter/192.168.100.120
    STARTUP_MSG: args = []
    STARTUP_MSG: version = 0.17.1
    STARTUP_MSG: build =
    http://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17 -r 669344;
    compiled by 'hadoopqa' on Thu Jun 19 01:18:25 UTC 2008
    ************************************************************/
    2008-09-05 10:11:20,097 INFO org.apache.hadoop.dfs.DataNode: Registered
    FSDatasetStatusMBean
    2008-09-05 10:11:20,106 ERROR org.apache.hadoop.dfs.DataNode:
    java.net.BindException: Problem binding to /0.0.0.0:50010
    at org.apache.hadoop.ipc.Server.bind(Server.java:175)
    at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:264)
    at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:171)
    at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:2765)
    at org.apache.hadoop.dfs.DataNode.instantiateDataNode(DataNode.java:2720)
    at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:2728)
    at org.apache.hadoop.dfs.DataNode.main(DataNode.java:2850)

    2008-09-05 10:11:20,107 INFO org.apache.hadoop.dfs.DataNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down DataNode at testcenter/192.168.100.120
    ************************************************************/

    Maybe there is something wrong with the network config?

  • Raghu Angadi at Sep 8, 2008 at 4:49 pm

    2008-09-05 10:11:20,106 ERROR org.apache.hadoop.dfs.DataNode:
    java.net.BindException: Problem binding to /0.0.0.0:50010
    You should check whether something is already listening on port 50010. I strongly
    encourage users to go through these logs when something does not work.
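
    For example, one way to check what is already bound to the default datanode port (assuming a Linux box with net-tools or lsof available):

    # show the listener on port 50010, if any, together with its owning process
    netstat -tlnp | grep 50010
    # or, equivalently
    lsof -i TCP:50010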

    Regarding running multiple datanodes:

    Along with the config directory, you should change the following env
    variables: HADOOP_LOG_DIR and HADOOP_PID_DIR (I set both to the same
    value).
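
    A minimal sketch of that, assuming each config dir carries its own hadoop-env.sh and using placeholder paths:

    # conf2/hadoop-env.sh
    export HADOOP_LOG_DIR=/var/hadoop/dn2/logs
    export HADOOP_PID_DIR=/var/hadoop/dn2/logs    # same dir for both, as suggested above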

    The following properties should be different in hadoop-site.xml (an example follows the list):
    1. dfs.data.dir or hadoop.tmp.dir
    2. dfs.datanode.address
    3. dfs.datanode.http.address
    4. dfs.datanode.ipc.address
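
    For instance, conf2/hadoop-site.xml might override them as follows (port numbers and paths are only illustrative):

    <property>
    <name>dfs.data.dir</name>
    <value>/var/hadoop/dn2/data</value>
    </property>

    <property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:50011</value>
    </property>

    <property>
    <name>dfs.datanode.http.address</name>
    <value>0.0.0.0:50076</value>
    </property>

    <property>
    <name>dfs.datanode.ipc.address</name>
    <value>0.0.0.0:50021</value>
    </property>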

    Raghu.

  • 叶双明 at Sep 9, 2008 at 1:38 am
    Thanks, Lohit and Raghu Angadi!
    I will try it and do more practice.

    --
    Sorry for my english!! 明
