Datanode not created on hadoop-0.20.203.0
Hi,

I am new to Hadoop (just one month in). These are the steps I followed to
install and run hadoop-0.20.203.0:

1) Downloaded tar file from
http://mirrors.axint.net/apache/hadoop/common/hadoop-0.20.203.0/hadoop-0.20.203.0rc1.tar.gz.
2) Untarred it in /usr/local/ .
3) Set JAVA_HOME=/usr/lib/jvm/java-6-sun (which has already been installed)
4) Modified the config files, viz. core-site.xml, hdfs-site.xml and
mapred-site.xml, as described on the single-node installation page [
http://hadoop.apache.org/common/docs/r0.20.203.0/single_node_setup.html#PseudoDistributed].
5) Formatted the new distributed-filesystem using bin/hadoop namenode
-format
6) Started the hdfs daemon using bin/start-dfs.sh
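For reference, a sketch of the pseudo-distributed values that setup page describes (localhost with the documented default ports 9000/9001; adjust if yours differ):

```xml
<!-- conf/core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```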

Now, here is the error...

# start-dfs.sh
starting namenode, logging to
/usr/local/hadoop/bin/../logs/hadoop-root-namenode-ip-10-98-94-62.out
localhost: starting datanode, logging to
/usr/local/hadoop/bin/../logs/hadoop-root-datanode-ip-10-98-94-62.out
localhost: starting secondarynamenode, logging to
/usr/local/hadoop/bin/../logs/hadoop-root-secondarynamenode-ip-10-98-94-62.out

The terminal says that the datanode has been started, but when I run the
jps command, it shows otherwise.

# jps
395 Jps
32612 SecondaryNameNode
32442 NameNode

And this is the log in
/usr/local/hadoop/logs/hadoop-root-datanode-ip-10-98-94-62.out:

Unrecognized option: -jvm
Could not create the Java virtual machine.

My question is: can anybody tell me what the error is, what I am doing
wrong, or, in general, how I can get my datanode running?

Thanks.

With regards,
Rutesh Chavda


  • Joey Echeverria at Jun 15, 2011 at 5:01 pm
    By any chance, are you running as root? If so, try running as a different user.

    -Joey
    --
    Joseph Echeverria
    Cloudera, Inc.
    443.305.9434
  • Jeff Schmitz at Jun 15, 2011 at 5:03 pm
    You have to format the datanode too (hadoop datanode -format); also make sure it is listed in the slaves file.

    Cheers -

  • Rutesh at Jun 16, 2011 at 5:47 am
    Hi,

    I tried formatting the datanode too, but got the same result:

    #hadoop datanode -format
    Unrecognized option: -jvm
    Could not create the Java virtual machine.

    Also, the datanodes have been defined in the slaves file. Is there any other
    workaround or process?

    With regards
    Rutesh
  • Joey Echeverria at Jun 16, 2011 at 1:59 pm
    What user are you running as?

    -Joey
  • Rutesh at Jun 17, 2011 at 4:43 am
    I am running as the 'hadoop' user, a user I created on my machine to run
    Hadoop exclusively.

    Rutesh
  • Harsh J at Jun 17, 2011 at 5:19 am
    Rutesh,

    As per your command outputs in the first mail, you seem to be running
    the daemons as 'root'. Try starting them as 'hadoop' with no sudo.
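    (A sketch of why root trips this up: in hadoop-0.20.203.0 the bin/hadoop launcher appears to add a jsvc-only "-jvm server" flag to the datanode command line when the effective UID is 0, and a plain java binary rejects that flag. The function below is a hypothetical illustration of that guard, not the actual script:)

    ```shell
    # Hypothetical sketch of the uid check inside the launcher: when run
    # as root (uid 0) it emits the "-jvm server" flag meant for a jsvc
    # secure-datanode launch, which a normal JVM does not recognize.
    datanode_opts() {
      uid="$1"
      if [ "$uid" -eq 0 ]; then
        echo "-jvm server"   # breaks a plain "java" invocation
      else
        echo ""              # non-root: no extra flag, JVM starts fine
      fi
    }

    echo "root   -> '$(datanode_opts 0)'"
    echo "hadoop -> '$(datanode_opts 1000)'"
    ```

    Hence starting the daemons as a non-root user (or setting up jsvc properly) avoids the "Unrecognized option: -jvm" failure.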


    --
    Harsh J
  • Rutesh at Jun 17, 2011 at 5:23 am
    Hi Harsh,

    I started the daemons as 'hadoop' without sudo, but it still gives me the
    same error.

    Rutesh

Discussion Overview
group: common-user
category: hadoop
posted: Jun 15, '11 at 4:54p
active: Jun 17, '11 at 5:23a
posts: 8
users: 4
website: hadoop.apache.org...
irc: #hadoop
