FAQ
I just downloaded Hadoop 0.20.1 and Cygwin 1.5.25-15 and installed them
on Windows XP. I'm having trouble with ssh: when I enter "ssh localhost"
I'm prompted for a password. I can enter it and log in successfully. So
I ran these two commands:

$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

But I'm still prompted for a password. Did I miss something when
configuring ssh? The files created in .ssh look OK.

Btw, I am able to run one of the example Hadoop applications in
Standalone mode and it works.

I'm following the instructions in:
http://hadoop.apache.org/common/docs/current/quickstart.html#Local

Thanks.

-Dennis


  • Edward Capriolo at Oct 21, 2009 at 9:48 pm

    On Wed, Oct 21, 2009 at 5:39 PM, Dennis DiMaria wrote:
    More than likely this is a problem with the permissions of the
    authorized_keys file. Make sure:
    - the file is mode 600: chmod 600 ~/.ssh/authorized_keys
    - the file is owned by the proper user
    - the .ssh directory itself is drwx------ (chmod 700 ~/.ssh)
    You can turn up the verbosity of your ssh server to troubleshoot
    this further.

    There are lots of ssh key tutorials out there. Good hunting.
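Taken together, the checklist above can be replayed as a short shell session. This is a sketch, assuming OpenSSH: the original thread generates a DSA key, but recent OpenSSH releases have dropped DSA support, so an ed25519 key is substituted here.

```shell
# Passphraseless-ssh setup with the permissions sshd insists on.
# Assumes OpenSSH; ed25519 replaces the thread's DSA key, which
# current OpenSSH releases no longer accept.

mkdir -p ~/.ssh

# Generate a key with an empty passphrase, unless one already exists
[ -f ~/.ssh/id_ed25519 ] || ssh-keygen -t ed25519 -P '' -f ~/.ssh/id_ed25519

# Authorize the key for logins to this same account
cat ~/.ssh/id_ed25519.pub >> ~/.ssh/authorized_keys

# sshd silently falls back to password auth if these are too open
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```

After this, "ssh localhost" should log in without prompting; if it still prompts, running the server in debug mode on a spare port shows exactly which permission check is failing.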
  • Aaron Kimball at Oct 22, 2009 at 4:28 am
    Another sneaky permissions requirement is that ~/.ssh/ itself must be mode
    0750 or more strict.

    - Aaron
  • Dennis DiMaria at Oct 22, 2009 at 2:52 pm
    Success! Thank you Edward and Aaron. I had to change the permissions of
    my home directory "chmod 750 ~" and chown my home dir so that I owned
    it.

    FYI: when debugging sshd, the -d and -e options are very useful.

    So I ran the 0.20.1 grep example in pseudo-distributed mode and it
    worked. It was much slower than the run I did in Standalone mode. I'm
    running Windows XP and the latest version of Cygwin, 1.5.25-15. Is
    Windows generally slow or is it a tuning issue?

    Thanks again.

    -Dennis

  • Todd Lipcon at Oct 22, 2009 at 3:03 pm
    Hi Dennis,
    It is normal for pseudo-distributed mode to be much slower than standalone.

    Most of the extra overhead you're seeing is due to the heartbeat mechanism
    of Hadoop. When the JobTracker has tasks that need to be run, it does not
    actually push them to the TaskTrackers. Instead, the TaskTrackers
    periodically heartbeat to the JobTracker asking for new tasks. This is true
    even on a single node pseudodistributed cluster. Because of this, there is a
    fair amount of wall clock time spent sleeping waiting for heartbeats to
    occur so the tasks can get assigned.

    This might sound like a big problem, but Hadoop is just not designed for
    "interactive" jobs that only take a few seconds. At last count, I found that
    "no-op" jobs that only process a few bytes of data usually take between 20
    and 25 seconds on a pseudodistributed cluster. The problem is a bit less
    pronounced on a larger cluster since tasktrackers are heartbeating more
    regularly.
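    The polling delay described above can be felt with a plain-shell toy model. Nothing here is Hadoop: the queue directory, the one-second interval, and the file names are all made up for illustration; real TaskTrackers heartbeat every few seconds over RPC.

```shell
# Toy model of heartbeat-based task assignment: a stand-in
# "tasktracker" checks a queue directory once per interval, so a
# freshly submitted task waits up to a full interval before pickup,
# even on an otherwise idle machine.

INTERVAL=1
QUEUE=$(mktemp -d)

# "TaskTracker": heartbeat loop polling for work
(
  while [ ! -e "$QUEUE/task" ]; do
    sleep "$INTERVAL"
  done
  touch "$QUEUE/picked-up"
) &
TRACKER=$!

sleep 0.2             # let the first heartbeat miss the task
start=$(date +%s)
touch "$QUEUE/task"   # "JobTracker" enqueues a task
wait "$TRACKER"
end=$(date +%s)
echo "assignment latency: $((end - start))s (up to one heartbeat interval)"
```

    Multiply that idle wait by the several heartbeat round-trips a real job needs (task assignment, completion reports, job cleanup) and the 20-25 second floor for a no-op job follows directly.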

    -Todd

Discussion Overview
group: common-user
categories: hadoop
posted: Oct 21, '09 at 9:40p
active: Oct 22, '09 at 3:03p
posts: 5
users: 4
website: hadoop.apache.org...
irc: #hadoop
