FAQ
Hi, I'm using Cloudera's distribution with the pseudo-distributed config.

I'm also using a system-wide install of RVM, which manages Ruby and Gems.

My mapper is a Ruby script like this

#!/bin/env ruby
...

The problem is that the MapReduce process doesn't seem to load RVM. I added
/etc/profile.d/rvm.sh to hadoop-env.sh, but the script still fails with
the same error.
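
For illustration, the addition to conf/hadoop-env.sh was presumably a line along these lines (the exact line isn't quoted in the thread):

# conf/hadoop-env.sh -- read by the Hadoop scripts and daemons when they start
. /etc/profile.d/rvm.sh

hadoop-env.sh affects the daemons' own environment; as the error below shows, that alone was not enough to put RVM's ruby on the PATH seen by the streaming mapper.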



I can see the job is executed as the `mapred' user, so I logged in as
`mapred' and made sure the following command works:

ssh mapred@localhost '/bin/env ruby -v'



But I still get the following error when the job runs:

/bin/env: ruby: No such file or directory
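
Since an ssh login works but the task does not, one way to see the PATH the task actually inherits is to look at the running TaskTracker's environment rather than at a login shell. A sketch, assuming a CDH-style TaskTracker daemon, a single matching process, and root access to /proc:

# Print the PATH of the TaskTracker process, which is what ends up launching streaming mappers
sudo cat "/proc/$(pgrep -f -n org.apache.hadoop.mapred.TaskTracker)/environ" | tr '\0' '\n' | grep '^PATH='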



Any tips on how to resolve this?


  • Harsh J at Apr 2, 2011 at 4:53 am
    Is the 'ruby' binary available in the $PATH for the 'mapred' user? You
    can see if it finds one using 'which ruby'.
    On Sat, Apr 2, 2011 at 10:17 AM, Guang-Nan Cheng wrote:
    /bin/env: ruby: No such file or directory
    --
    Harsh J
    http://harshj.com
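    One way to run that check as the `mapred' user from a root shell (a sketch; assumes the usual Linux `su' behaviour):

    # Login shell: reads /etc/profile (and therefore /etc/profile.d/rvm.sh), so RVM's ruby should be found
    su - mapred -c 'which ruby'
    # Non-login shell: skips the profile files, closer to how a daemon-spawned process sees PATH
    su mapred -c 'which ruby'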
  • Guang-Nan Cheng at Apr 2, 2011 at 5:07 am
    If I manually log in as `mapred', then yes, whether the shell is interactive
    or non-interactive. But streaming seems to launch the program differently.



    I tried using `-mapper env' for diagnosis (a sketch of the full command is at
    the end of this message). It prints the output below, so you're right:
    `ruby' is not in $PATH when streaming launches it.

    fs_s3n_block_size=67108864ng_impl=org.apache.hadoop.net.ScriptBasedMappinge.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec0000_0/worke/mapred/mapred/local/taskTracker/guangnan/jobcache/job_201104020248_0013/attempt_201104020248_0013_m_000000_0/work:/usr/java/jdk1.6.0_23/jre/lib/i386/server:/usr/java/jdk1.6.0_23/jre/lib/i386:/usr/java/jdk1.6.0_23/jre/../lib/i386




    Any further ideas? Maybe it's not actually running as `mapred'?


    R/W/S=7689/309/0 in:NA [rec/s] out:NA [rec/s]
    minRecWrittenToEnableSkip_=9223372036854775807 LOGNAME=null
    HOST=null
    USER=mapred
    HADOOP_USER=null
    last Hadoop input: |null|
    last tool output:
    fs_s3n_block_size=67108864ng_impl=org.apache.hadoop.net.ScriptBasedMappinge.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec0000_0/worke/mapred/mapred/local/taskTracker/guangnan/jobcache/job_201104020248_0013/attempt_201104020248_0013_m_000000_0/work:/usr/java/jdk1.6.0_23/jre/lib/i386/server:/usr/java/jdk1.6.0_23/jre/lib/i386:/usr/java/jdk1.6.0_23/jre/../lib/i386|
    Date: Sat Apr 02 12:58:32 CST 2011
    java.io.IOException: Broken pipe




    On Sat, Apr 2, 2011 at 12:52 PM, Harsh J wrote:
    Is the 'ruby' binary available in the $PATH for the 'mapred' user? You
    can see if it finds one using 'which ruby'.
    On Sat, Apr 2, 2011 at 10:17 AM, Guang-Nan Cheng wrote:
    /bin/env: ruby: No such file or directory
    --
    Harsh J
    http://harshj.com
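
    The `-mapper env' diagnostic mentioned above would be run roughly like this; the streaming jar location and HDFS paths are placeholders:

    # Use `env' as the mapper so each task writes out the environment it actually sees
    hadoop jar /usr/lib/hadoop/contrib/streaming/hadoop-streaming.jar \
        -D mapred.reduce.tasks=0 \
        -input /some/input \
        -output /tmp/env-check \
        -mapper env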
  • Allen Wittenauer at Apr 2, 2011 at 2:35 pm

    On Apr 1, 2011, at 9:47 PM, Guang-Nan Cheng wrote:

    The problem is that the MapReduce process doesn't seem to load RVM. I added
    /etc/profile.d/rvm.sh to hadoop-env.sh, but the script still fails with
    the same error.
    Add this to the .bashrc:

    [ -x /etc/profile ] && . /etc/profile

    and make sure /etc/profile is executable.
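
    A sketch of what the `mapred' user's ~/.bashrc could contain under this approach; the RVM hook path is the one mentioned earlier in the thread, and this only helps if the mapper is launched through a shell that actually reads these files:

    # ~/.bashrc for the mapred user
    [ -x /etc/profile ] && . /etc/profile                       # pulls in /etc/profile, which normally sources /etc/profile.d/*
    [ -s /etc/profile.d/rvm.sh ] && . /etc/profile.d/rvm.sh     # or source the RVM hook directly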
  • Todd Lipcon at Apr 4, 2011 at 5:26 pm
    Another way would be to set "mapred.child.env" in your Job configuration to
    "PATH=$PATH:/path/to/rvm/bin"

    -Todd

    On Sat, Apr 2, 2011 at 7:37 AM, Allen Wittenauer wrote:
    Add this to the .bashrc:

    [ -x /etc/profile ] && . /etc/profile

    and make sure /etc/profile is executable.


    --
    Todd Lipcon
    Software Engineer, Cloudera
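
    For a streaming job, the mapred.child.env setting Todd suggests can be passed on the command line; a sketch, with the streaming jar location, HDFS paths, and mapper script as placeholders and the RVM bin directory written as in his message:

    # Append the RVM bin directory to every child task's PATH.
    # Single quotes keep $PATH literal so it is expanded per task, not by the submitting shell.
    hadoop jar /usr/lib/hadoop/contrib/streaming/hadoop-streaming.jar \
        -D mapred.child.env='PATH=$PATH:/path/to/rvm/bin' \
        -input /some/input \
        -output /some/output \
        -mapper mapper.rb \
        -file mapper.rb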
  • Guang-Nan Cheng at Apr 16, 2011 at 4:24 am
    Great! It works!
    On Tue, Apr 5, 2011 at 1:26 AM, Todd Lipcon wrote:

    Another way would be to set "mapred.child.env" in your Job configuration to
    "PATH=$PATH:/path/to/rvm/bin"

