Security error running hadoop with MaxTemperature example
Hi Everyone,

I'm trying to get my first MapReduce example to work. Background:

RedHat ES 5.2
Sun Java 1.6.0_16-b01
Hadoop 0.20.1+133 (Cloudera distro)

I've started the hadoop daemons, created an HDFS locally, and checked that
basic operations in HDFS appear to work.

I'm trying to get the first, most basic example from Tom White's book "Hadoop:
The Definitive Guide" to work. I'm running Hadoop in pseudo-distributed mode as
a regular (non-root) user. Basic hadoop commands appear to work:

hadoop fs -ls
Found 1 items
-rw-r--r-- 1 david supergroup 28081 2009-10-06 23:27 /user/david/docnotes.txt

I compiled the examples in chapter 2 "by hand" (why is a separate thread). I
then try to invoke MaxTemperature with non-existent files (at this point I'm
just trying to see whether everything will load and initialize):

export HADOOP_CLASSPATH="./"
hadoop MaxTemperature foo bar
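(For reference, the class being run is the chapter 2 driver from the book. The
sketch below is reconstructed from the book rather than from my exact source,
using the older org.apache.hadoop.mapred API; the book also shows a version
against the newer org.apache.hadoop.mapreduce API. MaxTemperatureMapper and
MaxTemperatureReducer are the book's map and reduce classes, compiled
alongside the driver.)

// Rough sketch of the chapter 2 driver, reconstructed from the book.
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MaxTemperature {

  public static void main(String[] args) throws IOException {
    if (args.length != 2) {
      System.err.println("Usage: MaxTemperature <input path> <output path>");
      System.exit(-1);
    }

    JobConf conf = new JobConf(MaxTemperature.class);
    conf.setJobName("Max temperature");

    // Input and output paths are on the default filesystem (HDFS here).
    FileInputFormat.addInputPath(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    // The book's map and reduce classes, compiled alongside this file.
    conf.setMapperClass(MaxTemperatureMapper.class);
    conf.setReducerClass(MaxTemperatureReducer.class);

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    // Submit the job and wait for it to complete.
    JobClient.runJob(conf);
  }
}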

I get the error message:

Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

There's a long stack trace, the start of which looks like:

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:909)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1162)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:306)
.........

I'm out of ideas at this point. Any suggestions for where I should look to
solve this?

Cheers,

David


  • Jason Venner at Oct 13, 2009 at 5:09 am
    My first guess is that, for whatever reason, the user you are running the
    job as cannot write to the directory specified by hadoop.tmp.dir in your
    configuration. This is usually in the system temporary area and not an
    issue.
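    If you want to check what that actually resolves to on your machine, a
    quick way is to print the effective configuration. A minimal sketch (the
    class name is made up; run it the same way as MaxTemperature, with
    HADOOP_CLASSPATH pointing at the compiled class):

    import org.apache.hadoop.conf.Configuration;

    public class ShowConf {
      public static void main(String[] args) {
        // Loads core-default.xml plus your core-site.xml from the classpath,
        // the same files the hadoop command itself uses.
        Configuration conf = new Configuration();
        // hadoop.tmp.dir defaults to /tmp/hadoop-${user.name}
        System.out.println("hadoop.tmp.dir  = " + conf.get("hadoop.tmp.dir"));
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
      }
    }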

    --
    Pro Hadoop, a book to guide you from beginner to hadoop mastery,
    http://www.amazon.com/dp/1430219424?tag=jewlerymall
    www.prohadoopbook.com a community for Hadoop Professionals
  • David Greer at Oct 13, 2009 at 11:20 pm
    Jason Venner writes ...
    My first guess is that, for whatever reason, the user you are running the
    job as cannot write to the directory specified by hadoop.tmp.dir in your
    configuration. This is usually in the system temporary area and not an
    issue.

    Assuming the files are under /tmp/hadoop-david on the local filesystem,
    all seems to be fine:

    [[email protected] ~]$ cd /tmp/hadoop-david
    [[email protected] hadoop-david]$ mkdir foo
    [[email protected] hadoop-david]$ rmdir foo

    Using hadoop fs commands, all appears fine too:

    [[email protected] ~]$ hadoop fs -ls
    Found 1 items
    -rw-r--r-- 1 david supergroup 28081 2009-10-06 23:27 /user/david/docnotes.txt
    [[email protected] ~]$ hadoop fs -mkdir foo
    [[email protected] ~]$ hadoop fs -touchz foo/bar.txt
    [[email protected] ~]$ hadoop fs -rm foo/bar.txt
    Deleted hdfs://localhost/user/david/foo/bar.txt
    [[email protected] ~]$ hadoop fs -rmr foo
    Deleted hdfs://localhost/user/david/foo

    But trying to run MaxTemperature (this time with a valid input file
    and output directory), I still get an error:

    [[email protected] java]$ hadoop MaxTemperature sample.txt output
    09/10/13 16:18:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    09/10/13 16:18:29 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    Exception in thread "main" org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

    The error message includes "root". I wonder if this is a clue? I'm out of ideas.
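    (I assume the mapred.JobClient warning above is unrelated; as far as I can
    tell it just means the driver doesn't implement Tool, so generic options
    like -D and -conf aren't parsed. Roughly what that form would look like,
    for reference; the class name is my own:

    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MaxTemperatureDriver extends Configured implements Tool {
      public int run(String[] args) throws Exception {
        // Build the JobConf from getConf() and submit it here,
        // exactly as in the plain MaxTemperature main().
        return 0;
      }

      public static void main(String[] args) throws Exception {
        // ToolRunner parses -D/-conf/-fs/-jt before handing the rest to run().
        System.exit(ToolRunner.run(new MaxTemperatureDriver(), args));
      }
    }

    That doesn't explain the permission error, though.)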
  • Allen Wittenauer at Oct 13, 2009 at 11:25 pm

    On 10/13/09 4:19 PM, "David Greer" wrote:

    user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

    The error message includes "root". I wonder if this is a clue? I'm out of
    ideas.

    Check /tmp in hdfs.
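    If the shell is awkward, the same check can be done from Java against
    whatever fs.default.name points at (a sketch only; the class name is made
    up):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class TmpCheck {
      public static void main(String[] args) throws Exception {
        // This is the filesystem named by fs.default.name, i.e. HDFS,
        // not the local /tmp on the Linux box.
        FileSystem fs = FileSystem.get(new Configuration());
        FileStatus st = fs.getFileStatus(new Path("/tmp"));
        // Creating anything under /tmp needs WRITE on /tmp for the calling user.
        System.out.println(st.getPath() + " " + st.getOwner() + ":"
            + st.getGroup() + " " + st.getPermission());
      }
    }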
  • David Greer at Oct 13, 2009 at 11:41 pm
    Following the suggestion:

    [[email protected] java]$ hadoop fs -ls /tmp
    Found 1 items
    drwxr-xr-x - root supergroup 0 2009-10-06 15:15 /tmp/hadoop-root
    [[email protected] java]$ hadoop fs -touchz /tmp/foobar.txt
    touchz: org.apache.hadoop.security.AccessControlException: Permission denied: user=david, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x

    We seem to be on the right track. Before I change permissions on
    /tmp/hadoop-root, any idea why it didn't get created with the right
    permissions in the first place? I've let Hadoop create all of its HDFS
    directories itself. I assume that adding group write access should solve
    the problem.

    Thanks,

    David

  • Allen Wittenauer at Oct 13, 2009 at 11:48 pm
    The problem isn't /tmp/hadoop-root. The problem is likely /tmp itself not
    being world-writable: creating anything under /tmp requires WRITE
    permission on /tmp for the user running the job, and the exception names
    inode "tmp" owned by root with mode rwxr-xr-x.
  • David Greer at Oct 13, 2009 at 11:59 pm
    That doesn't appear to be the problem:

    drwxrwxrwt 39 root root 77824 Oct 13 16:31 tmp

    Cheers,

    David

  • Allen Wittenauer at Oct 14, 2009 at 12:01 am
    What does:

    hadoop dfs -ls / | grep tmp

    show you?

  • David Greer at Oct 14, 2009 at 5:24 am
    [[email protected] /]$ hadoop dfs -ls / | grep tmp
    drwxr-xr-x - root supergroup 0 2009-10-06 15:15 /tmp

  • Allen Wittenauer at Oct 14, 2009 at 3:28 pm

    On 10/13/09 10:23 PM, "David Greer" wrote:

    [[email protected] /]$ hadoop dfs -ls / | grep tmp
    drwxr-xr-x - root supergroup 0 2009-10-06 15:15 /tmp
    So /tmp in HDFS is not world-writable.

    hadoop dfs -chmod a+w /tmp should make it world-writable, and then your
    stuff should run.
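    (Why the job touches /tmp at all: as far as I can tell, with the default
    settings the job client creates its submission directory under
    ${hadoop.tmp.dir}/mapred/... on the default filesystem, which here means
    under /tmp/hadoop-david in HDFS, and only root can create entries under a
    root-owned /tmp with mode 755. If you'd rather make the change from code,
    the FileSystem API equivalent is roughly this; the class name is made up,
    and it must be run as the HDFS superuser, root here, just like the chmod:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class OpenTmp {
      public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // /tmp is currently 755, so adding a+w amounts to setting it to 777.
        fs.setPermission(new Path("/tmp"), new FsPermission((short) 0777));
        System.out.println(fs.getFileStatus(new Path("/tmp")).getPermission());
      }
    }

    The shell command is the simpler route; the API call is just the same
    operation expressed through org.apache.hadoop.fs.FileSystem.)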
  • David Greer at Oct 14, 2009 at 5:05 pm
    Fantastic, Allen. Your solution worked and my first Java MapReduce job ran.
    Thanks for all the help.

    David
