Permissions issue
Using a copy of the Cloudera security-enabled CDH3b3, we installed vanilla
Hadoop in /home/www/hadoop.

Now when I try to run a job as myself I get permission errors.
I am not even sure whether the error is in writing to local files, to HDFS,
or to wherever staging lives, but I need to set permissions to allow the
job to work.

Any bright ideas?


10/11/09 12:58:04 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=slewis, access=WRITE, inode="staging":www:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:207)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:188)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:136)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4019)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:3993)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:1914)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:1882)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:847)
    at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:342)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1350)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1346)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:742)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1344)

--
Steven M. Lewis PhD
4221 105th Ave Ne
Kirkland, WA 98033
206-384-1340 (cell)
Institute for Systems Biology
Seattle WA

  • Raj V at Nov 9, 2010 at 9:49 pm
    Steve

    (Todd Lipcon helped here).

    There are two users (hdfs and mapred) and one group (hadoop).

    All hdfs files are owned by user hdfs and belong to the hadoop group.
    All mapred files are owned by user mapred and belong to the hadoop group.

    For example (a shell sketch of this layout follows the list):
    1. hadoop.tmp.dir is /hadoop/tmp; permissions are /hadoop 775 hdfs hadoop
    and /hadoop/tmp 1777 hdfs hadoop.
    2. mapred.local.dir is /hadoop/local; permissions are 775 hdfs hadoop.
    3. mapred.system.dir is /mapred/system; permissions are 755 mapred system.
    4. dfs.data.dir is /hadoop/dfs/data; permissions are 755 hdfs hadoop.
    5. dfs.name.dir is /hadoop/dfs/name; permissions are 755 hdfs hadoop.
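
    For the local directories (items 1, 2, 4, and 5), a minimal setup
    sketch, assuming a Linux node and the paths above (adjust both to your
    own config; /mapred/system lives in HDFS and is handled by the commands
    further down):

        # run as root on each node
        mkdir -p /hadoop/tmp /hadoop/local /hadoop/dfs/data /hadoop/dfs/name
        chown -R hdfs:hadoop /hadoop
        chmod 775 /hadoop /hadoop/local
        chmod 1777 /hadoop/tmp    # sticky bit: anyone can write, only owners delete
        chmod 755 /hadoop/dfs/data /hadoop/dfs/name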

    Finally, to get it to work, you need to run:

    - sudo -u hdfs hadoop fs -mkdir /mapred
    - sudo -u hdfs hadoop fs -chown mapred /mapred
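
    To verify the result (hadoop fs -ls is standard; the expected owner
    comes from the chown above):

        sudo -u hdfs hadoop fs -ls /    # /mapred should now list owner mapred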

    For regular M/R jobs the user needs to belong to the hadoop group.
    For fsadmin tasks (formatting, fsck, and the like) you need to run them
    as the hdfs user.
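
    On a typical Linux box, adding a user to the hadoop group is one
    command; usermod is standard, and slewis is just the user from the
    error above:

        sudo usermod -a -G hadoop slewis   # log out and back in for it to take effect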

    Hope this works.

    Raj

  • Eli Collins at Nov 10, 2010 at 1:07 am
    Adding cdh-user@, BCC common-user@

    Hey Steve,

    Sounds like you need to chmod 777 the staging dir. By default
    mapreduce.jobtracker.staging.root.dir is ${hadoop.tmp.dir}/mapred/staging,
    but per the mapred configuration linked below, setting it to /user is
    better and should mean you don't need to do the above chmod.

    http://hadoop.apache.org/mapreduce/docs/current/mapred-default.html
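
    For concreteness, a sketch of that setting in mapred-site.xml (the
    property name and the /user value come from the docs above; where the
    file lives depends on your install):

        <property>
          <name>mapreduce.jobtracker.staging.root.dir</name>
          <value>/user</value>
        </property>

    Each user then stages under their own directory below /user, so it has
    to exist and be owned by them, e.g. (slewis is the user from the error):

        sudo -u hdfs hadoop fs -mkdir /user/slewis
        sudo -u hdfs hadoop fs -chown slewis /user/slewis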

    Thanks,
    Eli

  • Allen Wittenauer at Nov 9, 2010 at 11:00 pm

    On Nov 9, 2010, at 1:29 PM, Steve Lewis wrote:

    Using a copy of the Cloudera security-enabled CDH3b3, we installed
    vanilla Hadoop in /home/www/hadoop.

    Now when I try to run a job as myself I get permission errors.
    I am not even sure whether the error is in writing to local files, to
    HDFS, or to wherever staging lives, but I need to set permissions to
    allow the job to work.

    Any bright ideas?
    This is an HDFS permissions error. The top of the stack trace tells you pretty much what is going on:

    10/11/09 12:58:04 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
    Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=slewis, access=WRITE, inode="staging":www:supergroup:rwxr-xr-x
    User slewis can't write to the staging directory because:

    - the dir is owned by www
    - the dir permissions (rwxr-xr-x) let no one but the owner write

    If you aren't sure where the staging dir is, check the HDFS audit log.
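
    A sketch of the two obvious fixes, given that diagnosis (the staging
    path below is a placeholder; use whatever
    mapreduce.jobtracker.staging.root.dir resolves to on your cluster):

        # open the staging dir to everyone (quick but coarse), or...
        sudo -u hdfs hadoop fs -chmod 777 /path/to/staging

        # ...give slewis his own staging subdirectory
        sudo -u hdfs hadoop fs -mkdir /path/to/staging/slewis
        sudo -u hdfs hadoop fs -chown slewis /path/to/staging/slewis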
