Hi, I installed Cloudera Manager on my cluster with the HDFS, MapReduce, and
Oozie services. I am trying to run a wordcount streaming job with the
following command:

bin/hadoop jar /opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/lib/hadoop-mapreduce/hadoop-streaming-2.0.0-cdh4.2.0.jar \
    -file /home/srikanth/pythonex/mapper.py -mapper /home/srikanth/pythonex/mapper.py \
    -file /home/srikanth/pythonex/reducer.py -reducer /home/srikanth/pythonex/reducer.py \
    -input /user/hdfs/data/pg4300.txt -output /user/hdfs/output

I am getting the following error:

13/04/03 18:50:38 ERROR security.UserGroupInformation: PriviledgedActionException as:srikanth (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=srikanth, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4639)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4610)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2968)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2932)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2911)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:649)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:417)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44096)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

13/04/03 18:50:38 ERROR streaming.StreamJob: Error Launching job : Permission denied: user=srikanth, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4639)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4610)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2968)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2932)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2911)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:649)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:417)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44096)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

Streaming Command Failed!

I tried changing the ownership of the .py files from srikanth to hdfs, but I
am still getting the error. Can anyone please help me resolve this?
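
For reference, the ownership reported in the error can be confirmed with a
directory listing (a minimal sketch; actual output varies by cluster):

hadoop fs -ls /
# per the error, /user is owned by hdfs:supergroup with mode drwxr-xr-x,
# so user srikanth has no WRITE access directly under /user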

  • Deepak Gattala at Apr 3, 2013 at 3:28 am
    13/04/03 18:50:38 ERROR streaming.StreamJob: Error Launching job :
    Permission denied: user=srikanth, access=WRITE,
    inode="/user":hdfs:supergroup:drwxr-xr-x

    The above is your problem. I think you are logged in as srikanth and
    running the job as that user. Either do sudo su - hdfs (or log in as
    hdfs) and run the job, or grant yourself access by adding your user to
    the required group, or chmod 770 /user and join the group that owns
    /user; then you should be good.
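
    A sketch of those options as commands (assuming the HDFS superuser
    account is named hdfs; the group name "hadoop" below is only an
    illustration):

    # Option 1: run the job as the hdfs superuser
    sudo su - hdfs

    # Option 2: open /user to a group and add your user to it
    # (create the group first with groupadd if it does not exist)
    sudo -u hdfs hadoop fs -chgrp hadoop /user
    sudo -u hdfs hadoop fs -chmod 770 /user
    sudo usermod -a -G hadoop srikanth

    Note that with simple authentication, HDFS usually resolves group
    membership on the NameNode host, so the user must belong to the group
    there.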

    thanks
    Deepak Gattala

  • Darren Lo at Apr 3, 2013 at 2:27 pm
    You might just need to create the home directory /user/srikanth in
    HDFS. Generally, every user needs a home directory in HDFS. Creating a
    home directory for your user is preferred over running jobs as hdfs or
    chmodding /user, much as you normally don't run jobs as root or change
    permissions on /home.

    You can use Hue to easily set up your home directory by logging in as
    an HDFS admin (such as user "hdfs") and checking the appropriate option
    when creating a user in Hue. You can also accomplish this via the HDFS
    command line (using sudo -u hdfs).
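
    A minimal sketch of that command-line route (the directory name and
    ownership are assumptions based on the user in the error; job submission
    typically stages files under the submitting user's home directory, which
    is why the failing mkdirs on /user appears in the trace):

    sudo -u hdfs hadoop fs -mkdir /user/srikanth
    sudo -u hdfs hadoop fs -chown srikanth /user/srikanth
    hadoop fs -ls /user   # verify that /user/srikanth is owned by srikanth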

    Thanks,
    Darren

  • Amit Mor at Apr 3, 2013 at 2:49 pm
    Correct me if I am wrong, but if you are using CM and don't really mind
    security and file system permissions, you can un-tick dfs.permissions
    under the HDFS configuration section in CM. This sets permission
    checking to false and allows any user to write files to HDFS.
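
    For a cluster configured by hand rather than through CM, the equivalent
    hdfs-site.xml entry would be the following (a sketch; newer Hadoop
    releases name this property dfs.permissions.enabled, and disabling
    permission checking is only advisable on test clusters):

    <!-- hdfs-site.xml: disable HDFS permission checking entirely -->
    <property>
      <name>dfs.permissions</name>
      <value>false</value>
    </property>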

