Hi,

I'm reposting this from the Hive users group -- wondering if anybody here has seen this problem.

I am encountering an intermittent failure on my cluster related to an HDFS leaseholder issue.
This script (and the Oozie workflow that triggers it) runs just fine most of the time.

I'm running CDH4 and have tried both Hive 0.9.0 and the included 0.8.x; the issue is the same either way.

Any help is appreciated!



org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10006/1/emptyFile for DFSClient_NONMAPREDUCE_-502553752_1 on client 192.168.217.135 because current leaseholder is trying to recreate file.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

at org.apache.hadoop.ipc.Client.call(Client.java:1161)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
at $Proxy10.create(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
at $Proxy10.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:187)
at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1269)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1074)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1032)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:806)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:686)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:675)
at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.getHiveRecordWriter(HiveIgnoreKeyTextOutputFormat.java:80)
at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPath(ExecDriver.java:859)
at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPaths(ExecDriver.java:903)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:426)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
Job Submission failed with exception 'org.apache.hadoop.ipc.RemoteException(failed to create file /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10006/1/emptyFile for DFSClient_NONMAPREDUCE_-502553752_1 on client 192.168.217.135 because current leaseholder is trying to recreate file.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)
)'
java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
at org.apache.hadoop.fs.Path.<init>(Path.java:99)
at org.apache.hadoop.hive.ql.exec.Utilities.getHiveJobID(Utilities.java:379)
at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:192)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:476)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
FAILED: Hive Internal Error: java.lang.SecurityException(Intercepted System.exit(9))
java.lang.SecurityException: Intercepted System.exit(9)
at org.apache.oozie.action.hadoop.LauncherSecurityManager.checkExit(LauncherMapper.java:747)
at java.lang.Runtime.exit(Runtime.java:88)
at java.lang.System.exit(System.java:904)
at org.apache.hadoop.hive.ql.Driver.taskCleanup(Driver.java:1346)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1175)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:338)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:436)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:446)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:642)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:303)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:280)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:454)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.mapred.Child.main(Child.java:264)

Intercepting System.exit(12)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [12]
Starting Job = job_201209212239_20505, Tracking URL = http://qa-cassandra-node01:50030/jobdetails.jsp?jobid=job_201209212239_20505
Kill Command = /usr/lib/hadoop-0.20-mapreduce/bin/hadoop job -Dmapred.job.tracker=qa-cassandra-node01:8021 -kill job_201209212239_20505
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 12
log4j:WARN No appenders could be found for logger (org.apache.hadoop.hive.ql.exec.Task).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
2012-09-25 20:45:36,388 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201209212239_20505 with errors
Error during job, obtaining debugging information...
Examining task ID: task_201209212239_20505_m_000002 (and more) from job job_201209212239_20505



If you follow the link to the launcher's MapReduce job (http://hadoop-dev-node01.tendrilinc.com:50030/jobdetails.jsp?jobid=job_201208030244_27387),
it reports that Job Setup failed and all attempts have the following error:

Error initializing attempt_201209212239_20505_m_000002_0:
java.io.FileNotFoundException: File does not exist: /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10007/32679f09-38c1-44db-9659-bbfaa74026bd
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:736)
at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:180)
at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1368)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1359)
at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1274)
at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:2612)
at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:2576)
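For anyone else chasing this: one way to check whether the scratch file really is held by a live lease is fsck's -openforwrite option. This is an illustrative sketch, not something from the original thread -- the path comes from the trace above, and it has to run on a node with the hadoop CLI and cluster access:

```shell
# Illustrative diagnostic: list files under the Hive scratch directory that
# are still open for write (i.e. hold an active HDFS lease). Requires the
# hadoop CLI and a running cluster; degrades gracefully without them.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fsck /tmp/hive-mapred -openforwrite -files
else
  echo "hadoop CLI not found; run this on a cluster node"
fi
```

If the fsck output shows the `emptyFile` path as OPENFORWRITE, the previous attempt's lease never expired, which would explain the AlreadyBeingCreatedException on retry.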

--

  • Patrick Angeles at Sep 26, 2012 at 2:07 pm
    Haven't seen this before... my hunch is that it could be a problem with how
    your Oozie workflow is defined.
    On Tue, Sep 25, 2012 at 4:56 PM, Kate Bierbaum wrote:

  • Kevin O'dell at Sep 26, 2012 at 2:23 pm
    Are you using Scribe or anything else that has appends turned on?
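For reference, in Hadoop 0.20/1.x append support was gated by a flag in hdfs-site.xml; this fragment is shown purely for illustration (whether it is honored on CDH4 may differ), as something to check alongside Scribe's own config:

```xml
<!-- hdfs-site.xml: illustrative fragment only; whether this flag is
     honored depends on the Hadoop version in use -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```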
    On Wed, Sep 26, 2012 at 10:00 AM, Patrick Angeles wrote:

    Haven't seen this before... my hunch is that it could be a problem with
    how your Oozie workflow is defined.

    On Tue, Sep 25, 2012 at 4:56 PM, Kate Bierbaum wrote:

    Hi,

    I'm reposting this from the hive user's group -- wondering if anybody here has seen this problem.

    I am encountering an intermittent failure on my cluster related to an HDFS leaseholder issue..
    This script (and the oozie workflow that triggers it) can run just fine most of the time..

    Running CDH4 and have tried Hive 0.9.0 and the included 0.8.x. Same issue either way.

    Any help is appreciated!



    org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10006/1/emptyFile for DFSClient_NONMAPREDUCE_-502553752_1 on client 192.168.217.135 because current leaseholder is trying to recreate file.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

    at org.apache.hadoop.ipc.Client.call(Client.java:1161)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
    at $Proxy10.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
    at $Proxy10.create(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:187)
    at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1269)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1074)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1032)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:806)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:686)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:675)
    at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.getHiveRecordWriter(HiveIgnoreKeyTextOutputFormat.java:80)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPath(ExecDriver.java:859)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPaths(ExecDriver.java:903)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:426)
    at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
    Job Submission failed with exception 'org.apache.hadoop.ipc.RemoteException(failed to create file /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10006/1/emptyFile for DFSClient_NONMAPREDUCE_-502553752_1 on client 192.168.217.135 because current leaseholder is trying to recreate file.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)
    )'
    java.lang.IllegalArgumentException: Can not create a Path from an empty string
    at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
    at org.apache.hadoop.fs.Path.<init>(Path.java:99)
    at org.apache.hadoop.hive.ql.exec.Utilities.getHiveJobID(Utilities.java:379)
    at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:192)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:476)
    at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
    FAILED: Hive Internal Error: java.lang.SecurityException(Intercepted System.exit(9))
    java.lang.SecurityException: Intercepted System.exit(9)
    at org.apache.oozie.action.hadoop.LauncherSecurityManager.checkExit(LauncherMapper.java:747)
    at java.lang.Runtime.exit(Runtime.java:88)
    at java.lang.System.exit(System.java:904)
    at org.apache.hadoop.hive.ql.Driver.taskCleanup(Driver.java:1346)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1175)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:338)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:436)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:446)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:642)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
    at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:303)
    at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:280)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
    at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:454)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)

    Intercepting System.exit(12)
    Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [12]
    Starting Job = job_201209212239_20505, Tracking URL = http://qa-cassandra-node01:50030/jobdetails.jsp?jobid=job_201209212239_20505
    Kill Command = /usr/lib/hadoop-0.20-mapreduce/bin/hadoop job -Dmapred.job.tracker=qa-cassandra-node01:8021 -kill job_201209212239_20505
    Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 12
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.hive.ql.exec.Task).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
    2012-09-25 20:45:36,388 Stage-1 map = 100%, reduce = 100%
    Ended Job = job_201209212239_20505 with errors
    Error during job, obtaining debugging information...
    Examining task ID: task_201209212239_20505_m_000002 (and more) from job job_201209212239_20505



If you follow the link to the launched mapred job (
http://hadoop-dev-node01.tendrilinc.com:50030/jobdetails.jsp?jobid=job_201208030244_27387)
it reports that Job Setup failed and all attempts have the following error:

    Error initializing attempt_201209212239_20505_m_000002_0:
    java.io.FileNotFoundException: File does not exist: /tmp/hive-mapred/hive_2012-09-25_20-45-27_578_4149304906998798834/-mr-10007/32679f09-38c1-44db-9659-bbfaa74026bd
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:736)
    at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:180)
    at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1368)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1359)
    at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1274)
    at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:2612)
    at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:2576)

    --
    Kevin O'Dell
    Customer Operations Engineer, Cloudera

  • Kate Bierbaum at Sep 26, 2012 at 4:47 pm
So, I took your hint and checked out a Cloudera example of an Oozie
Hive action...

Previously we had:
<hive xmlns="uri:oozie:hive-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <prepare>
    </prepare>
    <configuration>
        <property>
            <name>oozie.hive.defaults</name>
            <value>../../hive/hive-default.xml</value>
        </property>
        <property>
            <name>hive.exec.parallel</name>
            <value>true</value>
        </property>
        <property>
            <name>hive.exec.parallel.thread.number</name>
            <value>24</value>
        </property>
    </configuration>

I added
<job-xml>../../hive/hive-default.xml</job-xml>
after <prepare> and that seemed to fix the problem. (Previously it failed
about 1 run in 5; it has now run 20 times without failing.)

It seemed redundant, since the same file was already referenced in the
configuration options, but I guess not!
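For reference, a sketch of the corrected action under that change. The element order and relative path come from the snippet above; the <script> element (here a hypothetical script.q) and the closing </hive> tag are assumptions, since the original quote was truncated before them:

```xml
<hive xmlns="uri:oozie:hive-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <prepare>
    </prepare>
    <!-- job-xml goes after prepare and before configuration;
         it makes Oozie ship hive-default.xml with each launcher,
         rather than relying on oozie.hive.defaults alone -->
    <job-xml>../../hive/hive-default.xml</job-xml>
    <configuration>
        <property>
            <name>oozie.hive.defaults</name>
            <value>../../hive/hive-default.xml</value>
        </property>
        <property>
            <name>hive.exec.parallel</name>
            <value>true</value>
        </property>
        <property>
            <name>hive.exec.parallel.thread.number</name>
            <value>24</value>
        </property>
    </configuration>
    <!-- hypothetical script name; not shown in the original post -->
    <script>script.q</script>
</hive>
```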

    Thanks for your help!
    On Wednesday, September 26, 2012 8:00:49 AM UTC-6, Patrick Angeles wrote:

    Haven't seen this before... my hunch is that it could be a problem with
    how your Oozie workflow is defined.

On Tue, Sep 25, 2012 at 4:56 PM, Kate Bierbaum <kbie...@tendrilinc.com> wrote:
Discussion Overview
group: cdh-user
categories: hadoop
posted: Sep 25, '12 at 8:56p
active: Sep 26, '12 at 4:47p
posts: 4
users: 3
website: cloudera.com
irc: #hadoop
