Dear Cloudera Team,
I am running a Hive job, and it fails with an exception.
This problem has been puzzling us for several days; it occurs almost every week.
Could anyone kindly explain why this error occurs and how we can fix it?
Thank you in advance.
The cluster environment:
CDH 4.2.1
hive-0.10.0-cdh4.2.1
NameNode HA: tongjihadoop1(Active NameNode), tongjihadoop11(Standby NameNode)
1. HiveServer error logs:
HiveServerException(errorCode=40000, message='Query returned non-zero code: 40000, cause: FAILED: RuntimeException java.io.IOException: Failed on local exception: java.nio.channels.ClosedByInterruptException; Host Details : local host is: "tongjihadoop28/10.32.21.28"; destination host is: "tongjihadoop11":8020; ', SQLState='42000')
HiveServerException(errorCode=40000, message='Query returned non-zero code: 40000, cause: FAILED: RuntimeException java.io.IOException: Failed on local exception: java.nio.channels.ClosedByInterruptException; Host Details : local host is: "tongjihadoop28/10.32.21.28"; destination host is: "tongjihadoop1":8020; ', SQLState='42000')
2013-12-11 03:46:19,704 WARN ipc.Client (Client.java:call(1203)) - interrupted waiting to send params to server
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1279)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:218)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:913)
at org.apache.hadoop.ipc.Client.call(Client.java:1198)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy15.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:407)
at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy16.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1487)
at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:356)
at org.apache.hadoop.hive.ql.Context.removeScratchDir(Context.java:236)
at org.apache.hadoop.hive.ql.Context.clear(Context.java:377)
at org.apache.hadoop.hive.ql.Driver.close(Driver.java:1476)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:186)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
2013-12-11 03:46:19,725 WARN retry.RetryInvocationHandler (RetryInvocationHandler.java:invoke(94)) - Exception while invoking class org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete. Not retrying because the invoked method is not idempotent, and unable to determine whether it was invoked
java.io.IOException: java.lang.InterruptedException
at org.apache.hadoop.ipc.Client.call(Client.java:1204)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy15.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:407)
at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy16.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1487)
at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:356)
at org.apache.hadoop.hive.ql.Context.removeScratchDir(Context.java:236)
at org.apache.hadoop.hive.ql.Context.clear(Context.java:377)
at org.apache.hadoop.hive.ql.Driver.close(Driver.java:1476)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:186)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1279)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:218)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:913)
at org.apache.hadoop.ipc.Client.call(Client.java:1198)
... 23 more
2013-12-11 03:46:19,726 WARN ql.Context (Context.java:removeScratchDir(238)) - Error Removing Scratch: java.io.IOException: java.lang.InterruptedException
at org.apache.hadoop.ipc.Client.call(Client.java:1204)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy15.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:407)
at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy16.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1487)
at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:356)
at org.apache.hadoop.hive.ql.Context.removeScratchDir(Context.java:236)
at org.apache.hadoop.hive.ql.Context.clear(Context.java:377)
at org.apache.hadoop.hive.ql.Driver.close(Driver.java:1476)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:186)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1279)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:218)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:913)
at org.apache.hadoop.ipc.Client.call(Client.java:1198)
... 23 more
2013-12-11 03:46:19,807 WARN ipc.Client (Client.java:call(1203)) - interrupted waiting to send params to server
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1279)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:218)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:913)
at org.apache.hadoop.ipc.Client.call(Client.java:1198)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy15.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:425)
at sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy16.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2121)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2092)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:546)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1902)
at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:164)
at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:225)
at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:318)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:4648)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:6811)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:6721)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:7454)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8131)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:443)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:347)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:908)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
2013-12-11 03:46:19,809 WARN retry.RetryInvocationHandler (RetryInvocationHandler.java:invoke(117)) - Exception while invoking mkdirs of class ClientNamenodeProtocolTranslatorPB. Trying to fail over immediately.
2013-12-11 03:46:19,811 WARN retry.RetryInvocationHandler (RetryInvocationHandler.java:invoke(117)) - Exception while invoking mkdirs of class ClientNamenodeProtocolTranslatorPB after 1 fail over attempts. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.nio.channels.ClosedByInterruptException; Host Details : local host is: "tongjihadoop10/10.32.21.10"; destination host is: "tongjihadoop11":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
at org.apache.hadoop.ipc.Client.call(Client.java:1229)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy15.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:425)
at sun.reflect.GeneratedMethodAccessor46.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy16.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2121)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2092)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:546)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1902)
at org.apache.hadoop.hive.ql.exec.ExecDriver.createTmpDirs(ExecDriver.java:223)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:445)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.nio.channels.ClosedByInterruptException
at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:184)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:511)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:193)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:525)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:499)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:593)
at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:241)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1278)
at org.apache.hadoop.ipc.Client.call(Client.java:1196)
... 30 more
Job Submission failed with exception 'java.io.IOException(Failed on local exception: java.nio.channels.ClosedByInterruptException; Host Details : local host is: "tongjihadoop10/10.32.21.10"; destination host is: "tongjihadoop11":8020; )'
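For context on the exception above (this is not from our logs, just a minimal sketch of the JDK behavior): java.nio's interruptible channels throw ClosedByInterruptException when the calling thread is interrupted during, or has its interrupt status set before, a blocking I/O call. That matches the SocketChannelImpl.connect / AbstractInterruptibleChannel.end frames in the trace, i.e. the RPC thread was interrupted while connecting to the NameNode. The class name, method, and target address below are illustrative only.

```java
import java.net.InetSocketAddress;
import java.nio.channels.ClosedByInterruptException;
import java.nio.channels.SocketChannel;

public class InterruptDemo {
    // Connect with the thread's interrupt status already set, mimicking a
    // HiveServer worker thread being interrupted mid-RPC. Returns the simple
    // name of the exception the channel raises.
    static String interruptedConnect() throws Exception {
        SocketChannel ch = SocketChannel.open();
        Thread.currentThread().interrupt(); // simulate the interrupt seen in the logs
        try {
            // Address is arbitrary; the pending interrupt closes the channel first.
            ch.connect(new InetSocketAddress("127.0.0.1", 8020));
            return "none";
        } catch (ClosedByInterruptException e) {
            return e.getClass().getSimpleName();
        } finally {
            Thread.interrupted(); // clear the interrupt status again
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(interruptedConnect());
    }
}
```

So the ClosedByInterruptException itself is a symptom: something interrupted the thread (e.g. query cancellation or Driver.close, as in the delete/removeScratchDir trace), and the in-flight NameNode RPC then failed.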
2. NameNode error logs:
2013-12-11 03:43:10,730 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000003_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1022005432_1 does not have any open files.
2013-12-11 03:43:10,730 INFO org.apache.hadoop.ipc.Server: IPC Server handler 578 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.complete from 10.32.21.25:40530: error: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000003_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1022005432_1 does not have any open files.
org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000003_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1022005432_1 does not have any open files.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2419)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2410)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:2478)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:2455)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.complete(NameNodeRpcServer.java:535)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.complete(ClientNamenodeProtocolServerSideTranslatorPB.java:335)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44084)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_-5428338368977879777_18430422 10.32.21.28:50010 10.32.21.25:50010 10.32.21.41:50010 10.32.21.21:50010 10.32.21.12:50010 10.32.21.32:50010 10.32.21.33:50010 10.32.21.29:50010 10.32.21.37:50010 10.32.21.43:50010
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_-8989026595130721770_18430424 10.32.21.28:50010 10.32.21.12:50010 10.32.21.16:50010 10.32.21.22:50010 10.32.21.21:50010 10.32.21.37:50010 10.32.21.30:50010 10.32.21.32:50010 10.32.21.14:50010 10.32.21.9:50010
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_4688728492859260894_18430426 10.32.21.28:50010 10.32.21.26:50010 10.32.21.39:50010
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_6666893292808536198_18430428 10.32.21.28:50010 10.32.21.37:50010 10.32.21.40:50010
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_-3847187288385873412_18430418 10.32.21.28:50010 10.32.21.8:50010 10.32.21.41:50010 10.32.21.13:50010 10.32.21.42:50010 10.32.21.21:50010 10.32.21.35:50010 10.32.21.32:50010 10.32.21.14:50010 10.32.21.26:50010
2013-12-11 03:43:10,737 INFO BlockStateChange: BLOCK* addToInvalidates: blk_7477012993476457763_18430420 10.32.21.28:50010 10.32.21.36:50010 10.32.21.18:50010 10.32.21.11:50010 10.32.21.46:50010 10.32.21.39:50010 10.32.21.31:50010 10.32.21.38:50010 10.32.21.9:50010 10.32.21.12:50010
2013-12-11 03:43:10,762 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000039_0. BP-1616094619-10.32.21.1-1369297210855 blk_-2415267581369766488_18430652{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[10.32.21.33:50010|RBW], ReplicaUnderConstruction[10.32.21.42:50010|RBW], ReplicaUnderConstruction[10.32.21.22:50010|RBW]]}
2013-12-11 03:43:10,796 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* allocateBlock: /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000022_0. BP-1616094619-10.32.21.1-1369297210855 blk_-7115905787522100678_18430653{blockUCState=UNDER_CONSTRUCTION, primaryNodeIndex=-1, replicas=[ReplicaUnderConstruction[10.32.21.41:50010|RBW], ReplicaUnderConstruction[10.32.21.36:50010|RBW], ReplicaUnderConstruction[10.32.21.8:50010|RBW]]}
2013-12-11 03:43:10,827 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000066_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1576277036_1 does not have any open files.
2013-12-11 03:43:10,827 INFO org.apache.hadoop.ipc.Server: IPC Server handler 433 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.addBlock from 10.32.21.32:7479: error: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000066_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1576277036_1 does not have any open files.
org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-hadoop/hive_2013-12-11_03-39-52_021_2074940431677688659/_task_tmp.-ext-10002/_tmp.000066_0 File does not exist. Holder DFSClient_NONMAPREDUCE_1576277036_1 does not have any open files.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2419)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2410)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2203)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:480)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:297)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44080)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)
Thanks and best wishes!
James, Beijing, China
2013-12-11