Grokbase Groups Hive user June 2011
I often, but not always, get the following ClosedByInterruptException when running a query with hive.exec.parallel=true. It seems to happen only when two MR jobs are being launched in parallel. I doubt I'm the first person to have seen this error in this scenario, but googling didn't turn up anything. Any ideas?

I am using a pre-release version of Hive 0.7.
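For reference, this is roughly how the session is set up. The SET statements are the relevant settings; the query itself is just a simplified placeholder (my real query is more involved), shown only because a UNION ALL of independent subqueries is the kind of plan that makes Hive launch MR jobs in parallel:

```sql
-- Run independent MR stages of a query in parallel (Hive 0.7-era settings).
SET hive.exec.parallel=true;
-- Max number of stages submitted concurrently (assumed default: 8).
SET hive.exec.parallel.thread.number=8;

-- Placeholder query: the two sides of the UNION ALL compile to
-- independent MR jobs that Hive submits in parallel.
SELECT * FROM (
  SELECT key, COUNT(*) AS cnt FROM src  GROUP BY key
  UNION ALL
  SELECT key, COUNT(*) AS cnt FROM src2 GROUP BY key
) u;
```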

Ended Job = job_201106080050_0253
Launching Job 2 out of 3
Launching Job 3 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
java.io.IOException: Call to ip-10-78-97-240.ec2.internal/10.78.97.240:9001 failed on local exception: java.nio.channels.ClosedByInterruptException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at org.apache.hadoop.mapred.$Proxy9.getNewJobId(Unknown Source)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:804)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:777)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:660)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:121)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:128)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
Caused by: java.nio.channels.ClosedByInterruptException
at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:184)
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:341)
at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
at java.io.DataOutputStream.flush(DataOutputStream.java:106)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:480)
at org.apache.hadoop.ipc.Client.call(Client.java:721)
... 9 more
Job Submission failed with exception 'java.io.IOException(Call to ip-10-78-97-240.ec2.internal/10.78.97.240:9001 failed on local exception: java.nio.channels.ClosedByInterruptException)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask

Thanks.
Steven

Discussion Overview
group: user @
categories: hive, hadoop
posted: Jun 8, '11 at 7:26p
active: Jun 8, '11 at 7:26p
posts: 1
users: 1
website: hive.apache.org

1 user in discussion: Steven Wong (1 post)