Hi,

I'm using the latest Hive 0.7 release to test parallel execution with dynamic
partitions in a multi-table insert. Below is sample pseudo code along with the
actual error message. Is it possible in Hive to combine parallel execution
with dynamic partitions and multiple table inserts?

Thanks,
-ray


set hive.exec.parallel=true;

set hive.merge.mapfiles=false;
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
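
For context: dynamic-partition inserts require each target table to be
partitioned on the dynamic columns. A minimal sketch of the assumed DDL for
one of the targets below (table and column names taken from the query; the
types and the sum_x column name are guesses):

create table foo2 (a string, sum_x bigint)
partitioned by (dt string, sub string);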


from foo
insert overwrite table foo2 partition (dt, sub)
select
  a,
  sum(x),
  dt,
  sub
group by
  a, dt, sub
insert overwrite table foo3 partition (dt, sub)
select
  a,
  b,
  sum(x),
  dt,
  sub
group by
  a, b, dt, sub
insert overwrite table foo4 partition (dt, sub)
select
  a,
  b,
  c,
  sum(x),
  dt,
  sub
group by
  a, b, c, dt, sub
...


Error Message:
Launching Job 2 out of 6
Launching Job 3 out of 6
Launching Job 4 out of 6
Launching Job 5 out of 6
Launching Job 6 out of 6
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
(the same "Number of reduce tasks not specified" message is printed once for
each of the remaining jobs)
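
The hints in that log can be acted on per session; for example, to pin the
reducer count to a fixed value (8 here is purely illustrative):

set mapred.reduce.tasks=8;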
[Warning] could not update stats.
java.io.IOException: java.lang.InterruptedException
at org.apache.hadoop.ipc.Client.call(Client.java:796)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:221)
at org.apache.hadoop.mapred.$Proxy8.getNewJobId(Unknown Source)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:798)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:771)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:620)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:121)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:117)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
Caused by: java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1215)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:218)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:544)
at org.apache.hadoop.ipc.Client.call(Client.java:790)
... 9 more
Job Submission failed with exception 'java.io.IOException(java.lang.InterruptedException)'

Discussion Overview
group: user@hive.apache.org
categories: hive, hadoop
posted: Oct 25, '10 at 12:10a
active: Oct 25, '10 at 12:10a
posts: 1
users: 1 (Ray Duong)
website: hive.apache.org
