Hi,
I recently installed Cloudera Manager 4.1 Free Edition;
I installed CDH4 through the Manager on 4 nodes successfully.
I can run MapReduce jobs through the command line, but when I try to run a
Hive query through the Hue server:
SELECT * FROM jamesjoyce
WHERE count > 100 SORT BY count ASC
LIMIT 10
I get the following:
Driver returned: 2. Errors: Hive history file=/tmp/hue/hive_job_log_hue_201212010833_733610235.txt
Total MapReduce jobs = 2
Launching Job 1 out of 2
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
Starting Job = job_201211271304_0003, Tracking URL = http://cdhnode3.aar.cisco.com:50030/jobdetails.jsp?jobid=job_201211271304_0003
Kill Command = /usr/lib/hadoop/bin/hadoop job -Dmapred.job.tracker=cdhnode3.aar.cisco.com:8021 -kill job_201211271304_0003
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2012-12-01 08:34:06,554 Stage-1 map = 0%, reduce = 0%
2012-12-01 08:34:28,968 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201211271304_0003 with errors
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 Reduce: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
Looking at the log file on the TaskTracker, I see this error:
java.io.IOException: Could not create job user log directory: file:/var/log/hadoop-0.20-mapreduce/userlogs/job_201211271304_0001
    at org.apache.hadoop.mapred.JobLocalizer.initializeJobLogDir(JobLocalizer.java:241)
    at org.apache.hadoop.mapred.DefaultTaskController.initializeJob(DefaultTaskController.java:225)
    at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1415)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1390)
    at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1305)
    at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:2722)
    at org.apache.hadoop.mapred.TaskTracker$TaskLauncher.run(TaskTracker.java:2686)
I guess it is a permissions problem.
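Since the TaskTracker could not create
file:/var/log/hadoop-0.20-mapreduce/userlogs/job_201211271304_0001, my next step was
going to be checking the ownership of that userlogs directory on the failing node.
Here is a rough sketch of what I plan to run (assuming the TaskTracker runs as the
mapred user, which I believe is the CDH package default but have not confirmed on my
cluster):

# Check who owns the MRv1 log directories on the TaskTracker node (cdhnode3 here)
ls -ld /var/log/hadoop-0.20-mapreduce /var/log/hadoop-0.20-mapreduce/userlogs

# If userlogs is not writable by the TaskTracker user, I assume something like
# this would fix it (mapred:mapred is my guess at the right owner/group):
sudo chown -R mapred:mapred /var/log/hadoop-0.20-mapreduce/userlogs
sudo chmod 755 /var/log/hadoop-0.20-mapreduce/userlogs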
I also created an hdfs user in Hue as a superuser.
Any ideas on how to fix this?
Thank you
Rui Vaz