We just installed Cloudera CDH 4.1, and I've been focused on being able to
write a simple table into Hive.
My configuration tasks so far:
I added the mysql-connector-java-5.1.22-bin.jar to the /usr/lib/hive/lib
directory.
I built the MySQL remote metastore database.
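For completeness, the database build was along these lines (a sketch from memory; the schema script name and path, and the exact privilege list, depend on the CDH/Hive version installed):

% mysql -u root -p
mysql> CREATE DATABASE metastore;
mysql> USE metastore;
mysql> SOURCE /usr/lib/hive/scripts/metastore/upgrade/mysql/hive-schema-0.9.0.mysql.sql;
mysql> CREATE USER 'hive'@'%' IDENTIFIED BY 'mypassword';
mysql> GRANT SELECT, INSERT, UPDATE, DELETE ON metastore.* TO 'hive'@'%';
mysql> FLUSH PRIVILEGES;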
Because the Cloudera distribution has already added the Beeswax server and
the /user/beeswax/warehouse directory, I modified hive-site.xml by
commenting out the Derby metastore settings and adding the following lines
to that file:

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/beeswax/warehouse</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://myhost/metastore</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>mypassword</value>
</property>

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>

<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
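One thing I was unsure about is whether clients also need to be pointed at the standalone metastore via hive.metastore.uris, something like the following (thrift://myhost:9083 assumes the default metastore port):

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://myhost:9083</value>
</property>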
And I have launched the metastore service in a separate SSH session:
%hive --service metastore
Starting Hive Metastore Server
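A quick way to sanity-check that the service is actually up (assuming the stock Thrift port of 9083):

% netstat -an | grep 9083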


I can get Hive to respond to a "show tables;" request, but when I try the
following (from Hadoop: The Definitive Guide, p. 415):
% echo 'X' > /home/rway/dummy.txt
% hive -e "create table dummy (value string); \
load data local inpath '/home/rway/dummy.txt' \
overwrite into table dummy"

I get the following console response:

Logging initialized using configuration in
file:/etc/hive/conf.dist/hive-log4j.properties
Hive history file=/tmp/rway/hive_job_log_rway_201301271750_629052072.txt
FAILED: Error in metadata: MetaException(message:Got exception:
org.apache.hadoop.security.AccessControlException Permission denied:
user=rway, access=WRITE, inode="/user/beeswax/warehouse":hue:hive:drwxrwxr-x

This puzzles me. There is an HDFS directory called
/user/beeswax/warehouse, and its permissions are 775. I also added my
userid to the hdfs group on the host, just in case that was needed, though
I wasn't sure.
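One detail I can check, in case it's relevant: the error shows the warehouse directory is owned by hue:hive, so group-write (775) only helps if my userid is actually in the hive group, and I believe HDFS resolves group membership on the NameNode host. A sketch of the checks (usermod here is the standard Linux command; the new group only shows up in fresh sessions):

% id rway                          # which groups does my userid actually have?
% hadoop fs -ls /user/beeswax      # confirm the warehouse dir's owner and group
% sudo usermod -a -G hive rway     # add my userid to the hive group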
My userid can do the following with no issue:
% hadoop fs -put '/home/rway/dummy.txt'
and when I look at the HDFS file system browser, the file dummy.txt is
certainly in HDFS.

I've seen other posts on this kind of topic, but everything I am doing
seems pretty plain vanilla. Thanks in advance for any help!

Robin Way
