Grokbase Groups Hive user May 2011
Hi,
The default value of hive.metastore.warehouse.dir is "/user/hive/warehouse" in
hive-site.xml. After I changed the directory to a path on HDFS, I got the
following exception:

FAILED: Error in metadata: MetaException(message:Got exception:
org.apache.hadoop.security.AccessControlException
org.apache.hadoop.security.AccessControlException: Permission denied:
user=root, access=WRITE, inode="output
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask

Is this failure related to hadoop-site.xml or to something else?
Thanks for your help.

--
dujinhang
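
The error above is an HDFS permission check failing for user "root". A quick first
step is to list the parent of the new warehouse directory to see its owner and mode;
a minimal sketch, with "/path/to" standing in for the parent of the real (truncated)
path:

$HADOOP_HOME/bin/hadoop fs -ls /path/to
# Sample line: drwxr-xr-x  - hdfs supergroup  0 2011-05-17 /path/to/new_warehouse
# No write bit for user "root" on that directory would produce exactly the
# AccessControlException quoted above.
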

  • Viral Bajaria at May 18, 2011 at 5:23 am
    Check your dfs.permissions setting in hdfs-site.xml; I am guessing it's set to
    true.

    If that's the case and you point the Hive warehouse dir to an existing path
    in HDFS, the chances are that the user that runs the Hive jobs does not have
    permissions on that path.

    -Viral
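
    A quick way to confirm the setting Viral mentions (a sketch; the conf directory
    below is an assumption, adjust it to your install):

    grep -A1 "dfs.permissions" $HADOOP_HOME/conf/hdfs-site.xml
    # Expected lines if permission checking is on:
    #   <name>dfs.permissions</name>
    #   <value>true</value>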
  • Ankit Jain at May 18, 2011 at 5:26 am
    Hi,

    At installation time we perform the following steps:

    $HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse
    $HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse

    Replace /user/hive/warehouse with your new path.

    Example:
    $HADOOP_HOME/bin/hadoop fs -mkdir /com/impetus/data
    $HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data

    Then your data will be stored in the /com/impetus/data directory.

    Regards,
    Ankit Jain

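    Once the directory exists and is group-writable, hive.metastore.warehouse.dir
    in hive-site.xml should point at it. A quick sanity check (the conf path is an
    assumption; the value is the path from Ankit's example):

    grep -A1 "hive.metastore.warehouse.dir" $HIVE_HOME/conf/hive-site.xml
    #   <name>hive.metastore.warehouse.dir</name>
    #   <value>/com/impetus/data</value>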
  • Ankit Jain at May 18, 2011 at 5:30 am
    Hi,

    If the path already exists in HDFS, then you need to perform the following step:

    $HADOOP_HOME/bin/hadoop fs -chmod g+w Exist_Path_hdfs

    Example:
    $HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data
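
    If the existing path is owned by another user and the account running Hive is
    not in that path's group, g+w alone will not help. An alternative, not from the
    thread and only possible as the HDFS superuser, is to hand over ownership:

    $HADOOP_HOME/bin/hadoop fs -chown -R root /com/impetus/data
    # "root" is the user from the error message; substitute whichever account
    # actually runs the Hive queries.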
  • Jov at May 18, 2011 at 5:32 am

    Make sure you can READ/WRITE to the new path. To grant the permissions, see
    http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#chmod
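
    The chmod described on that page accepts both symbolic and octal modes; for
    example, reusing the path from Ankit's sketch:

    $HADOOP_HOME/bin/hadoop fs -chmod -R g+w /com/impetus/data
    $HADOOP_HOME/bin/hadoop fs -chmod -R 775 /com/impetus/data   # octal form: rwxrwxr-x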
  • Ted Yu at May 18, 2011 at 5:38 am
    Can you try as user hadoop?

    Cheers


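    One way to act on this suggestion, assuming the client machine has a local
    "hadoop" account and sudo available (both assumptions), is to rerun a small DDL
    statement as that user and compare:

    sudo -u hadoop $HIVE_HOME/bin/hive -e "CREATE TABLE probe_t (c STRING);"
    # probe_t is a throwaway name; if this succeeds while the same statement fails
    # as root, the problem is purely HDFS permissions on the warehouse directory.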
  • Jinhang du at May 18, 2011 at 6:49 am
    Thanks for your answers.

    I run my Hive client console on a machine that doesn't belong to the Hadoop
    cluster.
    I just changed "fs.default.name" and "mapred.job.tracker" in hive-site.xml
    to connect to the Hadoop cluster.
    I can create tables, and the files are created on HDFS.
    When I changed hive.metastore.warehouse.dir, the exception appeared.

    The account I use to access the Hadoop cluster doesn't have the authority to
    write to the path "/user/hive/warehouse".
    The owner of that directory is the superuser; it was created by someone else.

    Is there something wrong with my operations or with my understanding of Hive?
    Can Hive act purely as a client, without any changes to the Hadoop
    configuration?

    Thanks.

    --
    dujinhang
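
    A quick check of this remote-client setup (the namenode host and port below are
    placeholders, not values from the thread):

    $HADOOP_HOME/bin/hadoop fs -ls hdfs://namenode-host:9000/user/hive/warehouse
    # If listing works but CREATE TABLE fails, the client-side hive-site.xml is
    # fine and the problem is the write permission on the warehouse directory.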
  • Jinhang du at May 19, 2011 at 4:54 am
    I made the directory set in 'hive.metastore.warehouse.dir' writable for all
    users on HDFS, and the problem is solved: tables can now be created in the
    directory I specified. But this operation is not safe.

    I don't understand the access control strategy between Hive and HDFS.
    Will you help me?

    thx.

    --
    dujinhang
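
    With this kind of setup Hive simply writes into HDFS as the client user, so
    access control comes down to ordinary HDFS file permissions. A less permissive
    arrangement than world-writable is group-based access; a sketch, assuming a
    "hive" group exists and every account that runs Hive belongs to it:

    $HADOOP_HOME/bin/hadoop fs -chgrp -R hive /user/hive/warehouse
    $HADOOP_HOME/bin/hadoop fs -chmod -R 775 /user/hive/warehouse
    # rwx for owner and group, r-x for others: only members of the "hive" group
    # can create tables under the warehouse directory. Substitute the directory
    # you actually configured if it is not the default.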
