FAQ
Supervisor returned FATAL: + eval 'OLD_VALUE=$HADOOP_CLASSPATH'
++ OLD_VALUE='/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
+ '[' -z '/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::' ']'
+ export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
+ HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
+ set -x
+ perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER#g' /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml
+ '[' -e /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/topology.py ']'
+ export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true '
+ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true '
+ acquire_kerberos_tgt mapred.keytab
+ '[' -z mapred.keytab ']'
+ '[' -n '' ']'
+ '[' tasktracker = jobtracker ']'
+ '[' tasktracker = tasktracker ']'
+ chmod 0644 /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/log4j.properties
+ exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker






Does anyone have any idea about this?

regards
Saurav


  • Adam Smieszny at Jan 29, 2013 at 12:50 pm
    Can you share the contents of the TaskTracker log? Once the command
    executes as below, the detail we need will show up in the role log file
    (which you can get to via CM in this case)
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config
    /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker

    On Tue, Jan 29, 2013 at 7:39 AM, Saurav Sinha wrote:



    [quoted startup log snipped]


    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156
  • Adam Smieszny at Jan 29, 2013 at 2:22 pm
    Hi Saurav,

    Please continue to CC the group, as then you will get more support.

    I see the following line in your log file:
    2013-01-29 05:53:44,048 INFO org.apache.hadoop.mapred.TaskTracker:
    Tasktracker disallowed by JobTracker.

    That seems out of place to me.

    I'm wondering if it might be a similar situation to this on the CDH mailing
    list
    https://groups.google.com/a/cloudera.org/d/topic/cdh-user/PRy7GynUI6U/discussion

    Or this thread:
    http://grokbase.com/t/cloudera/scm-users/12b894bz1e/tasktracker-disallowed-by-jobtracker-error-message-on-all-nodes

    Please check your networking configuration - are you using DNS? Are you
    using /etc/hosts file? What does /etc/hosts contain?

    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 9:17 AM, Saurav Sinha wrote:

    Hi,

    I have attached the TT log file from
    /var/log/hadoop-0.20-mapreduce/hadoop-cmf-mapreduce1-TASKTRACKER-localhost.localdomain.log.out

    The suggested location,
    /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER,
    contains the Hadoop configuration files, and the command is still not
    working.
    Saurav

    On Tue, Jan 29, 2013 at 4:50 AM, Adam Smieszny wrote:

    [quoted messages snipped]

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156
  • Saurav Sinha at Jan 29, 2013 at 6:32 pm
    Hi Adam,

    Thanks for your advice and the links. I got my TaskTracker up and
    running by changing the file /etc/hosts

    from
    127.0.0.1 localhost localhost.localdomain localhost4
    localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    to

    127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4
    localhost
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    Now I am facing another issue: I am not able to put anything into HDFS,
    and the hadoop ls command is not working as shown in training.

    Here is the output of the hadoop commands I tried:

    [saurav@localhost ~]$ hadoop fs -ls
    ls: `.': No such file or directory

    [saurav@localhost ~]$ hadoop fs -ls /
    Found 3 items
    drwxr-xr-x - hbase hbase 0 2013-01-29 10:17 /hbase
    drwxrwxrwt - hdfs hdfs 0 2013-01-26 07:43 /tmp
    drwxr-xr-x - hdfs supergroup 0 2013-01-26 07:45 /user

    hadoop fs -put Documents /user/Documents
    put: Permission denied: user=saurav, access=WRITE,
    inode="/user":hdfs:supergroup:drwxr-xr-x

    Can you help me with this also?

    I will be grateful for your reply.

    regards,
    Saurav
    On Tue, Jan 29, 2013 at 6:22 AM, Adam Smieszny wrote:

    [quoted messages snipped]
  • Adam Smieszny at Jan 29, 2013 at 9:15 pm
    Hi Saurav,

    You will need to become the HDFS user to change the permissions within
    HDFS. HDFS works very much like the Linux filesystem, so you need to
    create directories and modify owners/permissions in order to be able to
    create files.

    Do the following to create the directory that you are attempting to
    write to in your test above, and assign the owner to your own user
    "saurav":

    sudo su - hdfs
    hadoop fs -mkdir /user/Documents
    hadoop fs -chown saurav /user/Documents
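
    The "ls: `.': No such file or directory" error from the earlier bare
    "hadoop fs -ls" is related: with no path argument, the command lists
    your HDFS home directory (/user/<username>), which does not exist yet.
    A sketch of creating it as well, assuming your login user is "saurav"
    and that you have sudo access to the hdfs account (both assumptions):

```shell
# Create saurav's HDFS home directory as the hdfs superuser,
# then hand ownership to saurav so relative paths resolve.
sudo -u hdfs hadoop fs -mkdir /user/saurav
sudo -u hdfs hadoop fs -chown saurav /user/saurav
```

    After this, a bare "hadoop fs -ls" run as saurav should list
    /user/saurav instead of failing.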


    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 1:31 PM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your advice and link support. Got my tasktraker up and running
    by cnaging the file /etc/hosts

    from
    127.0.0.1 localhost localhost.localdomain localhost4
    localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    to

    127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4
    localhost
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    Now I am facing another issue I am not able to put ant thing to HDFS and
    the hadoop ls command not working as told in traing

    there the output of the hadoop commad I tried

    [saurav@localhost ~]$ hadoop fs -ls
    ls: `.': No such file or directory

    [saurav@localhost ~]$ hadoop fs -ls /
    Found 3 items
    drwxr-xr-x - hbase hbase 0 2013-01-29 10:17 /hbase
    drwxrwxrwt - hdfs hdfs 0 2013-01-26 07:43 /tmp
    drwxr-xr-x - hdfs supergroup 0 2013-01-26 07:45 /user

    hadoop fs -put Documents /user/Documents
    put: Permission denied: user=saurav, access=WRITE,
    inode="/user":hdfs:supergroup:drwxr-xr-x

    Can you help me with this also

    I will be greatful to you for your reply

    regards,
    Saurav
    On Tue, Jan 29, 2013 at 6:22 AM, Adam Smieszny wrote:

    Hi Saurav,

    Please continue to CC the group, as then you will get more support.

    I see the following line in your log file:
    2013-01-29 05:53:44,048 INFO org.apache.hadoop.mapred.TaskTracker:
    Tasktracker disallowed by JobTracker.

    That seems out of place to me.

    I'm wondering if it might be a similar situation to this on the CDH
    mailing list

    https://groups.google.com/a/cloudera.org/d/topic/cdh-user/PRy7GynUI6U/discussion

    Or this thread:

    http://grokbase.com/t/cloudera/scm-users/12b894bz1e/tasktracker-disallowed-by-jobtracker-error-message-on-all-nodes

    Please check your networking configuration - are you using DNS? Are you
    using /etc/hosts file? What does /etc/hosts contain?

    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 9:17 AM, Saurav Sinha wrote:

    Hi,

    I have attched the TT log file from the location *
    /var/log/hadoop-0.20-mapreduce/hadoop-cmf-mapreduce1-TASKTRACKER-localhost.localdomain.log.out
    *

    The the suggested location contains the configuration file of the hadoop
    ie
    */var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER
    tasktracker*

    and the command is not working.

    regards,
    Saurav

    On Tue, Jan 29, 2013 at 4:50 AM, Adam Smieszny wrote:

    Can you share the contents of the TaskTracker log? Once the command
    executes as below, the detail we need will show up in the role log file
    (which you can get to via CM in this case)
    *+ exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config
    /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker
    *

    On Tue, Jan 29, 2013 at 7:39 AM, Saurav Sinha wrote:



    Supervisor returned FATAL: + eval 'OLD_VALUE=$HADOOP_CLASSPATH'
    ++ OLD_VALUE='/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + '[' -z '/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::' ']'
    + export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + set -x
    + perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER#g' /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml
    + '[' -e /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/topology.py ']'
    + export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true '
    + HADOOP_OPTS='-Djava.net.preferIPv4Stack=true '
    + acquire_kerberos_tgt mapred.keytab
    + '[' -z mapred.keytab ']'
    + '[' -n '' ']'
    + '[' tasktracker = jobtracker ']'
    + '[' tasktracker = tasktracker ']'
    + chmod 0644 /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/log4j.properties
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker






    Can any one have any idea about it.

    regards




    Saurav


    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156
  • Saurav Sinha at Jan 30, 2013 at 6:57 pm
    Hi Adam,

    Thanks for your help and advice; my TaskTracker is now up and running,
    and I am able to access HDFS with CLI commands.

    Next I started Impala, following the steps in the Cloudera Manager 4.5
    guide, and updated the Hive and HDFS settings.

    Now when I restart the Impala service, the Impala daemon is in a bad
    state. These are the logs I get from the manager:

    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(Catalog.java:113)
    at com.cloudera.impala.catalog.Catalog.<init>(Frontend.java:86)
    at com.cloudera.impala.service.JniFrontend.<init>(JniFrontend.java:58)
    Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84)
    ... 49 more
    Caused by: org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.datasource.dbcp.DBCPDataSourceFactory.makePooledDataSource(DBCPDataSourceFactory.java:80)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:144)
    ... 50 more
    + date


    Can you help me with this also

    regards
    Saurav
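
    The key line in the stack trace above is that the datastore driver
    "com.mysql.jdbc.Driver" was not found in the CLASSPATH, i.e. the
    Impala/Hive metastore code cannot find the MySQL JDBC connector. A
    common fix is to install the connector and link it into Hive's lib
    directory; a sketch, assuming a RHEL/CentOS system and typical CDH4
    paths (the package name and paths are assumptions, verify on your host):

```shell
# Install the MySQL JDBC driver and expose it to Hive/Impala
# by linking it into Hive's lib directory.
sudo yum install -y mysql-connector-java
sudo ln -s /usr/share/java/mysql-connector-java.jar \
    /usr/lib/hive/lib/mysql-connector-java.jar
```

    Then restart the Impala service from Cloudera Manager.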


    On Wed, Jan 30, 2013 at 2:45 AM, Adam Smieszny wrote:

    [quoted messages snipped]
  • Saurav Sinha at Jan 30, 2013 at 7:17 pm
    Hi Adam,

    Continuing my previous mail: I am now facing an issue where I am not
    able to run Hive through the CLI.

    [saurav@localhost ~]$ hive
    Logging initialized using configuration in
    file:/etc/hive/conf.dist/hive-log4j.properties
    Hive history
    file=/tmp/saurav/hive_job_log_saurav_201301301109_1273106907.txt
    hive> show tables
    ;
    FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

    That is the output of the hive command I ran.

    The log file shows:

    2013-01-30 11:09:54,230 ERROR Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at
    org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at
    org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at
    org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at
    org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at
    org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory
    /var/lib/hive/metastore/metastore_db cannot be created.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown
    Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db
    cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown
    Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
    Source)
    at
    org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown
    Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    2013-01-30 11:09:54,230 ERROR Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    2013-01-30 11:09:54,537 ERROR exec.Task (SessionState.java:printError(403)) - FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1081)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:298)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    ... 18 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    ... 47 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more

    2013-01-30 11:09:54,622 ERROR ql.Driver (SessionState.java:printError(403)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask


    It is quite similar to the error from the Impala daemon.


    regards

    Saurav


    On Wed, Jan 30, 2013 at 10:56 AM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your help and advice; my TT is now up and running.

    I am also able to connect to my HDFS with the CLI commands.

    Next I have to start Impala. I have followed the steps as written in the
    Cloudera Manager Guide 4.5.

    I have updated the Hive and HDFS settings.

    Now when I restarted the Impala services, the Impala daemon went into a
    bad state.

    These are the logs I get from the Manager:

    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:99)
    at com.cloudera.impala.catalog.Catalog.createHiveMetaStoreClient(Catalog.java:113)
    at com.cloudera.impala.catalog.Catalog.<init>(Catalog.java:89)
    at com.cloudera.impala.service.Frontend.<init>(Frontend.java:86)
    at com.cloudera.impala.service.JniFrontend.<init>(JniFrontend.java:58)
    Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84)
    ... 49 more
    Caused by: org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.datasource.dbcp.DBCPDataSourceFactory.makePooledDataSource(DBCPDataSourceFactory.java:80)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:144)
    ... 50 more
    + date



    Can you help me with this as well?

    regards,
    Saurav

    On Wed, Jan 30, 2013 at 2:45 AM, Adam Smieszny wrote:

    Hi Saurav,

    You will need to become the HDFS user to change permissions within HDFS.
    HDFS works very much like the Linux filesystem: you need to create
    directories and modify owners/permissions in order to be able to create
    files.

    Do the following to create the directory that you are attempting to write
    to in your test above, and assign ownership to your own user "saurav":

    sudo su - hdfs
    hadoop fs -mkdir /user/Documents
    hadoop fs -chown saurav /user/Documents
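
    Since HDFS permissions mirror POSIX ones, the workflow above can be
    sketched against the local filesystem (a rough analogy only; the
    `hadoop fs` commands above are the real fix, and the sandbox path here
    is made up for illustration):

    ```shell
    # Local-filesystem sketch of the same workflow: the owner of a
    # mode-755 directory creates the subdirectory, then would hand it to
    # the end user with chown -- exactly the role the hdfs superuser
    # plays for /user in HDFS.
    sandbox=$(mktemp -d)
    mkdir -p "$sandbox/user"
    chmod 755 "$sandbox/user"          # others get r-x only, like /user in the error
    mkdir "$sandbox/user/Documents"    # the "sudo su - hdfs; hadoop fs -mkdir" step
    ls -ld "$sandbox/user/Documents"   # chown saurav would follow, as above
    ```

    The key point is that `drwxr-xr-x` grants no WRITE bit to "others",
    which is exactly what the `Permission denied` message reports.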


    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 1:31 PM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your advice and the links. I got my TaskTracker up and running
    by changing the file /etc/hosts

    from
    127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

    to

    127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4 localhost
    ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

    Now I am facing another issue: I am not able to put anything into HDFS,
    and the hadoop ls command is not working as shown in training.

    Here is the output of the hadoop commands I tried:

    [saurav@localhost ~]$ hadoop fs -ls
    ls: `.': No such file or directory

    [saurav@localhost ~]$ hadoop fs -ls /
    Found 3 items
    drwxr-xr-x - hbase hbase 0 2013-01-29 10:17 /hbase
    drwxrwxrwt - hdfs hdfs 0 2013-01-26 07:43 /tmp
    drwxr-xr-x - hdfs supergroup 0 2013-01-26 07:45 /user

    hadoop fs -put Documents /user/Documents
    put: Permission denied: user=saurav, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

    Can you help me with this as well?

    I will be grateful for your reply.

    regards,
    Saurav
    On Tue, Jan 29, 2013 at 6:22 AM, Adam Smieszny wrote:

    Hi Saurav,

    Please continue to CC the group, as then you will get more support.

    I see the following line in your log file:
    2013-01-29 05:53:44,048 INFO org.apache.hadoop.mapred.TaskTracker:
    Tasktracker disallowed by JobTracker.

    That seems out of place to me.

    I'm wondering if it might be a similar situation to this one on the CDH
    mailing list:

    https://groups.google.com/a/cloudera.org/d/topic/cdh-user/PRy7GynUI6U/discussion

    Or this thread:

    http://grokbase.com/t/cloudera/scm-users/12b894bz1e/tasktracker-disallowed-by-jobtracker-error-message-on-all-nodes

    Please check your networking configuration - are you using DNS? Are you
    using /etc/hosts file? What does /etc/hosts contain?
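
    One way to see why alias ordering in /etc/hosts matters (a minimal
    sketch, not a full diagnosis): the resolver generally treats the first
    name on the matching line as the canonical hostname, which is the name
    the TaskTracker ends up reporting to the JobTracker. The sample lines
    below are illustrative, not read from any real hosts file:

    ```shell
    # Extract the first alias after the address with awk to see what each
    # ordering makes the canonical name for 127.0.0.1.
    before="127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4"
    after="127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4 localhost"
    canon_before=$(echo "$before" | awk '{print $2}')
    canon_after=$(echo "$after" | awk '{print $2}')
    echo "before: $canon_before   after: $canon_after"
    ```

    With the first ordering the node may identify itself as plain
    `localhost`, which the JobTracker can then refuse as an unknown host;
    after reordering it reports `localhost.localdomain`.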

    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 9:17 AM, Saurav Sinha wrote:

    Hi,

    I have attached the TT log file from the location
    */var/log/hadoop-0.20-mapreduce/hadoop-cmf-mapreduce1-TASKTRACKER-localhost.localdomain.log.out*

    The suggested location contains the configuration files of Hadoop, i.e.
    */var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker*,
    and the command is not working.

    regards,
    Saurav

    On Tue, Jan 29, 2013 at 4:50 AM, Adam Smieszny wrote:

    Can you share the contents of the TaskTracker log? Once the command
    executes as below, the detail we need will show up in the role log file
    (which you can get to via CM in this case):
    *+ exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker*


    On Tue, Jan 29, 2013 at 7:39 AM, Saurav Sinha <sauravsinha76@gmail.com>
    wrote:

    Supervisor returned FATAL: + eval 'OLD_VALUE=$HADOOP_CLASSPATH'
    ++ OLD_VALUE='/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + '[' -z '/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::' ']'
    + export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + set -x
    + perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER#g' /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml
    + '[' -e /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/topology.py ']'
    + export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true '
    + HADOOP_OPTS='-Djava.net.preferIPv4Stack=true '
    + acquire_kerberos_tgt mapred.keytab
    + '[' -z mapred.keytab ']'
    + '[' -n '' ']'
    + '[' tasktracker = jobtracker ']'
    + '[' tasktracker = tasktracker ']'
    + chmod 0644 /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/log4j.properties
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker






    Does anyone have any idea about it?

    regards,
    Saurav


    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156

  • Philip Zeyliger at Jan 31, 2013 at 6:38 pm
    Hi Saurav,

    The error below suggests you don't have the MySQL JDBC driver installed.
    Unfortunately, because of MySQL's licensing, Cloudera Manager can't just
    do that for you. See
    https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringaremoteMySQLdatabaseasHiveMetastore
    for some instructions.

    The specified datastore driver ("com.mysql.jdbc.Driver") was not found
    in the CLASSPATH. Please check your CLASSPATH specification, and the
    name of the driver.
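
    A sketch of the usual shape of the fix (the package name and both paths
    are assumptions for a CDH4-style package install; the linked doc above
    has the authoritative steps): install the connector, then make the jar
    visible on the metastore's CLASSPATH, e.g. by linking it into Hive's
    lib directory.

    ```shell
    # Hypothetical helper: symlink the MySQL JDBC jar into a lib directory
    # so it lands on the metastore's CLASSPATH. Prints "linked" on success.
    link_mysql_jar() {
        src="$1"; destdir="$2"
        if [ -e "$src" ]; then
            ln -sf "$src" "$destdir/mysql-connector-java.jar" && echo "linked"
        else
            echo "driver jar not found: $src" >&2
            return 1
        fi
    }

    # Typical invocation after installing the mysql-connector-java package
    # (run as root; adjust both paths for your layout):
    #   link_mysql_jar /usr/share/java/mysql-connector-java.jar /usr/lib/hive/lib
    ```

    After the jar is in place, restart the affected services so the new
    CLASSPATH entry is picked up.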

    Cheers,

    -- Philip
    On Wed, Jan 30, 2013 at 11:17 AM, Saurav Sinha wrote:

    Hi Adam,

    Continuing from my previous mail: now I am facing an issue where I am
    not able to run Hive through the CLI.

    [saurav@localhost ~]$ hive
    Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
    Hive history file=/tmp/saurav/hive_job_log_saurav_201301301109_1273106907.txt
    hive> show tables;
    FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

    This is the information from the hive command.

    The log file is:

    2013-01-30 11:09:54,230 ERROR Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at
    org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at
    org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at
    org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at
    org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at
    org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
    Method)
    at
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at
    org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    2013-01-30 11:09:54,230 ERROR Datastore.Schema (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    2013-01-30 11:09:54,537 ERROR exec.Task (SessionState.java:printError(403)) - FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1081)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:298)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    ... 18 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    ... 47 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more

    2013-01-30 11:09:54,622 ERROR ql.Driver (SessionState.java:printError(403)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
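    The root cause is the final `Caused by: ERROR XBM0H` frame: Derby could not create the `metastore_db` directory, which usually means the user running the Hive CLI has no write permission on `/var/lib/hive/metastore` (or that directory does not exist). A minimal sketch of the failure mode and the usual fix; the `hive` user and the paths below are assumptions for a default CDH-style layout:

    ```shell
    # XBM0H just wraps a failed mkdir of metastore_db; reproduce it with a
    # scratch path whose parent is missing (an unwritable parent fails the
    # same way). These demo paths are hypothetical, not the real metastore.
    parent="$(mktemp -d)/missing-parent"
    if mkdir "$parent/metastore_db" 2>/dev/null; then
      echo "metastore_db created"
    else
      echo "Directory $parent/metastore_db cannot be created"
    fi

    # Usual fix on the real path (run as root; adjust the owner to whichever
    # user actually runs the Hive CLI or metastore service):
    #   mkdir -p /var/lib/hive/metastore
    #   chown -R hive:hive /var/lib/hive/metastore
    ```

    Alternatively, pointing `javax.jdo.option.ConnectionURL` in hive-site.xml at a per-user writable location sidesteps the shared `/var/lib/hive` path entirely.
    
    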


    It is quite similar to the error from the Impala daemon.


    regards

    Saurav


    On Wed, Jan 30, 2013 at 10:56 AM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your help and advice; my TaskTracker is now up and running.

    And I am able to connect to my HDFS with CLI commands.

    Next I have to start Impala. I have followed the steps as written in the
    Cloudera Manager Guide 4.5.

    I have updated the Hive and HDFS settings.

    Now when I restart the Impala services, the Impala daemon goes into a bad state.

    These are the logs I get from Cloudera Manager:

    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:99)
    at com.cloudera.impala.catalog.Catalog.createHiveMetaStoreClient(Catalog.java:113)
    at com.cloudera.impala.catalog.Catalog.<init>(Catalog.java:89)
    at com.cloudera.impala.service.Frontend.<init>(Frontend.java:86)
    at com.cloudera.impala.service.JniFrontend.<init>(JniFrontend.java:58)
    Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84)
    ... 49 more
    Caused by: org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.datasource.dbcp.DBCPDataSourceFactory.makePooledDataSource(DBCPDataSourceFactory.java:80)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:144)
    ... 50 more
    + date



    Can you help me with this as well?

    regards,
    Saurav


    On Wed, Jan 30, 2013 at 2:45 AM, Adam Smieszny wrote:

    Hi Saurav,

    You will need to become the HDFS user to change the permissions within
    HDFS. HDFS works very much like the Linux filesystem, so you need to create
    directories and modify owners/permissions in order to be able to create
    files

    Do the following to create the directory that you are attempting to write
    in your test above, and assign the owner to your own user "saurav"

    sudo su - hdfs
    hadoop fs -mkdir /user/Documents
    hadoop fs -chown saurav /user/Documents


    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 1:31 PM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your advice and the link. I got my tasktracker up and
    running by changing the file /etc/hosts

    from
    127.0.0.1 localhost localhost.localdomain localhost4
    localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    to

    127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4
    localhost
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6
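    Two quick ways to check that a hosts-file change like the one above took
    effect (generic commands, not specific to this cluster):

```shell
# Resolver's view of "localhost" after editing /etc/hosts; the first
# name returned is the canonical one the daemons will see.
getent hosts localhost
# The name this node will register under with the JobTracker.
hostname
```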

    Now I am facing another issue: I am not able to put anything into HDFS,
    and the hadoop ls command is not working as shown in training.

    Here is the output of the hadoop commands I tried:

    [saurav@localhost ~]$ hadoop fs -ls
    ls: `.': No such file or directory

    [saurav@localhost ~]$ hadoop fs -ls /
    Found 3 items
    drwxr-xr-x - hbase hbase 0 2013-01-29 10:17 /hbase
    drwxrwxrwt - hdfs hdfs 0 2013-01-26 07:43 /tmp
    drwxr-xr-x - hdfs supergroup 0 2013-01-26 07:45 /user

    hadoop fs -put Documents /user/Documents
    put: Permission denied: user=saurav, access=WRITE,
    inode="/user":hdfs:supergroup:drwxr-xr-x

    Can you help me with this as well?

    I will be grateful for your reply.

    regards,
    Saurav
    On Tue, Jan 29, 2013 at 6:22 AM, Adam Smieszny wrote:

    Hi Saurav,

    Please continue to CC the group, as then you will get more support.

    I see the following line in your log file:
    2013-01-29 05:53:44,048 INFO org.apache.hadoop.mapred.TaskTracker:
    Tasktracker disallowed by JobTracker.

    That seems out of place to me.

    I'm wondering if it might be a similar situation to this on the CDH
    mailing list

    https://groups.google.com/a/cloudera.org/d/topic/cdh-user/PRy7GynUI6U/discussion

    Or this thread:

    http://grokbase.com/t/cloudera/scm-users/12b894bz1e/tasktracker-disallowed-by-jobtracker-error-message-on-all-nodes

    Please check your networking configuration - are you using DNS? Are you
    using /etc/hosts file? What does /etc/hosts contain?

    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 9:17 AM, Saurav Sinha wrote:

    Hi,

    I have attached the TT log file from the location
    */var/log/hadoop-0.20-mapreduce/hadoop-cmf-mapreduce1-TASKTRACKER-localhost.localdomain.log.out*

    The suggested location contains the Hadoop configuration files, i.e.
    */var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER
    tasktracker*,

    and the command is not working.

    regards,
    Saurav

    On Tue, Jan 29, 2013 at 4:50 AM, Adam Smieszny wrote:

    Can you share the contents of the TaskTracker log? Once the command
    executes as below, the detail we need will show up in the role log file
    (which you can get to via CM in this case)
    *+ exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config
    /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker
    *


    On Tue, Jan 29, 2013 at 7:39 AM, Saurav Sinha <
    sauravsinha76@gmail.com> wrote:

    Supervisor returned FATAL: + eval 'OLD_VALUE=$HADOOP_CLASSPATH'
    ++ OLD_VALUE='/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + '[' -z '/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::' ']'
    + export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + set -x
    + perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER#g' /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml
    + '[' -e /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/topology.py ']'
    + export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true '
    + HADOOP_OPTS='-Djava.net.preferIPv4Stack=true '
    + acquire_kerberos_tgt mapred.keytab
    + '[' -z mapred.keytab ']'
    + '[' -n '' ']'
    + '[' tasktracker = jobtracker ']'
    + '[' tasktracker = tasktracker ']'
    + chmod 0644 /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/log4j.properties
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker

    Does anyone have any idea about it?

    regards,

    Saurav


    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156


  • Saurav Sinha at Feb 1, 2013 at 6:25 am
    Hi,

    I have solved one of my issues, the Hive query from the CLI, by creating
    the instances for Hive in the Cloudera Manager Hive tab.

    But still my impala is not working.

    I am also trying to use Beeswax and Hue at http://localhost:8888

    The first time, I log in as admin, but I am not able to browse my HDFS,
    nor am I able to run the Hive query. I have created one more user, but once
    I log out from Beeswax I am not able to log in as the other user; I need to
    restart the server once again.

    I am following the steps from the Cloudera Manager installation guide.

    Does anyone have any idea what is going on?

    regards,
    Saurav
    On Thu, Jan 31, 2013 at 10:37 AM, Philip Zeyliger wrote:

    Hi Saurav,

    The error below suggests you don't have the MySQL JDBC driver installed.
    Unfortunately, because of MySQL's licensing, Cloudera Manager can't just
    do that for you. See
    https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringaremoteMySQLdatabaseasHiveMetastore for
    some instructions.

    The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
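    The commands below are one common way to carry out the linked
    instructions on an RHEL-style CDH4 host; package names and paths may
    differ on other distributions, so treat this as a sketch rather than
    the definitive procedure:

```shell
# Install the MySQL JDBC connector (Cloudera cannot bundle it for
# licensing reasons) and make it visible on Hive's classpath.
# Paths assume CDH4 defaults and are illustrative.
sudo yum install -y mysql-connector-java
sudo ln -s /usr/share/java/mysql-connector-java.jar /usr/lib/hive/lib/
```

    After linking the jar, restart the Hive metastore (and Impala) so the
    new classpath entry is picked up.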

    Cheers,

    -- Philip
    On Wed, Jan 30, 2013 at 11:17 AM, Saurav Sinha wrote:

    Hi Adam,

    Continuing my previous mail: now I am facing an issue where, through the
    CLI, I am not able to run Hive.

    [saurav@localhost ~]$ hive
    Logging initialized using configuration in
    file:/etc/hive/conf.dist/hive-log4j.properties
    Hive history
    file=/tmp/saurav/hive_job_log_saurav_201301301109_1273106907.txt
    hive> show tables
    ;
    FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed
    to create database '/var/lib/hive/metastore/metastore_db', see the next
    exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    FAILED: Execution Error, return code 1 from
    org.apache.hadoop.hive.ql.exec.DDLTask

    This is the information from the hive command.

    the log file is

    2013-01-30 11:09:54,230 ERROR Datastore.Schema
    (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the
    next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create
    database '/var/lib/hive/metastore/metastore_db', see the next exception for
    details.
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at
    org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
    Method)
    at
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at
    org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at
    org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at
    org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at
    javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at
    org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at
    org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at
    org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)

    at
    org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at
    org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at
    org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at
    org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at
    org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at
    org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at
    org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown
    Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at
    org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at
    org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at
    org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at
    org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory
    /var/lib/hive/metastore/metastore_db cannot be created.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown
    Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db
    cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown
    Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
    Source)
    at
    org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown
    Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown
    Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at
    org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at
    org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at
    org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at
    org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at
    org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
    Method)
    at
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at
    org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at
    org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at
    org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at
    javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at
    org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at
    org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at
    org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at
    org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)

    at
    org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at
    org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at
    org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at
    org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at
    org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at
    org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at
    org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at
    org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database
    '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory
    /var/lib/hive/metastore/metastore_db cannot be created.
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown
    Source)
    at
    org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown
    Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown
    Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db
    cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown
    Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at
    org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown
    Source)
    at
    org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown
    Source)
    at
    org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown
    Source)
    ... 70 more
    2013-01-30 11:09:54,230 ERROR Datastore.Schema
    (Log4JLogger.java:error(125)) - Failed initialising database.
    Failed to create database '/var/lib/hive/metastore/metastore_db', see the
    next exception for details.
    org.datanucleus.exceptions.NucleusDataStoreException: Failed to create
    database '/var/lib/hive/metastore/metastore_db', see the next exception for
    details.
    at
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:536)
    at
    org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
    Method)
    at
    sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at
    sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at
    org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at
    org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at
    org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at
    org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at
    javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at
    javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    ... 56 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    Nested Throwables StackTrace:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more
    2013-01-30 11:09:54,537 ERROR exec.Task (SessionState.java:printError(403)) - FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1081)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1998)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:324)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1331)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1117)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:950)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:744)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    NestedThrowables:
    java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:298)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:601)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:286)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:277)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:210)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:185)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2093)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2103)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
    ... 18 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection30.<init>(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
    at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:75)
    at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
    at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1148)
    at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:521)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:290)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:593)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:300)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseStoreManager(ObjectManagerFactoryImpl.java:161)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:583)
    ... 47 more
    Caused by: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 73 more
    Caused by: java.sql.SQLException: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    ... 70 more
    Caused by: ERROR XBM0H: Directory /var/lib/hive/metastore/metastore_db cannot be created.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.services.monitor.StorageFactoryService$9.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
    ... 70 more

    2013-01-30 11:09:54,622 ERROR ql.Driver (SessionState.java:printError(403)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
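
    The Derby frames at the bottom of the trace pinpoint the root cause: ERROR XBM0H, the directory /var/lib/hive/metastore/metastore_db cannot be created, i.e. the user running the Hive CLI cannot write under /var/lib/hive/metastore. A minimal sketch of the check and fix follows; the stand-in path and the service user `hive` are assumptions, not taken from the thread:

    ```shell
    # Stand-in for /var/lib/hive/metastore so the sketch runs without root.
    METASTORE_DIR=/tmp/hive-metastore-demo
    mkdir -p "$METASTORE_DIR"       # create the parent directory
    ls -ld "$METASTORE_DIR"         # verify the current user owns it and can write
    # On the real host, as root (assuming the metastore runs as user 'hive'):
    #   mkdir -p /var/lib/hive/metastore
    #   chown -R hive:hive /var/lib/hive/metastore
    ```

    Once the user launching Hive can write under that directory, Derby can create metastore_db on first use.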


    The output is quite similar to what the Impala daemon shows.


    regards

    Saurav


    On Wed, Jan 30, 2013 at 10:56 AM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your help and advice; my TaskTracker is now up and running,

    and I am able to connect to HDFS with the CLI commands.

    Next I started Impala, following the steps written in the Cloudera Manager 4.5 guide.

    I have updated the Hive and HDFS settings.

    Now when I restart the Impala service, the Impala daemon is in a bad state.

    These are the logs I get from the manager:

    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:377)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:364)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:116)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:99)
    at com.cloudera.impala.catalog.Catalog.createHiveMetaStoreClient(Catalog.java:113)
    at com.cloudera.impala.catalog.Catalog.<init>(Catalog.java:89)
    at com.cloudera.impala.service.Frontend.<init>(Frontend.java:86)
    at com.cloudera.impala.service.JniFrontend.<init>(JniFrontend.java:58)
    Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "DBCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:165)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:84)
    ... 49 more
    Caused by: org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
    at org.datanucleus.store.rdbms.datasource.dbcp.DBCPDataSourceFactory.makePooledDataSource(DBCPDataSourceFactory.java:80)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initDataSourceTx(ConnectionFactoryImpl.java:144)
    ... 50 more
    + date
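
    The Impala failure is different from the Derby one above: hive-site.xml now points the metastore at MySQL, but com.mysql.jdbc.Driver is not on the daemon's classpath. A sketch of a quick check follows; the helper name `find_mysql_driver` and the search directories (typical CDH4 locations) are assumptions, not confirmed by the thread:

    ```shell
    # List any MySQL connector jars found in the directories given as arguments.
    find_mysql_driver() {
      for d in "$@"; do
        for jar in "$d"/mysql-connector-java*.jar; do
          [ -e "$jar" ] && echo "$jar"
        done
      done
      return 0
    }

    # Typical CDH4 locations (assumed, not confirmed by the thread).
    if [ -z "$(find_mysql_driver /usr/lib/hive/lib /usr/share/java)" ]; then
      echo "driver jar not found: install mysql-connector-java and symlink"
      echo "the jar into /usr/lib/hive/lib, then restart the Impala service."
    fi
    ```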



    Can you help me with this also?

    regards,
    Saurav


    On Wed, Jan 30, 2013 at 2:45 AM, Adam Smieszny wrote:

    Hi Saurav,

    You will need to become the HDFS user to change the permissions within
    HDFS. HDFS works very much like the Linux filesystem, so you need to create
    directories and modify owners/permissions in order to be able to create
    files

    Do the following to create the directory that you are attempting to
    write in your test above, and assign the owner to your own user "saurav"

    sudo su - hdfs
    hadoop fs -mkdir /user/Documents
    hadoop fs -chown saurav /user/Documents


    Thanks,
    Adam

    On Tue, Jan 29, 2013 at 1:31 PM, Saurav Sinha wrote:

    Hi Adam,

    Thanks for your advice and the links. I got my TaskTracker up and
    running by changing the file /etc/hosts

    from
    127.0.0.1 localhost localhost.localdomain localhost4
    localhost4.localdomain4
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6

    to

    127.0.0.1 localhost.localdomain localhost4 localhost4.localdomain4
    localhost
    ::1 localhost localhost.localdomain localhost6
    localhost6.localdomain6
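
    A quick way to confirm an /etc/hosts edit like the one above behaves as intended (a sketch): `getent` consults the same resolver order the daemons do, and the first name it returns for 127.0.0.1 is the canonical hostname, which is typically what matters for the "disallowed by JobTracker" check.

    ```shell
    # Show how 127.0.0.1 resolves after editing /etc/hosts; the first
    # name printed on the line is the canonical hostname.
    getent hosts 127.0.0.1
    ```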

    Now I am facing another issue: I am not able to put anything into HDFS,
    and the hadoop ls command does not work as shown in the training.

    Here is the output of the hadoop commands I tried:

    [saurav@localhost ~]$ hadoop fs -ls
    ls: `.': No such file or directory

    [saurav@localhost ~]$ hadoop fs -ls /
    Found 3 items
    drwxr-xr-x - hbase hbase 0 2013-01-29 10:17 /hbase
    drwxrwxrwt - hdfs hdfs 0 2013-01-26 07:43 /tmp
    drwxr-xr-x - hdfs supergroup 0 2013-01-26 07:45 /user

    hadoop fs -put Documents /user/Documents
    put: Permission denied: user=saurav, access=WRITE,
    inode="/user":hdfs:supergroup:drwxr-xr-x

    Can you help me with this also?

    I will be grateful for your reply.

    regards,
    Saurav
    On Tue, Jan 29, 2013 at 6:22 AM, Adam Smieszny wrote:

    Hi Saurav,

    Please continue to CC the group, as then you will get more support.

    I see the following line in your log file:
    2013-01-29 05:53:44,048 INFO org.apache.hadoop.mapred.TaskTracker:
    Tasktracker disallowed by JobTracker.

    That seems out of place to me.

    I'm wondering if it might be a similar situation to this on the CDH
    mailing list

    https://groups.google.com/a/cloudera.org/d/topic/cdh-user/PRy7GynUI6U/discussion

    Or this thread:

    http://grokbase.com/t/cloudera/scm-users/12b894bz1e/tasktracker-disallowed-by-jobtracker-error-message-on-all-nodes

    Please check your networking configuration - are you using DNS? Are
    you using /etc/hosts file? What does /etc/hosts contain?

    Thanks,
    Adam


    On Tue, Jan 29, 2013 at 9:17 AM, Saurav Sinha <sauravsinha76@gmail.com> wrote:
    Hi,

    I have attached the TT log file from the location
    /var/log/hadoop-0.20-mapreduce/hadoop-cmf-mapreduce1-TASKTRACKER-localhost.localdomain.log.out

    The suggested location contains the Hadoop configuration files, i.e.
    /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker

    and the command is not working.

    regards,
    Saurav

    On Tue, Jan 29, 2013 at 4:50 AM, Adam Smieszny wrote:

    Can you share the contents of the TaskTracker log? Once the command
    executes as below, the detail we need will show up in the role log file
    (which you can get to via CM in this case):
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker


    On Tue, Jan 29, 2013 at 7:39 AM, Saurav Sinha <
    sauravsinha76@gmail.com> wrote:

    Supervisor returned FATAL: + eval 'OLD_VALUE=$HADOOP_CLASSPATH'
    ++ OLD_VALUE='/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + '[' -z '/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::' ']'
    + export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-4.4.908-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-4.4.908.jar:/usr/share/cmf/lib/plugins/governor-plugin-4.4.908-shaded.jar:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*::'
    + set -x
    + perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER#g' /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml
    + '[' -e /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/topology.py ']'
    + export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true '
    + HADOOP_OPTS='-Djava.net.preferIPv4Stack=true '
    + acquire_kerberos_tgt mapred.keytab
    + '[' -z mapred.keytab ']'
    + '[' -n '' ']'
    + '[' tasktracker = jobtracker ']'
    + '[' tasktracker = tasktracker ']'
    + chmod 0644 /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/mapred-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/hdfs-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/core-site.xml /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER/log4j.properties
    + exec /usr/lib/hadoop-0.20-mapreduce/bin/hadoop --config /var/run/cloudera-scm-agent/process/101-mapreduce-TASKTRACKER tasktracker






    Does anyone have any idea about it?

    regards,
    Saurav


    --
    Adam Smieszny
    Cloudera | Systems Engineer |
    http://www.linkedin.com/in/adamsmieszny
    917.830.4156

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156

    --
    Adam Smieszny
    Cloudera | Systems Engineer | http://www.linkedin.com/in/adamsmieszny
    917.830.4156

Discussion Overview
groupscm-users @
categorieshadoop
postedJan 29, '13 at 12:39p
activeFeb 1, '13 at 6:25a
posts9
users3
websitecloudera.com
irc#hadoop
