Hi,

We have been running the Cloudera distribution of Hadoop. We installed Hive
following this document:
https://wiki.cloudera.com/display/DOC/Hive+Installation. hive-site.xml was
later modified to store the metastore in MySQL, very similar to the config in
this blog:
http://blog.milford.io/2010/06/installing-apache-hive-with-a-mysql-metastore-in-centos/

I have set the Hadoop configs in hive-site.xml correctly, and I can also
connect to the MySQL server:

fs.default.name
mapred.job.tracker
hadoop.bin.path
hadoop.config.dir
hive.exec.scratchdir - set to /tmp on HDFS
hive.metastore.local - true
javax.jdo.option.ConnectionURL - jdbc:mysql://mysqlserver:3306/hadoop?createDatabaseIfNotExist=true
javax.jdo.option.ConnectionUserName
javax.jdo.option.ConnectionPassword
javax.jdo.option.ConnectionDriverName
hive.metastore.warehouse.dir - /user/hive/warehouse on HDFS
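
For reference, here is roughly how those entries look in my hive-site.xml (the
user name and password below are placeholders rather than the real values; the
driver class is the standard one shipped in the connector jar):

<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://mysqlserver:3306/hadoop?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hiveuser</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hivepassword</value>
</property>
<property>
<name>hive.metastore.local</name>
<value>true</value>
</property>
<property>
<name>hive.exec.scratchdir</name>
<value>/tmp</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
</property>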

I have also copied mysql-connector-java-5.0.8-bin.jar to /usr/lib/hive/lib.

I can see the database being created in MySQL. I have also deleted the Derby
metastore directory and all lock files. I have created the tmp and warehouse
directories on HDFS and assigned read/write permissions to them. When I start
the Hive CLI and run the show tables command, I get this error:


aws-qahdn01:~# hive -hiveconf hive.root.logger=INFO,console
Hive history file=/tmp/root/hive_job_log_root_201101191558_1388620546.txt
11/01/19 15:58:45 INFO exec.HiveHistory: Hive history
file=/tmp/root/hive_job_log_root_201101191558_1388620546.txt
hive> show tables;
11/01/19 15:58:50 INFO parse.ParseDriver: Parsing command: show tables
11/01/19 15:58:51 INFO parse.ParseDriver: Parse Completed
11/01/19 15:58:51 INFO ql.Driver: Semantic Analysis Completed
11/01/19 15:58:51 INFO ql.Driver: Starting command: show tables
11/01/19 15:58:51 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
11/01/19 15:58:51 INFO metastore.ObjectStore: ObjectStore, initialize called
11/01/19 15:58:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.resources" but it cannot be resolved.
11/01/19 15:58:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.runtime" but it cannot be resolved.
11/01/19 15:58:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.text" but it cannot be resolved.
11/01/19 15:58:53 INFO metastore.ObjectStore: Initialized ObjectStore
11/01/19 15:58:53 WARN conf.Configuration: hdfs-site.xml:a attempt to
override final parameter: dfs.data.dir; Ignoring.
FAILED: Error in metadata: javax.jdo.JDOException: Couldnt obtain a new
sequence (unique id) : Binary logging not possible. Message: Transaction
level 'READ-COMMITTED' in InnoDB is not safe for binlog mode 'STATEMENT'
NestedThrowables:
java.sql.SQLException: Binary logging not possible. Message: Transaction
level 'READ-COMMITTED' in InnoDB is not safe for binlog mode 'STATEMENT'
11/01/19 15:58:54 ERROR exec.DDLTask: FAILED: Error in metadata:
javax.jdo.JDOException: Couldnt obtain a new sequence (unique id) : Binary
logging not possible. Message: Transaction level 'READ-COMMITTED' in InnoDB
is not safe for binlog mode 'STATEMENT'
NestedThrowables:
java.sql.SQLException: Binary logging not possible. Message: Transaction
level 'READ-COMMITTED' in InnoDB is not safe for binlog mode 'STATEMENT'
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOException:
Couldnt obtain a new sequence (unique id) : Binary logging not possible.
Message: Transaction level 'READ-COMMITTED' in InnoDB is not safe for binlog
mode 'STATEMENT'
NestedThrowables:
java.sql.SQLException: Binary logging not possible. Message: Transaction
level 'READ-COMMITTED' in InnoDB is not safe for binlog mode 'STATEMENT'
at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:400)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:387)
at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:352)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:143)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:379)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:285)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: javax.jdo.JDOException: Couldnt obtain a new sequence (unique id)
: Binary logging not possible. Message: Transaction level 'READ-COMMITTED'
in InnoDB is not safe for binlog mode 'STATEMENT'
NestedThrowables:
java.sql.SQLException: Binary logging not possible. Message: Transaction
level 'READ-COMMITTED' in InnoDB is not safe for binlog mode 'STATEMENT'
at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:404)
at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:673)
at org.datanucleus.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:693)
at org.apache.hadoop.hive.metastore.ObjectStore.createDatabase(ObjectStore.java:259)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:148)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:118)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:74)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:783)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:794)
at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:398)
... 13 more
Caused by: java.sql.SQLException: Binary logging not possible. Message:
Transaction level 'READ-COMMITTED' in InnoDB is not safe for binlog mode
'STATEMENT'
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:946)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2985)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1467)
at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:450)
at org.datanucleus.store.rdbms.table.SequenceTable.getNextVal(SequenceTable.java:196)
at org.datanucleus.store.rdbms.valuegenerator.TableGenerator.reserveBlock(TableGenerator.java:169)
at org.datanucleus.store.valuegenerator.AbstractGenerator.reserveBlock(AbstractGenerator.java:306)
at org.datanucleus.store.rdbms.valuegenerator.AbstractRDBMSGenerator.obtainGenerationBlock(AbstractRDBMSGenerator.java:170)
at org.datanucleus.store.valuegenerator.AbstractGenerator.obtainGenerationBlock(AbstractGenerator.java:198)
at org.datanucleus.store.valuegenerator.AbstractGenerator.next(AbstractGenerator.java:106)
at org.datanucleus.store.rdbms.RDBMSManager.getStrategyValueForGenerator(RDBMSManager.java:1511)
at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1065)
at org.datanucleus.ObjectManagerImpl.newObjectId(ObjectManagerImpl.java:2577)
at org.datanucleus.state.JDOStateManagerImpl.setIdentity(JDOStateManagerImpl.java:873)
at org.datanucleus.state.JDOStateManagerImpl.initialiseForPersistentNew(JDOStateManagerImpl.java:458)
at org.datanucleus.state.StateManagerFactory.newStateManagerForPersistentNew(StateManagerFactory.java:151)
at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1279)
at org.datanucleus.ObjectManagerImpl.persistObject(ObjectManagerImpl.java:1157)
at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:668)
... 22 more

FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
11/01/19 15:58:54 ERROR ql.Driver: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask

Please let me know if I am doing something wrong or missing something. Your
help is greatly appreciated!


  • Jean-Daniel Cryans at Jan 20, 2011 at 12:13 am
    Try setting this in your hive-site:

    <property>
    <name>datanucleus.transactionIsolation</name>
    <value>repeatable-read</value>
    </property>

    <property>
    <name>datanucleus.valuegeneration.transactionIsolation</name>
    <value>repeatable-read</value>
    </property>

    J-D
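
    A side note for anyone who would rather fix this on the MySQL side: the
    SQLException is MySQL refusing READ-COMMITTED transactions while the binary
    log is in STATEMENT mode, so relaxing the binlog format also works. A
    minimal sketch, assuming MySQL 5.1+ and SUPER privilege on the server:

    -- Alternative to changing DataNucleus' isolation level: allow
    -- READ-COMMITTED by switching the binary log to mixed (or row) format.
    SET GLOBAL binlog_format = 'MIXED';
    -- To make it permanent, set it in my.cnf under [mysqld]:
    --   binlog_format = MIXED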
  • Vipul sharma at Jan 20, 2011 at 12:38 am
    Thanks!

    Now I am hitting the MySQL max key length limit: Specified key was too long;
    max key length is 767 bytes

    11/01/19 16:34:47 ERROR DataNucleus.Datastore: Error thrown executing CREATE
    TABLE `SD_PARAMS`
    (
    `SD_ID` BIGINT NOT NULL,
    `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
    `PARAM_VALUE` VARCHAR(767) BINARY NULL,
    PRIMARY KEY (`SD_ID`,`PARAM_KEY`)
    ) ENGINE=INNODB : Specified key was too long; max key length is 767 bytes

    --
    Vipul Sharma
    sharmavipul AT gmail DOT com
  • Jean-Daniel Cryans at Jan 20, 2011 at 12:41 am
    Have a looksee here http://wiki.apache.org/hadoop/Hive/FAQ

    J-D
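
    For anyone else who hits the 767-byte key error: the workaround there (as
    far as I recall the FAQ) is to give the metastore database a single-byte
    default character set, so that the VARCHAR(767) BINARY index keys stay
    under InnoDB's limit. Roughly, using the database name from the JDBC URL
    above:

    -- InnoDB caps index keys at 767 bytes; with a multi-byte charset a
    -- VARCHAR(767) key exceeds that, so fall back to latin1 for the metastore DB.
    ALTER DATABASE hadoop CHARACTER SET latin1;
    -- If some tables were already half-created, drop them (or the database)
    -- first and let Hive recreate the schema.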
  • Vipul sharma at Jan 20, 2011 at 12:43 am
    Yup. Thanks for your help J-D, much appreciated!

    --
    Vipul Sharma
    sharmavipul AT gmail DOT com
