FAQ
Experts,

Not able to drop an Impala table (Parquet format with Snappy compression)
after running COMPUTE STATS on the table.

Impala Version: 1.3.1-cdh5
Using MySQL as the metastore.

Any workarounds for this issue?

Query: drop table PCM_COST_MGMT
ERROR: InternalException: javax.jdo.JDODataStoreException: Exception thrown flushing changes to datastore
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
        at org.datanucleus.api.jdo.JDOTransaction.commit(JDOTransaction.java:165)
        at org.apache.hadoop.hive.metastore.ObjectStore.commitTransaction(ObjectStore.java:401)
        at sun.reflect.GeneratedMethodAccessor39.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:122)
        at com.sun.proxy.$Proxy0.commitTransaction(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1199)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1332)
        at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
        at com.sun.proxy.$Proxy5.drop_table_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:6539)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:6523)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:554)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
NestedThrowablesStackTrace:
java.sql.BatchUpdateException: Cannot delete or update a parent row: a foreign key constraint fails (`metastore`.`TAB_COL_STATS`, CONSTRAINT `TAB_COL_STATS_FK` FOREIGN KEY (`TBL_ID`) REFERENCES `TBLS` (`TBL_ID`))
        at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:2024)
        at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:1449)
        at com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:469)
        at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:372)
        at org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:628)
        at org.datanucleus.store.rdbms.SQLController.processStatementsForConnection(SQLController.java:596)
        at org.datanucleus.store.rdbms.SQLController$1.transactionFlushed(SQLController.java:683)
        at org.datanucleus.store.connection.AbstractManagedConnection.transactionFlushed(AbstractManagedConnection.java:86)
        at org.datanucleus.store.connection.ConnectionManagerImpl$2.transactionFlushed(ConnectionManagerImpl.java:454)
        at org.datanucleus.TransactionImpl.flush(TransactionImpl.java:199)
        at org.datanucleus.TransactionImpl.commit(TransactionImpl.java:263)
        at org.datanucleus.api.jdo.JDOTransaction.commit(JDOTransaction.java:98)
        at org.apache.hadoop.hive.metastore.ObjectStore.commitTransaction(ObjectStore.java:401)
        at sun.reflect.GeneratedMethodAccessor39.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:122)
        at com.sun.proxy.$Proxy0.commitTransaction(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1199)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1332)
        at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
        at com.sun.proxy.$Proxy5.drop_table_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:6539)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_table_with_environment_context.getResult(ThriftHiveMetastore.java:6523)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:554)
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot delete or update a parent row: a foreign key constraint fails (`metastore`.`TAB_COL_STATS`, CONSTRAINT `TAB_COL_STATS_FK` FOREIGN KEY (`TBL_ID`) REFERENCES `TBLS` (`TBL_ID`))
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.Util.getInstance(Util.java:386)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1040)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
        at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2415)
        at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1979)
        ... 38 more

Regards,
Venkat Ankam
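
For reference, the nested exception points at the metastore's column-statistics
table: rows in TAB_COL_STATS still reference the table's TBL_ID in TBLS, so the
DROP cannot delete the parent row. One way to confirm that from the MySQL side
is to query the metastore database directly. This is only a diagnostic sketch:
the `metastore` database name comes from the error message, and the lower-case
table name is an assumption (Hive normally stores table names in lower case).

-- Look up the metastore ID of the table being dropped (hypothetical check, not from the thread).
SELECT t.TBL_ID, d.NAME AS db_name, t.TBL_NAME
FROM metastore.TBLS t
JOIN metastore.DBS d ON d.DB_ID = t.DB_ID
WHERE t.TBL_NAME = 'pcm_cost_mgmt';

-- Count the column-statistics rows that still reference it; these are the rows
-- protected by the TAB_COL_STATS_FK constraint named in the error.
SELECT COUNT(*)
FROM metastore.TAB_COL_STATS s
JOIN metastore.TBLS t ON t.TBL_ID = s.TBL_ID
WHERE t.TBL_NAME = 'pcm_cost_mgmt';

Deleting those TAB_COL_STATS rows by hand would also remove the conflict, but as
the rest of the thread shows, recomputing the statistics from Hive turned out to
be the safer way to get unblocked.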



  • John Russell at Jul 7, 2014 at 6:15 pm
    Did you run COMPUTE STATS through impala-shell, JDBC/ODBC, Hue, or some other way? At first glance, this looks like the COMPUTE STATS statement was not 'closed' in Hue. I could also imagine an unclosed statement handle in JDBC.

    John
  • Venkat Ankam at Jul 7, 2014 at 7:05 pm
    John,

    I ran COMPUTE STATS in impala-shell and it was closed successfully.

    Regards,
    Venkat

  • Venkat Ankam at Jul 7, 2014 at 7:20 pm
    Lenni,

    I am able to drop the table now. Did running stats in Hive fix the problem?

    Regards,
    Venkat

    On Mon, Jul 7, 2014 at 1:16 PM, Venkat Ankam wrote:

    Lenni,

    hive> analyze table PCM_COST_MGMT compute statistics for columns dtl_cost_af_id;

    Command was successful.

    Ended Job = job_1402686439210_0044
    MapReduce Jobs Launched:
    Job 0: Map: 22  Reduce: 1  Cumulative CPU: 2403.22 sec  HDFS Read: 4886317060  HDFS Write: 38  SUCCESS
    Total MapReduce CPU Time Spent: 40 minutes 3 seconds 220 msec
    OK
    Time taken: 109.282 seconds
    hive>

    Regards,
    Venkat

    On Mon, Jul 7, 2014 at 12:12 PM, Lenni Kuff wrote:

    Hi Venkat,
    This appears to be an issue with the Hive Metastore rather than with
    Impala.

    To confirm, can you verify whether you get the same/similar error when
    you compute stats with Hive? You can do this by attempting to run column
    stats via Hive using the syntax:
    analyze table t [partition p] compute statistics for columns [c, ...];

    My suspicion is that Hive will also fail.

    Thanks,
    Lenni
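
    For anyone hitting the same error, the statements that unblocked the drop in
    this thread boil down to the two below. This is a sketch assembled from the
    messages quoted above: the column name comes from the quoted Hive command,
    and the guess that recomputing stats replaced the offending TAB_COL_STATS
    rows is an inference from this thread, not something confirmed here.

    -- In Hive: recompute the column statistics for the table.
    ANALYZE TABLE PCM_COST_MGMT COMPUTE STATISTICS FOR COLUMNS dtl_cost_af_id;

    -- Then, back in impala-shell, the drop went through.
    DROP TABLE PCM_COST_MGMT;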

  • Lenni Kuff at Jul 8, 2014 at 8:29 pm
    Yes, that's what it appears to have done. I'm glad you were able to get
    unblocked; I believe your Hive Metastore may have been in an inconsistent
    state. I do have a few follow-on questions so we can root-cause this
    problem and fix it in a future release of Impala or Hive:

    * Are you able to reproduce this issue with a different table?
    * Do you remember how you got into this state? For example, did you happen
    to cancel a COMPUTE STATS or other DDL query?
    * If you still have the Hive Metastore, impalad, and catalogd logs from
    when this problem occurred, it would be useful to get them so we can attempt
    to reproduce the problem.

    Thanks,
    Lenni

  • Venkat Ankam at Jul 9, 2014 at 3:49 pm
    Lenni,

    * I am not able to reproduce the error; I tried with a few other tables
    today.
    * I did not cancel the COMPUTE STATS, though I have cancelled some other
    queries.
    * I will download the logs and send them to you if I hit the same problem
    again.

    Regards,
    Venkat Ankam

