Grokbase Groups: Sqoop dev, May 2014
Andrey Dmitriev created SQOOP-1329:
--------------------------------------

              Summary: JDBC connection to Oracle timeout after data import but before hive metadata import
                  Key: SQOOP-1329
                  URL: https://issues.apache.org/jira/browse/SQOOP-1329
              Project: Sqoop
           Issue Type: Bug
           Components: connectors/oracle
     Affects Versions: 1.4.4
          Environment: Red Hat Enterprise Linux Server release 6.5
             Reporter: Andrey Dmitriev
             Priority: Critical


When I'm importing a table from Oracle that takes more than 1 hour to extract, I get the following error message at the stage where Sqoop loads the data from the temporary HDFS location into Hive:

{quote}
14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Transferred 47.2606 GB in 6,389.4644 seconds (6.7206 MB/sec)
14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Retrieved 98235461 records.
14/05/27 13:05:51 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@566d0085
14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.inputTable: WAREHOUSE.MY_BIG_TABLE
14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.outputTable: WAREHOUSE.MY_BIG_TABLE
14/05/27 13:05:51 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
14/05/27 13:05:51 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
14/05/27 13:05:51 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again

java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again

  at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
  at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:389)
{quote}

With small tables (imports under 1 hour) everything is fine.

I'm using Sqoop v1.4.4:

{quote}
14/05/27 13:49:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.0
Sqoop 1.4.4-cdh5.0.0
git commit id 8e266e052e423af592871e2dfe09d54c03f6a0e8
{quote}

This problem looks exactly like the one described in SQOOP-934.
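
For context, ORA-02396 is raised when a session exceeds the IDLE_TIME resource limit configured in the Oracle user's profile: while the hour-plus MapReduce phase runs, the JDBC connection Sqoop opened for metadata queries sits idle, and the server expires it before the Hive import step needs it again. Below is a minimal sketch, not Sqoop's actual code, of detecting and recovering from that condition on a cached connection (the URL and credentials are placeholders):

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StaleConnectionDemo {

  // Vendor error code carried by "ORA-02396: exceeded maximum idle time".
  private static final int ORA_IDLE_TIME_EXCEEDED = 2396;

  /** Returns the cached connection if still alive, otherwise dials a new one. */
  static Connection ensureLive(Connection cached, String url,
                               String user, String password) throws SQLException {
    try (Statement st = cached.createStatement();
         ResultSet rs = st.executeQuery("SELECT 1 FROM DUAL")) {
      return cached; // cheap round trip succeeded; the session is still alive
    } catch (SQLException e) {
      if (e.getErrorCode() == ORA_IDLE_TIME_EXCEEDED) {
        try { cached.close(); } catch (SQLException ignored) { }
        return DriverManager.getConnection(url, user, password); // fresh session
      }
      throw e; // a different failure; do not mask it
    }
  }
}
{code}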





  • Andrey Dmitriev (JIRA) at May 29, 2014 at 10:54 pm
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Andrey Dmitriev updated SQOOP-1329:
    -----------------------------------

         Attachment: SQOOP-1329.patch

    Catches the expired connection and resets it.
  • Gwen Shapira (JIRA) at May 29, 2014 at 11:14 pm
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013082#comment-14013082 ]

    Gwen Shapira commented on SQOOP-1329:
    -------------------------------------

    Andrey,

    Is it tested?
    At first glance, I can't see why a connection that was timed out by the server would be null.
    I thought fixing the issue would require implementing discardConnection() for OracleManager, which would close and null all connections in the cache.

    If it's tested and resolves the issue, then never mind :)
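
    For illustration, here is a minimal sketch of the discardConnection() alternative described above, assuming a simple map-based connection cache (the class and field names are hypothetical, not OracleManager's actual internals):

    {code:java}
    import java.sql.Connection;
    import java.sql.SQLException;
    import java.util.HashMap;
    import java.util.Map;

    class CachedConnectionManager {
      // Hypothetical cache keyed by connect string, mirroring the idea of a
      // per-manager connection cache.
      private final Map<String, Connection> cache = new HashMap<String, Connection>();

      /** Closes and forgets every cached connection so callers must reconnect. */
      synchronized void discardConnections() {
        for (Connection conn : cache.values()) {
          try {
            conn.close(); // best effort; an expired session may fail on close()
          } catch (SQLException ignored) {
            // the session is already gone on the server side; discard it anyway
          }
        }
        cache.clear(); // drop the references so nothing can reuse a stale handle
      }
    }
    {code}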
  • Andrey Dmitriev (JIRA) at May 29, 2014 at 11:55 pm
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013115#comment-14013115 ]

    Andrey Dmitriev commented on SQOOP-1329:
    ----------------------------------------

    Hi Gwen,

    Yes, it is tested. Maybe this is not the best solution, but it works, solves the problem, and has minimal impact on the existing code.
    I set the connection to null to make sure that the next step sees no cached connection and establishes a new one.
    This is my first patch, so please let me know if I'm doing something wrong.

    Thank you,
    Andrey
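
    The attached patch itself does not appear inline in the thread; the following is a hedged reconstruction of the approach Andrey describes, nulling the cached connection so the next request lazily reopens it (field and method names are illustrative):

    {code:java}
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    class ReconnectingManager {
      private Connection connection; // cached across the long-running import

      /** Hands out the cached connection, replacing it if the session expired. */
      synchronized Connection getConnection() throws SQLException {
        if (connection != null && !isUsable(connection)) {
          connection = null; // forget the expired session
        }
        if (connection == null) {
          connection = makeConnection(); // lazily dial a fresh session
        }
        return connection;
      }

      private boolean isUsable(Connection conn) {
        try {
          return conn.isValid(2); // JDBC 4 validity probe, 2-second timeout
        } catch (SQLException e) {
          return false; // treat any failure as an expired session
        }
      }

      private Connection makeConnection() throws SQLException {
        // Placeholder: real code would reuse the job's saved connect options.
        return DriverManager.getConnection("jdbc:oracle:thin:@//dbhost:1521/ORCL");
      }
    }
    {code}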


  • Gwen Shapira (JIRA) at May 30, 2014 at 12:01 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013120#comment-14013120 ]

    Gwen Shapira commented on SQOOP-1329:
    -------------------------------------

    Thanks a million for the patch, Andrey.

    I support the patch, but since I'm not a committer, my +1 doesn't do much.
    Let's wait for one of the committers to review and comment.
  • Jarek Jarcec Cecho (JIRA) at May 30, 2014 at 1:39 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Jarek Jarcec Cecho reassigned SQOOP-1329:
    -----------------------------------------

         Assignee: Andrey Dmitriev
  • Jarek Jarcec Cecho (JIRA) at May 30, 2014 at 1:39 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013195#comment-14013195 ]

    Jarek Jarcec Cecho commented on SQOOP-1329:
    -------------------------------------------

    On the contrary, your +1 does count [~gwenshap]! I don't have further comments, so I'll commit it momentarily.
  • ASF subversion and git services (JIRA) at May 30, 2014 at 1:39 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013196#comment-14013196 ]

    ASF subversion and git services commented on SQOOP-1329:
    --------------------------------------------------------

    Commit d902d2449f60584de58a101fd128626944d6cd53 in sqoop's branch refs/heads/trunk from [~jarcec]
    [ https://git-wip-us.apache.org/repos/asf?p=sqoop.git;h=d902d24 ]

    SQOOP-1329: JDBC connection to Oracle timeout after data import but before hive metadata import

    (Andrey Dmitriev via Jarek Jarcec Cecho)

  • Jarek Jarcec Cecho (JIRA) at May 30, 2014 at 1:41 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Jarek Jarcec Cecho updated SQOOP-1329:
    --------------------------------------

         Fix Version/s: (was: 1.4.4)
                        1.4.5
  • Hudson (JIRA) at May 30, 2014 at 2:58 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013237#comment-14013237 ]

    Hudson commented on SQOOP-1329:
    -------------------------------

    SUCCESS: Integrated in Sqoop-ant-jdk-1.6-hadoop200 #891 (See [https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop200/891/])
    SQOOP-1329: JDBC connection to Oracle timeout after data import but before hive metadata import (jarcec: https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=d902d2449f60584de58a101fd128626944d6cd53)
    * src/java/org/apache/sqoop/manager/OracleManager.java

  • Hudson (JIRA) at May 30, 2014 at 3:16 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013244#comment-14013244 ]

    Hudson commented on SQOOP-1329:
    -------------------------------

    SUCCESS: Integrated in Sqoop-ant-jdk-1.6-hadoop20 #885 (See [https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop20/885/])
    SQOOP-1329: JDBC connection to Oracle timeout after data import but before hive metadata import (jarcec: https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=d902d2449f60584de58a101fd128626944d6cd53)
    * src/java/org/apache/sqoop/manager/OracleManager.java

  • Hudson (JIRA) at May 30, 2014 at 3:24 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013251#comment-14013251 ]

    Hudson commented on SQOOP-1329:
    -------------------------------

    SUCCESS: Integrated in Sqoop-ant-jdk-1.6-hadoop23 #1088 (See [https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop23/1088/])
    SQOOP-1329: JDBC connection to Oracle timeout after data import but before hive metadata import (jarcec: https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=d902d2449f60584de58a101fd128626944d6cd53)
    * src/java/org/apache/sqoop/manager/OracleManager.java

  • Hudson (JIRA) at May 30, 2014 at 3:58 am
    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013268#comment-14013268 ]

    Hudson commented on SQOOP-1329:
    -------------------------------

    SUCCESS: Integrated in Sqoop-ant-jdk-1.6-hadoop100 #850 (See [https://builds.apache.org/job/Sqoop-ant-jdk-1.6-hadoop100/850/])
    SQOOP-1329: JDBC connection to Oracle timeout after data import but before hive metadata import (jarcec: https://git-wip-us.apache.org/repos/asf?p=sqoop.git&a=commit&h=d902d2449f60584de58a101fd128626944d6cd53)
    * src/java/org/apache/sqoop/manager/OracleManager.java

