Exception when joining HIVE tables
Hi, I get this exception when I try to join two Hive tables, or even when I use a specific WHERE clause. "SELECT *" from any individual table seems to work fine. Any idea what is missing here? I am on Hive version hive-0.7.0-cdh3u0.

java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
at org.apache.hadoop.fs.Path.<init>(Path.java:90)
at org.apache.hadoop.fs.Path.<init>(Path.java:50)
at org.apache.hadoop.mapred.JobClient.copyRemoteFiles(JobClient.java:608)
at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:713)
at org.apache.hadoop.mapred.JobClient.copyAndConfigureFiles(JobClient.java:637)
at org.apache.hadoop.mapred.JobClient.access$300(JobClient.java:170)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:848)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
Job Submission failed with exception 'java.lang.IllegalArgumentException(Can not create a Path from an empty string)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
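
The failure happens inside JobClient.copyAndConfigureFiles, the step that stages auxiliary files and jars for the MapReduce job, so one hedged thing to rule out (not a confirmed cause) is an empty entry in the file lists Hive hands to Hadoop, e.g. a blank value or stray comma in hive.aux.jars.path, or an empty ADD FILE/ADD JAR resource. A quick check from the shell:

    hive -e 'set hive.aux.jars.path;'    # a blank or trailing-comma value here can become an empty Path
    hive -e 'list jars; list files;'     # resources added with ADD JAR / ADD FILE (e.g. from a .hiverc)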

  • Koert Kuipers at Sep 21, 2011 at 8:49 pm
    "select * from table" does not use map-reduce
    so it seems your error has to do with hadoop/map-reduce, not hive
    i would run some test for map-reduce
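
    A minimal way to run such a test, bypassing Hive entirely, is to submit one of the example jobs that ships with Hadoop; the jar name and location below are typical for a CDH3 install and may differ on other setups:

        hadoop jar /usr/lib/hadoop/hadoop-examples.jar pi 2 100      # estimate pi with 2 map tasks
        # if the examples jar lives elsewhere:
        hadoop jar $HADOOP_HOME/hadoop-*-examples.jar pi 2 100

    If this job also fails to submit, the problem is in the Hadoop/MapReduce setup rather than in Hive.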

    On Wed, Sep 21, 2011 at 4:11 PM, Krish Khambadkone wrote:
    Hi, I get this exception when I try to join two hive tables or even when I
    use a specific WHERE clause. "SELECT *" from any individual table seems to
    work fine. Any idea what is missing here. I am on hive version
    hive-0.7.0-cdh3u0.

  • Ayon Sinha at Sep 21, 2011 at 9:11 pm
    If you can share details of your tables and query, we might be able to help. Do a desc formatted <tablename>.

    -Ayon
    See My Photos on Flickr
    Also check out my Blog for answers to commonly asked questions.



    ________________________________
    From: Krish Khambadkone <kkhambadkone@apple.com>
    To: user@hive.apache.org
    Sent: Wednesday, September 21, 2011 1:11 PM
    Subject: Exception when joining HIVE tables

    Hi,  I get this exception when I try to join two hive tables or even when I use a specific WHERE clause.  "SELECT *" from any individual table seems to work fine.  Any idea what is missing here.  I am on hive version hive-0.7.0-cdh3u0.

  • Krish Khambadkone at Sep 21, 2011 at 9:46 pm
    Here is the table info, and the query is "select acctbal, availqty, partkey from partsupp JOIN supplier ON (partsupp.suppkey == supplier.suppkey); "

    desc formatted supplier;
    OK
    # col_name            data_type           comment

    key                 string              None
    acctbal             string              None
    address             string              None
    name                string              None
    nationkey           bigint              None
    phone               string              None
    suppkey             bigint              None

    # Detailed Table Information
    Database: default
    Owner: hadoop
    CreateTime: Wed Sep 21 13:05:50 PDT 2011
    LastAccessTime: UNKNOWN
    Protect Mode: None
    Retention: 0
    Location: hdfs://localhost:9000/user/hive/warehouse/supplier
    Table Type: MANAGED_TABLE
    Table Parameters:
    transient_lastDdlTime 1316635649

    # Storage Information
    SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
    InputFormat: org.apache.hadoop.mapred.TextInputFormat
    OutputFormat: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
    Compressed: No
    Num Buckets: -1
    Bucket Columns: []
    Sort Columns: []
    Storage Desc Params:
    field.delim ,
    serialization.format ,
    Time taken: 0.213 seconds


    desc formatted partsupp;
    OK
    # col_name            data_type           comment

    key                 string              None
    availqty            int                 None
    partkey             bigint              None
    suppkey             bigint              None
    supplycost          double              None

    # Detailed Table Information
    Database: default
    Owner: hadoop
    CreateTime: Wed Sep 21 13:05:37 PDT 2011
    LastAccessTime: UNKNOWN
    Protect Mode: None
    Retention: 0
    Location: hdfs://localhost:9000/user/hive/warehouse/partsupp
    Table Type: MANAGED_TABLE
    Table Parameters:
    transient_lastDdlTime 1316635698

    # Storage Information
    SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
    InputFormat: org.apache.hadoop.mapred.TextInputFormat
    OutputFormat: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
    Compressed: No
    Num Buckets: -1
    Bucket Columns: []
    Sort Columns: []
    Storage Desc Params:
    field.delim ,
    serialization.format ,
    Time taken: 2.192 seconds

    On Sep 21, 2011, at 2:10 PM, Ayon Sinha wrote:

    If you can share details of your tables and query we might be able to help. Do a desc formatted <tablename>

  • Ayon Sinha at Sep 21, 2011 at 10:01 pm
    I'm a bit concerned about port 9000 for the HDFS location. Is your namenode at port 9000? Can you run
    hadoop dfs -ls  hdfs://localhost:9000/user/hive/warehouse/supplier

    -Ayon
    See My Photos on Flickr
    Also check out my Blog for answers to commonly asked questions.
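
    One hedged way to double-check which NameNode URI the clients are actually configured with (the config path below assumes a typical CDH3 layout; adjust it for your install):

        grep -A 1 fs.default.name /etc/hadoop/conf/core-site.xml
        hadoop fs -ls /                    # uses the configured default filesystem, no explicit URI
        netstat -an | grep 9000            # is anything listening on port 9000?

    If the configured fs.default.name does not match the hdfs://localhost:9000 stored in the table Location, that mismatch is worth ruling out.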



    ________________________________
    From: Krish Khambadkone <kkhambadkone@apple.com>
    To: user@hive.apache.org
    Sent: Wednesday, September 21, 2011 2:45 PM
    Subject: Re: Exception when joining HIVE tables


    Here is the table info,   and the query is "select acctbal, availqty, partkey from partsupp JOIN supplier ON (partsupp.suppkey == supplier.suppkey); "

  • Krish Khambadkone at Sep 21, 2011 at 10:32 pm
    It does return this:

    -rw-r--r-- 3 hadoop supergroup 99 2011-09-21 13:07 /user/hive/warehouse/supplier/supplier.txt
    On Sep 21, 2011, at 3:01 PM, Ayon Sinha wrote:

    I'm a bit concerned about port 9000 for the HDFS location. Is your namenode at port 9000? Can you run
    hadoop dfs -ls hdfs://localhost:9000/user/hive/warehouse/supplier

  • Krish Khambadkone at Sep 22, 2011 at 2:29 am
    I am now getting this exception; all my Hadoop and HBase processes are running (nn, dnn, dn, jt, tt, hm, hr, hq).

    java.net.ConnectException: Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1131)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy4.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:183)
    at org.apache.hadoop.hive.ql.exec.Utilities.isEmptyPath(Utilities.java:1475)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPaths(ExecDriver.java:1253)
    at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:632)
    at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
    Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:408)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:425)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:532)
    at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:210)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1244)
    at org.apache.hadoop.ipc.Client.call(Client.java:1075)

    On Sep 21, 2011, at 3:01 PM, Ayon Sinha wrote:

    I'm a bit concerned about port 9000 for the HDFS location. Is your namenode at port 9000? Can you run
    hadoop dfs -ls hdfs://localhost:9000/user/hive/warehouse/supplier
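
    The "Connection refused" above usually means nothing is listening on localhost:9000 at that moment, even if the daemons were started earlier. A quick sanity check, assuming a standard Hadoop 0.20/CDH3 install (the log path is only an example and varies by setup):

        jps                                        # a NameNode entry should appear in the list
        hadoop dfsadmin -report                    # talks to the NameNode via the configured fs.default.name
        tail -n 100 /var/log/hadoop/*namenode*     # look for bind or startup errors in the NameNode log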

