Hello friends,
I have built my own Java application that performs some
map-reduce operations on input files. I have loaded my files into HDFS
at the following paths:
/user/sam/input/1.txt
/user/sam/input/corrected
/user/sam/input/in

When I use the command $hadoop dfs -cat /user/sam/input/1.txt, it outputs
the contents of the file correctly. My application refers to the files on
HDFS as Java strings, as follows:
String str = "hdfs://192.168.1.1:9000/user/sam/input";

String file1 = str + "1.txt";
String file2 = str + "Corrected";

Here file1 and file2 are fed as input to my mapper functions. After starting
my daemons, I ran my application as follows:
$hadoop jar maximum.jar /user/sam/input/in output

It generates the following error:
java.io.FileNotFoundException: hdfs://192.168.1.1:9000/user/sam/input/1.txt
(No such file or directory)

But when I type $hadoop dfs -cat
hdfs://192.168.1.1:9000/user/sam/input/1.txt, it outputs the contents
of the file correctly.

I also tried the following alternatives:
String str = "/user/sam/input/";
String str = "hdfs:/user/sam/input";

But none of the above paths works.

Could anyone point out the possible mistake? Any suggestions are
welcome.

Thanks,
Bharath


--
View this message in context: http://old.nabble.com/Java-run-time-error-while-executing-my-application---unable-to-find-the-files-on-the-HDFS-tp28843314p28843314.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.

  • Ted Yu at Jun 10, 2010 at 4:23 pm
In the future, post a snippet of your code.

I suggest you take a look at this method in
core/org/apache/hadoop/fs/FsShell.java:
private FileSystem getSrcFileSystem(Path src, boolean verifyChecksum)
    throws IOException {
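
    A minimal sketch of the idea behind that method (illustrative code
    using the standard Hadoop FileSystem API, not the asker's
    application): the FileSystem is resolved from the Path's own scheme
    and authority instead of being assumed to be the local disk, which
    is what a plain java.io.FileReader would use.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCat {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            Path src = new Path("hdfs://192.168.1.1:9000/user/sam/input/1.txt");
            // Resolve the FileSystem from the path's URI, the way
            // FsShell.getSrcFileSystem does; java.io would only look
            // on the local disk and throw FileNotFoundException.
            FileSystem fs = src.getFileSystem(conf);
            BufferedReader in =
                new BufferedReader(new InputStreamReader(fs.open(src)));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }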
  • Boris Shkolnik at Jun 10, 2010 at 9:05 pm
    Bharath,
One thing:
String str = "hdfs://192.168.1.1:9000/user/sam/input";

String file1 = str + "1.txt";
will result in "hdfs://192.168.1.1:9000/user/sam/input1.txt", with no "/"
between the directory and the file name.
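
    For illustration, a small sketch of the difference (assuming only
    org.apache.hadoop.fs.Path on the classpath; the two-argument Path
    constructor inserts the separator for you):

    import org.apache.hadoop.fs.Path;

    public class PathJoin {
        public static void main(String[] args) {
            String str = "hdfs://192.168.1.1:9000/user/sam/input";

            // Plain string concatenation silently drops the separator:
            System.out.println(str + "1.txt");       // ...input1.txt  (wrong)
            System.out.println(str + "/" + "1.txt"); // ...input/1.txt (right)

            // Path(parent, child) handles the separator itself:
            System.out.println(new Path(str, "1.txt"));
        }
    }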

Second: can you please paste the code that opens the file? Which method are
you using?
