Hey all,

I'm trying to write a simple YARN-based application but I'm stuck at
writing files to HDFS. I've managed to figure out the mechanics of running
jar files in containers on the various machines, but when trying to create
a FileSystem object, I get the following exception:

java.io.IOException: No FileSystem for scheme: hdfs
         at
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
         at
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
         at
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:158)

The line causing the error is this one:

FileSystem hdfs = FileSystem.get(conf);

The conf object (an org.apache.hadoop.conf.Configuration) is built from the
core-site.xml and hdfs-site.xml files. I'm running the JAR as a user called
"yarn", and the directory I'm trying to access does exist, though we are not
near that point yet - I'm just trying to create the FileSystem object.
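For reference, a minimal sketch of the setup described above. The config file paths and class name here are assumptions for illustration, not details from the thread; the failing call is the same FileSystem.get(conf) line:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsTouch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed config location; adjust to your installation.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        // This is the call that throws "No FileSystem for scheme: hdfs"
        // when no HDFS implementation is visible on the classpath.
        FileSystem hdfs = FileSystem.get(conf);
        System.out.println(hdfs.getUri());
    }
}
```

Note this only reproduces the failure mode; the fix discussed below is about what is on the classpath, not about the code itself.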

I've searched Google, and the only help I found relates to a bug that used to
be in HDFS 2.0.0-alpha. I checked the fs.default.name value in the
configuration, and it is reported as "hdfs://namenode.local:8020", which
seems right.

Any help/ideas would be greatly appreciated.

Thanks,
Thinus


  • Thinus Prinsloo at Aug 8, 2012 at 2:56 pm
    Ok, I fixed it by just running the distributed jar with "hadoop jar",
    instead of trying to execute a standalone "java -jar".
  • Sambit Tripathy at Sep 12, 2012 at 5:11 am
    Yes, it works. You have to build the jar with all required dependencies
    included.
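    For readers bundling a fat/uber jar: a common pitfall is that naive jar
    merging clobbers the META-INF/services registration files that Hadoop's
    FileSystem loader reads. With Maven (an assumption about your build, not
    something stated in this thread), the shade plugin's
    ServicesResourceTransformer merges those files instead:

    ```xml
    <!-- pom.xml sketch: merge META-INF/services entries when shading,
         so the hadoop-hdfs FileSystem registration is not lost. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ```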
    On Wednesday, August 8, 2012 8:26:27 PM UTC+5:30, Thinus Prinsloo wrote:

    Ok, I fixed it by just running the distributed jar with "hadoop jar",
    instead of trying to execute a standalone "java -jar".
  • Jghuang at Sep 14, 2012 at 6:22 pm
    I have received similar error. Since I am using C, I cannot create a jar
    and use "hadoop jar" to run it.

    I have read somewhere that Phil Z. said the way to correct it is to add the
    following to hdfs-site.xml. Because that posting was from 2009, is it still
    valid?
    <property>
       <name>fs.hdfs.impl</name>
       <value>org.apache.hadoop.dfs.DistributedFileSystem</value>
       <description>The FileSystem for hdfs: uris.</description>
    </property>
    The error:
    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further
    details.
    12/09/14 14:11:49 ERROR security.UserGroupInformation:
    PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
    No FileSystem for scheme: hdfs
    Exception in thread "main" java.io.IOException: No FileSystem for scheme:
    hdfs
             at
    org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
             at
    org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
             at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
             at
    org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2184)
             at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2166)
             at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:302)
             at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:148)
             at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:146)
             at java.security.AccessController.doPrivileged(Native Method)
             at javax.security.auth.Subject.doAs(Subject.java:396)
             at
    org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
             at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:146)
    Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
  • Harsh J at Sep 15, 2012 at 2:43 am
    Hi,

    You do not need to do that if you have the proper hdfs jars on the
    classpath of your application. Is there a way to know if that is
    included properly?
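    One way to check, assuming a standard install with the `hadoop` launcher
    script on the PATH (the jar and class names below are placeholders):

    ```shell
    # Print the classpath the hadoop launcher uses and verify it
    # includes the hadoop-hdfs jars.
    hadoop classpath | tr ':' '\n' | grep -i hdfs

    # Run a standalone client with that same classpath instead of "java -jar".
    java -cp "myapp.jar:$(hadoop classpath)" MyMainClass
    ```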


    --
    Harsh J
  • Jghuang at Sep 17, 2012 at 5:37 pm
    Classpath:
    /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar:.:/usr/lib/hadoop/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop-0.20-mapreduce/hadoop-core.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-lang-2.5.jar:/usr/lib/hadoop/lib/commons-logging-api-1.1.1.jar:/usr/lib/hadoop/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/etc/hadoop:/usr/lib/hadoop/core-3.1.1.jar:/usr/lib/hadoop/hadoop-client-3.1.1.jar


    On Friday, September 14, 2012 10:43:13 PM UTC-4, Harsh J wrote:

    Hi,

    You do not need to do that if you have the proper hdfs jars on the
    classpath of your application. Is there a way to know if that is
    included properly?
  • Kunpengk at Sep 4, 2013 at 8:41 am
    I've got a similar problem with "java -jar xx.jar" in hadoop-2.0.5-alpha:
    java.io.IOException: No FileSystem for scheme: file

    but it works well when running with "hadoop jar".

    When I add the following config to core-default.xml, it works with "java
    -jar":

    <property>
       <name>fs.file.impl</name>
       <value>org.apache.hadoop.fs.LocalFileSystem</value>
       <description>The FileSystem for file: uris.</description>
    </property>

    <property>
       <name>fs.hdfs.impl</name>
       <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
       <description>The FileSystem for hdfs: uris.</description>
    </property>

    So, maybe it is not a problem of missing required dependencies. I don't
    know why, but it works!
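    A sketch of an equivalent workaround that avoids editing core-default.xml
    (which ships inside the Hadoop jar and is generally best left untouched):
    set the same two keys programmatically on the Configuration before calling
    FileSystem.get. This assumes the Hadoop 2.x API:

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class FsImplWorkaround {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Same mappings as the XML properties, set in code instead.
            conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
            conf.set("fs.hdfs.impl",
                     "org.apache.hadoop.hdfs.DistributedFileSystem");
            FileSystem fs = FileSystem.get(conf);
            System.out.println(fs.getClass().getName());
        }
    }
    ```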


Discussion Overview
group: scm-users
categories: hadoop
posted: Aug 8, '12 at 9:28a
active: Sep 4, '13 at 8:41a
posts: 7
users: 5
website: cloudera.com
irc: #hadoop
