Hi,

I am trying to develop and run Hadoop tests in NetBeans on Windows, and I
have encountered a few problems (no surprise; however, I do believe Hadoop
development should be as platform independent as possible...):

1) The "generate-test-records" target produces some Java files under the
build\test\src folder. Why? This forces me to set a dependency on the
build\test\src folder in NetBeans, which is a problem under Windows because the
system holds a lock and I am not able to delete this folder. So the "clean"
target cannot be executed. Does anybody know how to work around this, or is
there any chance these generated files can be located outside the build folder?

2) for some tests I am getting the following exception:
Testcase: testCopyFromLocalToLocal(org.apache.hadoop.fs.TestCopyFiles):
Caused an ERROR
No FileSystem for scheme: file
java.io.IOException: No FileSystem for scheme: file
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1267)
    at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1281)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
    at org.apache.hadoop.fs.TestCopyFiles.createFiles(TestCopyFiles.java:99)
    at org.apache.hadoop.fs.TestCopyFiles.testCopyFromLocalToLocal(TestCopyFiles.java:226)

What is the problem here? Did I forget to put a specific config file on the
CLASSPATH?

As of now, my CLASSPATH (or Package Folders, in NetBeans terms) is set up
as follows:

Source Package Folders:
src\java
src\examples
src\contrib\data_join\src\java
src\contrib\data_join\src\examples
src\contrib\streaming\src\java

Test Package Folders:
src\test
src\contrib\streaming\src\test
build\test\src


I am quite new to Hadoop development but I would like to dive in...

Regards,
Lukas


  • Steve Loughran at May 22, 2008 at 10:43 am

    Lukas Vlcek wrote:
    > Hi,
    >
    > I am trying to develop and run Hadoop tests in NetBeans on Windows, and I
    > have encountered a few problems (no surprise; however, I do believe Hadoop
    > development should be as platform independent as possible...):
    >
    > 1) The "generate-test-records" target produces some Java files under the
    > build\test\src folder. Why? This forces me to set a dependency on the
    > build\test\src folder in NetBeans, which is a problem under Windows because
    > the system holds a lock and I am not able to delete this folder.
    Ahh, I hate Windows file locks, especially the way you get told off for
    the lock but not told who holds it. I hear rumours that the server editions
    can delete locked files anyway, which almost makes it worth paying the premium for.
    > So the "clean" target cannot be executed. Does anybody know how to work
    > around this, or is there any chance these generated files can be located
    > outside the build folder?
    That's part of the JSP-to-Java compile process. You could probably
    fiddle with build.properties to put the generated source files elsewhere:
    create the file build.properties and add a line like

    build.src=generatedsource

    This should put the generated source into the generatedsource directory.
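
    For completeness, a sketch of the whole file plus a quick check that the
    override took effect (the directory name is only an example, and whether this
    exact property is honoured by generate-test-records depends on the build.xml
    in your checkout):

    # build.properties, placed in the root of the Hadoop checkout
    build.src=generatedsource

    ant clean generate-test-records

    If the override works, the generated test sources land under generatedsource\
    instead of build\test\src, so NetBeans no longer needs a source root inside
    build\ and the "clean" target can delete the build folder freely.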

    > 2) For some tests I am getting the following exception:
    >
    > Testcase: testCopyFromLocalToLocal(org.apache.hadoop.fs.TestCopyFiles):
    > Caused an ERROR
    > No FileSystem for scheme: file
    > java.io.IOException: No FileSystem for scheme: file
    >     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1267)
    >     at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
    >     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1281)
    >     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
    >     at org.apache.hadoop.fs.TestCopyFiles.createFiles(TestCopyFiles.java:99)
    >     at org.apache.hadoop.fs.TestCopyFiles.testCopyFromLocalToLocal(TestCopyFiles.java:226)
    >
    > What is the problem here? Did I forget to put a specific config file on the
    > CLASSPATH?
    Yes: hadoop-default.xml, which contains the mapping property:

    <property>
      <name>fs.file.impl</name>
      <value>org.apache.hadoop.fs.LocalFileSystem</value>
      <description>The FileSystem for file: uris.</description>
    </property>

    Fix: add conf/ to your classpath.
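
    Once conf/ is on the classpath, new Configuration() picks that property up
    automatically. If adding conf/ as a classpath entry in NetBeans is awkward, a
    minimal fallback sketch (not from this thread; the class name is made up and
    the exact behaviour depends on the Hadoop version) is to set the mapping in
    code before the first FileSystem.get():

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class LocalFsCheck {  // hypothetical helper, only to verify the setup
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Normally this mapping comes from hadoop-default.xml on the classpath;
        // setting it here only papers over a missing conf/ entry.
        conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Resolved " + fs.getClass().getName()
            + " for the default (file:) scheme");
      }
    }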
    > As of now, my CLASSPATH (or Package Folders, in NetBeans terms) is set up
    > as follows:
    >
    > Source Package Folders:
    > src\java
    > src\examples
    > src\contrib\data_join\src\java
    > src\contrib\data_join\src\examples
    > src\contrib\streaming\src\java
    >
    > Test Package Folders:
    > src\test
    > src\contrib\streaming\src\test
    > build\test\src
    There is a bit of a bias towards Linux/Unix work on Hadoop, which can
    complicate Windows development. Now is possibly a good time to get more
    RAM and a copy of the free VMware client if you haven't already, and put
    together a Linux VM, so that you can set up a cluster on your own desktop.

    -steve
  • Lukas Vlcek at May 22, 2008 at 1:46 pm
    Well, this makes me think that I should rather try to boot from a
    preinstalled Ubuntu USB drive. I can't get rid of Windows since the notebook
    is not mine.

    Anyway, thanks a lot!

    Lukas

    --
    http://blog.lukas-vlcek.com/

Discussion Overview
group: common-dev
categories: hadoop
posted: May 21, '08 at 9:55a
active: May 22, '08 at 1:46p
posts: 3
users: 2
website: hadoop.apache.org...
irc: #hadoop

2 users in discussion

Lukas Vlcek: 2 posts, Steve Loughran: 1 post
