Hello everyone,

I am getting the error "java.io.IOException: Could not obtain block:" when
running a job on my new cluster. When I ran the same job on a single node
it worked perfectly; then I added the second node and started receiving
this error. I was running the grep example job.

I am running Hadoop 0.19.2 because of a dependency on Nutch (even though
this was not a Nutch job). I am not running HBase, and the version of Java
is OpenJDK 1.6.0.

Does anybody have any ideas?

Thanks in advance,

-John


  • Edmund Kohlwey at Nov 11, 2009 at 2:31 am
    I've not encountered an error like this, but here are some suggestions:

    1. Make sure that your two-node cluster is set up correctly. Querying
    the web interface, using any of the included dfs utilities (e.g.
    hadoop dfs -ls), or looking in your log directory may yield more useful
    stack traces or errors. (A programmatic version of this check is
    sketched after this list.)

    2. Open up the source and look at the code around the stack trace.
    This sucks, but Hadoop is actually pretty easy to browse in Eclipse,
    and most classes are kept to a reasonable number of lines of code and
    are fairly readable.

    3. Rip out the parts of Nutch you need, drop them into your project,
    and forget about 0.19. This isn't ideal, but remember that this whole
    ecosystem is still forming, and sometimes it makes sense to transplant
    code into your project rather than depend on 2-3 classes from a project
    you otherwise don't use.
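
    Here is a minimal sketch of the check in suggestion 1, assuming the
    0.19-era FileSystem API: it lists a directory much like hadoop dfs -ls
    does and then prints which hosts report each block, which is usually
    where a "Could not obtain block" problem becomes visible. The input
    path is only an illustrative placeholder.

        // Lists an HDFS directory and the hosts holding each block.
        // A file whose blocks report no hosts is a likely trigger for
        // "Could not obtain block" errors on the client side.
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.BlockLocation;
        import org.apache.hadoop.fs.FileStatus;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class BlockCheck {
          public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();  // reads hadoop-site.xml from the classpath
            FileSystem fs = FileSystem.get(conf);      // connects to fs.default.name
            Path dir = new Path(args.length > 0 ? args[0] : "/user/hadoop/input");

            FileStatus[] entries = fs.listStatus(dir);
            if (entries == null) {
              System.err.println("No such path: " + dir);
              return;
            }
            for (FileStatus status : entries) {
              System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
              if (status.isDir()) {
                continue;  // only plain files have block locations
              }
              BlockLocation[] blocks =
                  fs.getFileBlockLocations(status, 0, status.getLen());
              for (BlockLocation block : blocks) {
                // An empty host list means no reachable datanode reports this block.
                System.out.println("  block at offset " + block.getOffset()
                    + ", hosts: " + java.util.Arrays.toString(block.getHosts()));
              }
            }
          }
        }

    Run with the cluster configuration on the classpath, it should show the
    same listing as hadoop dfs -ls plus the datanodes behind each file.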
  • John Martyniak at Nov 11, 2009 at 4:51 am
    Edmund,

    Thanks for the advice. It turns out that it was the firewall running
    on the second cluster node.

    So I stopped it and all is working correctly. Now that I have the
    second node working the way it is supposed to, I am probably going to
    bring another couple of nodes online.
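
    In case anyone else hits this: the ports a DFS client needs to reach
    on each datanode in a stock 0.19-era install are, I believe, 50010
    (block transfers) and 50075 (the datanode web interface), plus whatever
    port fs.default.name uses on the namenode. A quick way to confirm the
    firewall is the culprit is a plain socket probe from the other node;
    the hostnames and ports below are placeholders for your own cluster.

        // Tries to open TCP connections to the ports a DFS client needs
        // on each datanode. A timeout or "connection refused" usually
        // means a firewall rule or a daemon that is not running.
        import java.net.InetSocketAddress;
        import java.net.Socket;

        public class PortProbe {
          public static void main(String[] args) {
            String[] datanodes = {"node1", "node2"};  // placeholder hostnames
            int[] ports = {50010, 50075};             // assumed 0.19-era datanode defaults

            for (String host : datanodes) {
              for (int port : ports) {
                Socket socket = new Socket();
                try {
                  socket.connect(new InetSocketAddress(host, port), 2000);  // 2 s timeout
                  System.out.println(host + ":" + port + " reachable");
                } catch (Exception e) {
                  System.out.println(host + ":" + port + " NOT reachable ("
                      + e.getMessage() + ")");
                } finally {
                  try { socket.close(); } catch (Exception ignored) { }
                }
              }
            }
          }
        }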

    Wish me luck :)

    -John

Discussion Overview
group: common-user
categories: hadoop
posted: Nov 10, '09 at 4:32p
active: Nov 11, '09 at 4:51a
posts: 3
users: 2 (John Martyniak: 2 posts, Edmund Kohlwey: 1 post)
website: hadoop.apache.org...
irc: #hadoop