Hadoop eclipse plugin stopped working after replacing hadoop-0.20.2 jar files with hadoop-0.20-append jar files

Guys,
I was using the Hadoop Eclipse plugin on a Hadoop 0.20.2 cluster, and it was
working fine for me.
I was using Eclipse SDK Helios 3.6.2 with the plugin
"hadoop-eclipse-plugin-0.20.3-SNAPSHOT.jar" downloaded from JIRA
MAPREDUCE-1280.

For an HBase installation I had to use the hadoop-0.20-append compiled jars,
so I replaced the old jar files with the new 0.20-append compiled jar files.
But after replacing them, my Hadoop Eclipse plugin is no longer working for
me.
Whenever I try to connect to my Hadoop master node from the plugin and browse
the DFS locations, it gives me the following error:

Error: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version
mismatch (client 41 server 43)

However, the Hadoop cluster itself is working fine if I go directly to the
namenode and use the hadoop commands: I can add files to HDFS and run jobs
from there, and the HDFS and Map-Reduce web consoles are also working fine.
I am just not able to use my previous Hadoop Eclipse plugin.

Any suggestions or help with this issue?

Thanks,
Praveenesh


  • Devaraj K at Jun 22, 2011 at 6:19 am
    The Hadoop Eclipse plugin also uses a hadoop-core.jar file to communicate
    with the Hadoop cluster, so the client needs the same version of
    hadoop-core.jar as the server (the Hadoop cluster).

    Update to the Hadoop Eclipse plugin that is provided with the
    hadoop-0.20-append release, and it should work fine.
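
    For readers wondering why the jar versions matter: Hadoop's RPC layer
    compares the protocol version compiled into the client against the one the
    server reports, and it refuses the connection if they differ. The Java
    sketch below is only a minimal, self-contained illustration of that kind
    of check, not Hadoop's actual RPC code; the constants 41 and 43 simply
    mirror the numbers in the error message quoted above.

    // Illustrative only: a toy version of the protocol-version handshake that
    // produces errors shaped like "version mismatch (client 41 server 43)".
    // In Hadoop 0.20.x the real comparison involves ClientProtocol.versionID,
    // which is compiled into hadoop-core.jar on both client and server.
    public class VersionCheckSketch {

        // Hypothetical stand-ins for the client's and server's versionID values.
        static final long CLIENT_PROTOCOL_VERSION = 41L; // e.g. the plugin's hadoop-0.20.2 jar
        static final long SERVER_PROTOCOL_VERSION = 43L; // e.g. the hadoop-0.20-append cluster

        static void connect(String protocolName, long clientVersion, long serverVersion) {
            if (clientVersion != serverVersion) {
                // Mirrors the shape of the error reported in this thread.
                throw new RuntimeException("Protocol " + protocolName
                        + " version mismatch (client " + clientVersion
                        + " server " + serverVersion + ")");
            }
            System.out.println("Connected with protocol version " + clientVersion);
        }

        public static void main(String[] args) {
            try {
                connect("org.apache.hadoop.hdfs.protocol.ClientProtocol",
                        CLIENT_PROTOCOL_VERSION, SERVER_PROTOCOL_VERSION);
            } catch (RuntimeException e) {
                System.out.println("Error : " + e.getMessage());
            }
        }
    }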


    Devaraj K

  • Praveenesh kumar at Jun 22, 2011 at 6:37 am
    I followed Michael Noll's tutorial for building the hadoop-0.20-append jars:

    http://www.michael-noll.com/blog/2011/04/14/building-an-hadoop-0-20-x-version-for-hbase-0-90-2/

    After following the article, we get 5 jar files that we need to swap in
    for the hadoop-0.20.2 jar files.
    There is no jar file for the Hadoop Eclipse plugin that I can see in my
    repository if I follow that tutorial.

    Also, JIRA MAPREDUCE-1280 has no information on whether the hadoop-plugin
    I am using is compatible with hadoop-0.20-append.

    Has anyone else faced this kind of issue?

    Thanks,
    Praveenesh

  • Devaraj K at Jun 22, 2011 at 7:05 am
    Every time Hadoop is built, it also builds the Hadoop Eclipse plug-in using
    the latest hadoop-core jar. In your case the Eclipse plug-in contains one
    version of the jar while the cluster is running another; that is why it
    gives the version mismatch error.

    Just replace the hadoop-core jar in your Eclipse plug-in with the jar the
    Hadoop cluster is using and check.
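
    As a rough illustration of that suggestion, the Java sketch below repacks
    the plugin jar so that the bundled hadoop-core jar is swapped for the one
    the cluster uses. This is a hedged sketch, not a tested procedure: the
    "lib/hadoop-core*" entry path is an assumption (check the real entry name
    with "jar tf" first), and if the plugin's MANIFEST refers to the bundled
    jar by its exact file name, that reference may need updating as well.

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.Enumeration;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;
    import java.util.zip.ZipOutputStream;

    public class RepackPluginJar {
        public static void main(String[] args) throws IOException {
            // args: <plugin jar> <cluster's hadoop-core jar> <output plugin jar>
            Path pluginJar = Paths.get(args[0]);
            Path newCoreJar = Paths.get(args[1]);
            Path outJar = Paths.get(args[2]);

            try (ZipFile in = new ZipFile(pluginJar.toFile());
                 ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(outJar))) {

                String replacedEntry = null;
                Enumeration<? extends ZipEntry> entries = in.entries();
                while (entries.hasMoreElements()) {
                    ZipEntry e = entries.nextElement();
                    // Assumed entry path: skip the old bundled hadoop-core jar.
                    if (e.getName().startsWith("lib/hadoop-core") && e.getName().endsWith(".jar")) {
                        replacedEntry = e.getName();
                        continue;
                    }
                    // Copy every other entry unchanged.
                    out.putNextEntry(new ZipEntry(e.getName()));
                    if (!e.isDirectory()) {
                        try (InputStream is = in.getInputStream(e)) {
                            is.transferTo(out);
                        }
                    }
                    out.closeEntry();
                }

                // Put the cluster's hadoop-core jar where the old one lived
                // (or into lib/ if no bundled copy was found).
                String target = (replacedEntry != null)
                        ? replacedEntry
                        : "lib/" + newCoreJar.getFileName();
                out.putNextEntry(new ZipEntry(target));
                Files.copy(newCoreJar, out);
                out.closeEntry();
            }
        }
    }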



    Devaraj K

  • Praveenesh kumar at Jun 22, 2011 at 5:23 pm
    I am doing that and it's not working. If I replace the hadoop-core jar
    inside the plugin jar, I am not able to see the Map/Reduce perspective at
    all.
    Guys, any help?

    Thanks,
    Praveenesh
  • Yaozhen Pan at Jun 23, 2011 at 3:17 am
    Hi,

    I am using Eclipse Helios Service Release 2.
    I encountered a similar problem (the Map/Reduce perspective failed to load)
    when upgrading the Eclipse plugin from the 0.20.2 to the 0.20.3-append
    version.

    I compared the source code of the Eclipse plugin and found only a few
    differences, so I tried reverting the differences one by one to see if it
    would work.
    What surprised me was that when I only reverted the jar name from
    "hadoop-0.20.3-eclipse-plugin.jar" to "hadoop-0.20.2-eclipse-plugin.jar",
    it worked in Eclipse.

    Yaozhen

  • 叶达峰 (Jack Ye) at Jun 23, 2011 at 3:31 am
    Do you use Hadoop 0.20.203.0?
    I also have a problem with this plugin.

  • Yaozhen Pan at Jun 23, 2011 at 3:53 am
    Hi,

    Our Hadoop version was built on 0.20-append with a few patches.
    However, I didn't see big differences in the eclipse-plugin.

    Yaozhen
  • 叶达峰 (Jack Ye) at Jun 23, 2011 at 4:19 am
    I used 0.20.203.0 and can't access the DFS locations.
    The following is the error:
    failure to login
    internal error: "Map/Reduce location status updater"
    org/codehaus/jackson/map/JsonMappingException
