This most likely means that your upgrade of the jar files, etc. was not
clean (note that I am talking about the jar files on disk and not the dfs
upgrade). Somewhere there is a hadoop*.jar that is not the 0.13 one.
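A minimal sketch of a per-node check, assuming HADOOP_HOME points at the install directory (paths are examples; adjust them to your layout):

    # list every hadoop*.jar under the install tree so any pre-0.13 jar stands out
    find $HADOOP_HOME -name 'hadoop*.jar' -print
    # confirm which version the scripts actually pick up
    $HADOOP_HOME/bin/hadoop version

Any jar that is not the 0.13 one should be replaced (or removed) before restarting the daemons.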

-----Original Message-----
From: Phantom
Sent: Wednesday, June 13, 2007 8:32 AM
To: hadoop-user@lucene.apache.org
Subject: Upgrade to hadoop-0.13 failing

Exception in thread "main" org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol
org.apache.hadoop.dfs.ClientProtocol version mismatch. (client = 11, server = 9)
dev030.sctm.com:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:254)
dev030.sctm.com:     at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:93)
dev030.sctm.com:     at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:468)

How do I fix the above? I just ran ./start-dfs.sh -upgrade.

Thanks
Avinash

Discussion Overview
group: common-user @ hadoop.apache.org
categories: hadoop
posted: Jun 13, '07 at 3:02a; last active: Jun 13, '07 at 6:08a
posts: 2, users: 2 (Devaraj Das: 1 post, Phantom: 1 post)
website: hadoop.apache.org...
irc: #hadoop
