Praveenesh,

You can build Pig without a bundled Hadoop jar and then run it against your
append-branch Hadoop jar. That should resolve this for you.

(Basically, your problem is that the DFS protocol versions in hadoop-append
and regular Hadoop differ, so a Pig that comes prebuilt with a vanilla
Hadoop jar included will not be able to communicate with a cluster that's
running append. Pig has a "without Hadoop" build option for exactly this
purpose; I am sure you'll find more information searching in this direction
on search-hadoop.com, etc.)
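A minimal sketch of that build-and-run approach, assuming a Pig 0.8.0 source checkout and that the install paths below match your machine (the `jar-withouthadoop` Ant target name and all paths here are assumptions to adapt, not taken from this thread):

```shell
# From the Pig source root: build a pig.jar that does not bundle hadoop-core.
ant clean jar-withouthadoop

# Point Pig at the cluster's own Hadoop build and config instead of a bundled copy.
export HADOOP_HOME=/usr/local/hadoop                    # hadoop-0.20-append install (adjust)
export PIG_CLASSPATH=$HADOOP_HOME/hadoop-core-*.jar:$HADOOP_HOME/conf

# Pig now talks to HDFS with the same client classes the cluster runs.
bin/pig
```

With the cluster's own hadoop-core jar on the classpath, the HDFS client and the NameNode agree on the ClientProtocol version, which is what the RPC$VersionMismatch error in this thread is complaining about.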
On Mon, Jul 4, 2011 at 10:29 AM, praveenesh kumar wrote:
Hi,

There is no Hadoop jar in my Pig lib directory. I tried copying my Hadoop
jar files into the Pig lib folder, and I also tried adding that jar file to
the Pig lib path, but the error is still the same.
Is there any other way to make it run with the hadoop-0.20-append version?
Guys, I am stuck on this issue and need your guidance.

Thanks,
Praveenesh
On Sat, Jul 2, 2011 at 1:36 PM, Joey Echeverria wrote:

Try replacing the hadoop jar from the pig lib directory with the one from
your cluster.
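That swap can be sketched as follows (a hedged sketch: the paths and jar names are assumptions for a typical tarball install, and some Pig builds bundle the Hadoop classes inside pig.jar rather than as a separate jar in lib/, in which case there is nothing in lib/ to replace this way):

```shell
PIG_HOME=/usr/local/pig          # adjust to your install
HADOOP_HOME=/usr/local/hadoop    # the hadoop-0.20-append install the cluster runs

# Set aside the vanilla hadoop jar Pig shipped with, then drop in the cluster's jar.
mv "$PIG_HOME"/lib/hadoop*.jar /tmp/   # keep a backup of the bundled jar
cp "$HADOOP_HOME"/hadoop-core-*.jar "$PIG_HOME"/lib/
```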

-Joey


On Jul 2, 2011, at 0:38, praveenesh kumar wrote:

Hi guys..



I have been using Hadoop and HBase.

For HBase to run properly, its jar files need Hadoop-0.20-append, so I am
using the Hadoop-0.20-append jar files, which made both my Hadoop and HBase
work fine.

Now I want to use Pig with my Hadoop and HBase clusters.

I downloaded Pig 0.8.0 and configured it to run in map-reduce mode by
setting PIG_CLASSPATH to point to the $HADOOP_HOME/conf directory. Running
'pig' then gives the following error message.



hadoop@ub13:/usr/local/pig/bin$ pig

2011-07-01 17:41:52,150 [main] INFO  org.apache.pig.Main - Logging error messages to: /usr/local/pig/bin/pig_1309522312144.log

2011-07-01 17:41:52,454 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://ub13:54310

2011-07-01 17:41:52,654 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage

LOG MESSAGE -----

Error before Pig is launched---------------------------

ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
	at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
	at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
	at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
	at org.apache.pig.PigServer.<init>(PigServer.java:226)
	at org.apache.pig.PigServer.<init>(PigServer.java:215)
	at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
	at org.apache.pig.Main.run(Main.java:452)
	at org.apache.pig.Main.main(Main.java:107)
Caused by: org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client = 41, server = 43)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
	... 9 more
================================================================================
I guess the problem is a version mismatch between the hadoop-append-core
jar files that my Hadoop/HBase clusters are currently using and the
hadoop-core jar files that Pig is using. Has anyone faced a similar issue?
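One way to confirm a mismatch like this (a sketch, not part of the original thread: the jar paths and names are assumptions, and it assumes a JDK whose `javap` supports the `-constants` flag) is to print the `versionID` constant that the RPC layer compares on each side:

```shell
# The client=41 / server=43 numbers in the error are ClientProtocol.versionID
# values compiled into each hadoop-core build; print them from both jars.
javap -constants -classpath /usr/local/pig/pig-0.8.0.jar \
    org.apache.hadoop.hdfs.protocol.ClientProtocol | grep versionID
javap -constants -classpath /usr/local/hadoop/hadoop-core-*.jar \
    org.apache.hadoop.hdfs.protocol.ClientProtocol | grep versionID
```

If the two numbers differ, the HDFS client classes Pig loads cannot talk to the NameNode, which matches the RPC$VersionMismatch in the trace.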
The documentation website lists hadoop-0.20.2 as the requirement, but I
want to keep using my Hadoop and HBase alongside Pig.

Any suggestions on how to resolve this issue?
Could anyone also mention which versions of each of these are compatible
with one another, so I can put them into production?

Thanks,
Praveenesh


--
Harsh J

Discussion Overview
group: common-user
categories: hadoop
posted: Jul 2, '11 at 7:39a
active: Jul 4, '11 at 7:30a
posts: 4
users: 3
website: hadoop.apache.org...
irc: #hadoop
