Hi,

We're trying to deploy a C++ MapReduce app developed with the Hadoop
Pipes API on a large cluster. The app fails to start because its
shared libs are not present on the cluster nodes. For local testing we
set up a two-node cluster on our dev boxes, where all libs are in a
standard location (/home/myapp/lib) and LD_LIBRARY_PATH is set to that
path. The app runs without any problems there.

Are there any general procedures for deploying a C++ app on a Hadoop
cluster? Ideally I'd like to just copy the libs to HDFS and let the
framework move them to the nodes where the map/reduce tasks run. The
libs should also be removed from the nodes after the tasks have
completed.
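
If the DistributedCache mechanism can be pointed at libs stored in
HDFS, we imagine the job config would look something like the sketch
below (property names are our reading of the DistributedCache docs;
the namenode URI and libfoo.so are just placeholders):

    <!-- myapp-job.xml: sketch of a Pipes job config, assuming the
         DistributedCache handles shipping the cached files to the
         task nodes and managing the local copies -->
    <configuration>
      <property>
        <!-- the Pipes binary itself, previously copied to HDFS -->
        <name>hadoop.pipes.executable</name>
        <value>hdfs://namenode:9000/myapp/bin/myapp</value>
      </property>
      <property>
        <!-- shared libs copied to HDFS; the '#' suffix names the
             symlink created in each task's working directory -->
        <name>mapred.cache.files</name>
        <value>hdfs://namenode:9000/myapp/lib/libfoo.so#libfoo.so</value>
      </property>
      <property>
        <!-- ask the framework to symlink cached files into the task cwd -->
        <name>mapred.create.symlink</name>
        <value>yes</value>
      </property>
    </configuration>

submitted with something like:

    bin/hadoop pipes -conf myapp-job.xml -input input -output output

We're still not sure how to get LD_LIBRARY_PATH on the task nodes to
pick up the symlinked libs (perhaps by pointing it at the task's
working directory).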

Thanks,

Rahul Sood
Advanced Tech Group
Yahoo, Bangalore
