Hi,

We're trying to deploy a C++ MapReduce app developed with the Hadoop
Pipes API on a large cluster. The app fails to start because its
shared libs are not present on the cluster nodes. For local testing we
set up a 2-node cluster on our dev boxes, where all the libs are in a
standard location (/home/myapp/lib) and LD_LIBRARY_PATH is set to this
path. The app runs without any problems there.

Are there any general procedures for deploying a C++ app on a Hadoop
cluster? Ideally I'd like to just copy the libs to HDFS and let the
framework move them to the nodes where the map/reduce tasks are being
run. The libs should also be removed from the nodes after the tasks have
completed.
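
To make it concrete, the mechanism we have in mind is something like
the DistributedCache. A minimal sketch of the job-submission side,
assuming the libs are packaged as a tar.gz already uploaded to HDFS
(the paths, archive name, and class name below are made up):

    import java.net.URI;

    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapred.JobConf;

    public class SubmitWithLibs {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf();
            // Ship the lib archive (already on HDFS) to every task node.
            // The framework unpacks it locally; the "#libs" fragment plus
            // createSymlink() gives each task a symlink named "libs" in
            // its working directory.
            DistributedCache.addCacheArchive(
                new URI("/user/rsood/myapp/libs.tgz#libs"), conf);
            DistributedCache.createSymlink(conf);
            // ... set input/output and submit the Pipes job, e.g. via
            // org.apache.hadoop.mapred.pipes.Submitter
        }
    }

As far as I understand, though, the cache evicts old files once it
grows past a size limit rather than deleting them right after each job,
so the cleanup wouldn't be immediate.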

Thanks,

Rahul Sood
Advanced Tech Group
Yahoo, Bangalore


  • Arun C Murthy at Feb 21, 2008 at 5:17 am

    On Feb 20, 2008, at 5:22 AM, Rahul Sood wrote:
    Are there any general procedures for deploying a C++ app on a Hadoop
    cluster? Ideally I'd like to just copy the libs to HDFS and let the
    framework move them to the nodes where the map/reduce tasks are being
    run. The libs should also be removed from the nodes after the tasks
    have completed.

    Unfortunately, no. If you were writing Java apps you could use
    System.load or System.loadLibrary...
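
    For example, in a Java task it would be something like this (the
    library name and path here are hypothetical):

        public class NativeLoader {
            static {
                // Load the native lib by absolute path...
                System.load("/home/myapp/lib/libmyapp.so");
                // ...or by name, resolved against java.library.path:
                // System.loadLibrary("myapp");
            }
        }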

    I've opened https://issues.apache.org/jira/browse/HADOOP-2867 to make
    this enhancement.

    Arun
