FAQ
We are running Hadoop for multiple users, and the DFS uses a shared
directory for data as noted in FAQ #13. Is there a way to have Hadoop
use a different classpath per job?

Currently, if I start up the Hadoop instance with no script modifications
and then run a job with "bin/hadoop <classname> <params>", I get a
ClassNotFoundException. Setting the HADOOP_CLASSPATH variable to include
my user's classes and libraries doesn't seem to work. The only way I can
get it to work is to shut down Hadoop and start it up again with my
user's HADOOP_CLASSPATH set.
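Concretely, what I'm attempting looks like this (the paths, jar name, and class name below are illustrative, not my actual setup):

```shell
# Illustrative sketch: set the classpath in the client shell before
# submitting, so the job-submitting JVM can find the user's classes.
# /home/xavier/... and com.example.MyJob are placeholder names.
export HADOOP_CLASSPATH=/home/xavier/classes:/home/xavier/lib/myjob.jar

# Then submit from the same shell (commented out here; requires a
# running Hadoop installation):
# bin/hadoop com.example.MyJob <params>
```

My understanding is that HADOOP_CLASSPATH only affects the client-side JVM; classes the tasks themselves need would normally be packaged inside the job jar (e.g. in its lib/ directory) rather than picked up from the daemons' classpath, but I may be missing something.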

If this has been asked and answered before, feel free to just point me
to the previous thread.

Thanks,

-Xavier

Discussion Overview
group: common-user
category: hadoop
posted: Oct 13, '07 at 12:25a
active: Oct 13, '07 at 12:25a
posts: 1
users: 1
website: hadoop.apache.org...
irc: #hadoop

1 user in discussion

Xavier Stevens: 1 post
