Tarandeep Singh |
at Oct 6, 2008 at 9:56 pm
Thanks, Mahadev, for the reply.
So that means I have to copy my jar file into the $HADOOP_HOME/lib folder on
all slave machines, as before.
One more question: I am adding a conf file (just like hadoop-site.xml) via
the -conf option, and I am able to read those parameters in my mappers/reducers.
But is there a way to read the parameters in my job driver class?
public class jobDriver extends Configured
{
    someMethod( )
    {
        ToolRunner.run( new MyJob( ), commandLineArgs );
        // I want to query parameters present in my conf file here
    }
}

public class MyJob extends Configured implements Tool
{
}
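[A common pattern for this (a sketch assuming the Tool/ToolRunner API of that era; the class name MyJob matches the snippet above, but the parameter name "my.param" is illustrative, not from the thread) is to read the configuration inside the Tool's run() method, where ToolRunner has already merged any -conf files and generic options via GenericOptionsParser before run() is invoked:]

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Sketch only: "my.param" is a placeholder for a key in the -conf file.
public class MyJob extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // ToolRunner/GenericOptionsParser has already applied -conf
        // (and -libjars) to this Configuration before run() is called.
        Configuration conf = getConf();
        String value = conf.get("my.param"); // parameter from the conf file
        System.out.println("my.param = " + value);
        // ... set up and submit the job here ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MyJob(), args);
        System.exit(exitCode);
    }
}
```

[If the driver logic must stay in a separate class, the same getConf() call works there too, as long as the Configuration object that ToolRunner populated is passed along rather than constructing a fresh one.]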
Thanks,
Taran
On Mon, Oct 6, 2008 at 2:46 PM, Mahadev Konar wrote:
Hi Tarandeep,
the -libjars option does not add the jar on the client side. There is an
open JIRA for that (I don't remember which one)...
You have to add the jar to HADOOP_CLASSPATH on the client side so that it
gets picked up on the client side as well.
mahadev
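[The client-side fix described above can be sketched as follows (a sketch only; the jar path is illustrative, and $HADOOP_HOME and $MY_CONF_FILE are assumed to be set as in the original command):]

```shell
# Make jdom.jar visible to the client JVM that launches the job
# (the path below is illustrative).
export HADOOP_CLASSPATH=/path/to/jdom.jar:$HADOOP_CLASSPATH

# -libjars still ships the jar to the task nodes for the mappers/reducers;
# HADOOP_CLASSPATH covers the client side that -libjars does not.
$HADOOP_HOME/bin/hadoop jar myApp.jar -conf $MY_CONF_FILE -libjars jdom.jar
```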
On 10/6/08 2:30 PM, "Tarandeep Singh" wrote:
Hi,
I want to add a jar file (that is required by mappers and reducers) to the
classpath. Initially I had copied the jar file to all the slave nodes in the
$HADOOP_HOME/lib directory and it was working fine.
However, when I tried the -libjars option to add jar files:
$HADOOP_HOME/bin/hadoop jar myApp.jar -conf $MY_CONF_FILE -libjars jdom.jar
I got this error:
java.lang.NoClassDefFoundError: org/jdom/input/SAXBuilder
Can someone please tell me what needs to be fixed here?
Thanks,
Taran