FAQ
After encountering the ClassNotFoundException, I put all of my project's
dependent jars in CDH-4.5.0-1.cdh4.5.0.p0.30/lib/hadoop-0.20-mapreduce/lib
and restarted the cluster. I also modified the previous code as below:
FileSystem fs = FileSystem.get(conf);
String sharedConfPath = "/user/hdfs/conf";
DistributedCache.addFileToClassPath(new Path(sharedConfPath), conf, fs);

i.e. adding the project's config-file directory to the classpath.

But now I am getting a FileNotFoundException.
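A FileNotFoundException at this point usually means one of the HDFS paths being added to the classpath does not actually exist (for example, the jars were copied to the local filesystem but never uploaded to HDFS). A small guard like the following sketch, assuming the same `conf` and path variables as above, checks each path with `FileSystem.exists` before adding it, so a missing path is reported up front instead of failing later at job-submit time:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ClasspathSetup {
    // Adds an HDFS path to the job classpath only if it exists,
    // logging a warning otherwise. The class and method names here
    // are hypothetical; only the DistributedCache/FileSystem calls
    // come from the original code.
    static void addIfPresent(FileSystem fs, Configuration conf, String pathStr)
            throws IOException {
        Path p = new Path(pathStr);
        if (fs.exists(p)) {
            DistributedCache.addFileToClassPath(p, conf, fs);
        } else {
            System.err.println("Path missing in HDFS, skipping: " + pathStr);
        }
    }
}
```

Running `hadoop fs -ls /user/hdfs/conf` and `hadoop fs -ls /user/hdfs/lib` from the command line is an equally quick way to confirm the paths exist and are readable by the user submitting the job.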

On Friday, December 27, 2013 10:52:20 AM UTC+5:30, partn...@nrifintech.com
wrote:
I have written the code below, which was working earlier. But it stopped
working, and I am now receiving a ClassNotFoundException, etc. Could it be a
memory problem? Earlier we had only HDFS and MapReduce1 installed in the
cluster; then we installed Solr, ZooKeeper, and Flume.

FileSystem fs = FileSystem.get(conf);

// Add the shared config directory to the job classpath
String sharedConfPath = "/user/hdfs/conf";
DistributedCache.addFileToClassPath(new Path(sharedConfPath), conf, fs);

// Add every jar under the shared lib directory to the job classpath
String sharedLibPath = "/user/hdfs/lib";
Path path = new Path(sharedLibPath);
FileStatus[] fileStatuses = fs.listStatus(path);
for (FileStatus fileStatus : fileStatuses) {
    String jarPath = sharedLibPath + "/" + fileStatus.getPath().getName();
    DistributedCache.addFileToClassPath(new Path(jarPath), conf, fs);
}

Please help me resolve this problem.

Discussion Overview
group: scm-users
categories: hadoop
posted: Dec 27, '13 at 5:22a
active: Dec 27, '13 at 5:39p
posts: 3
users: 3
website: cloudera.com
irc: #hadoop
