Hello,
While trying to start the TaskTracker, I get the following error in
the logs (see below).
I'm guessing it's trying to clean up an aborted job (a badly coded
one) and there are too many files to clean up.
Does anyone know which directory it's looking into, so that I can
clean it up manually?
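From the trace it looks like TaskTracker.initialize() calls
JobConf.deleteLocalFiles(), which recursively deletes job files under
the node's local directories, and the recursion in
FileUtil.fullyDelete is what runs out of memory. I suspect those are
the directories listed in mapred.local.dir (default
${hadoop.tmp.dir}/mapred/local), but I'd appreciate confirmation.
For reference, here is a minimal sketch of what I assume fullyDelete
does (my own reconstruction, not the actual Hadoop source):

import java.io.File;

public class FullyDeleteSketch {
    // Recursive delete in the spirit of FileUtil.fullyDelete: one
    // java.io.File (and therefore one resolved path String) is built
    // per directory entry, which matches the allocation site
    // (UnixFileSystem.resolve) in the trace above.
    public static boolean fullyDelete(File dir) {
        File[] contents = dir.listFiles();
        if (contents != null) {
            for (File f : contents) {
                if (f.isFile()) {
                    if (!f.delete()) {
                        return false;
                    }
                } else if (!fullyDelete(f)) { // recurse into subdirectory
                    return false;
                }
            }
        }
        return dir.delete(); // remove the now-empty directory itself
    }
}

If that reading is right, a directory tree with a huge number of
entries would allocate one path string per entry during the walk,
which would explain the "GC overhead limit exceeded" error.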
Regards
S
==Error==
2009-11-30 11:39:47,989 ERROR org.apache.hadoop.mapred.TaskTracker:
Can not start task tracker because java.lang.OutOfMemoryError: GC
overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:572)
at java.lang.StringBuilder.append(StringBuilder.java:203)
at java.io.UnixFileSystem.resolve(UnixFileSystem.java:93)
at java.io.File.&lt;init&gt;(File.java:1056)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:73)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:91)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:91)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:91)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:91)
at org.apache.hadoop.fs.FileUtil.fullyDelete(FileUtil.java:91)
at org.apache.hadoop.fs.RawLocalFileSystem.delete(RawLocalFileSystem.java:269)
at org.apache.hadoop.fs.ChecksumFileSystem.delete(ChecksumFileSystem.java:438)
at org.apache.hadoop.fs.FilterFileSystem.delete(FilterFileSystem.java:143)
at org.apache.hadoop.mapred.JobConf.deleteLocalFiles(JobConf.java:270)
at org.apache.hadoop.mapred.TaskTracker.initialize(TaskTracker.java:441)
at org.apache.hadoop.mapred.TaskTracker.&lt;init&gt;(TaskTracker.java:2833)