RunJar fails executing thousands of JARs within a single JVM with error "Too many open files"

Key: HADOOP-6820
URL: https://issues.apache.org/jira/browse/HADOOP-6820
Project: Hadoop Common
Issue Type: Bug
Components: util
Affects Versions: 0.20.2
Environment: OS: Linux; user limited to a maximum number of open file descriptors (for example, ulimit -n shows 1024)
Reporter: Alexander Bondar
Priority: Minor

According to Sun JVM bug 4167874 (present up to Java 7, http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4167874), the JarFile objects created by sun.net.www.protocol.jar.JarFileFactory never get garbage collected, even if the classloader that loaded them goes away.
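The caching behind that bug can be observed directly: repeated connections to the same jar: URL hand back one shared JarFile instance, which JarFileFactory keeps open on its own. A minimal sketch (the temporary JAR and the entry name are illustrative, not from RunJar):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.net.JarURLConnection;
import java.net.URL;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;

public class JarCacheProbe {
    // Build a throwaway JAR with a single entry (illustrative setup).
    static File buildJar() throws Exception {
        File jar = File.createTempFile("probe", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry("hello.txt"));
            out.write("hello".getBytes("UTF-8"));
            out.closeEntry();
        }
        return jar;
    }

    // Returns true when two connections to the same jar: URL share one
    // cached JarFile -- the instance that JarFileFactory never closes.
    static boolean sharesCachedJarFile(File jar) throws Exception {
        URL url = new URL("jar:" + jar.toURI().toURL() + "!/hello.txt");
        JarURLConnection c1 = (JarURLConnection) url.openConnection();
        JarURLConnection c2 = (JarURLConnection) url.openConnection();
        return c1.getJarFile() == c2.getJarFile();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sharesCachedJarFile(buildJar())); // prints "true"
    }
}
```

Because the cached JarFile holds an open file descriptor, every distinct JAR touched this way costs one descriptor for the life of the JVM.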

So, if a Linux user is limited in the maximum number of open file descriptors (for example, ulimit -n shows 1024) and runs RunJar.main(...) over thousands of JARs that include other nested JARs (also loaded by a ClassLoader) within a single JVM, RunJar.main(...) throws the following exception: java.lang.RuntimeException: java.io.FileNotFoundException: /some-file.txt (Too many open files)
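One common mitigation (a sketch of the general technique, not the RunJar fix itself) is to disable jar: URL caching on the connection before reading, so the JarFile's descriptor is released when the stream is closed. The temporary JAR and entry name below are illustrative:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;

public class JarNoCacheDemo {
    // Build a throwaway JAR with a single entry (illustrative setup).
    static File buildJar() throws Exception {
        File jar = File.createTempFile("demo", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry("hello.txt"));
            out.write("hello".getBytes("UTF-8"));
            out.closeEntry();
        }
        return jar;
    }

    // Read an entry through a jar: URL with caching disabled, so the
    // underlying JarFile is not retained by JarFileFactory.
    static String readEntry(File jar) throws Exception {
        URL url = new URL("jar:" + jar.toURI().toURL() + "!/hello.txt");
        URLConnection conn = url.openConnection();
        conn.setUseCaches(false); // bypass JarFileFactory's cache
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[16];
            int n = in.read(buf);
            return new String(buf, 0, n, "UTF-8");
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readEntry(buildJar())); // prints "hello"
    }
}
```

On Java 7 and later, URLClassLoader.close() also releases the JarFiles that the loader itself opened, though it does not evict entries other code has already placed in the jar: URL cache.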

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.

Discussion Overview
group: common-dev
posted: Jun 11, '10 at 12:30p
1 post by Alexander Bondar (JIRA)
