FAQ
Hi.

We run hadoop-0.18.3 and it seems that the jobcache does not get cleaned out
properly.

Would this cron script do any harm to Hadoop?

# Clean all files which are two or more days old
/usr/bin/find ${JOB_CACHE_PATH} -type f -mtime +2 -exec rm {} \;

We need to start cleaning today, so I'm hoping for a quick response.
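For reference, a slightly fuller version of this kind of cron cleanup might also prune the empty job directories that the plain `rm` one-liner leaves behind. This is only a sketch, not an official Hadoop tool; the helper name `clean_jobcache` and the example paths are assumptions for illustration.

```shell
#!/bin/sh
# Sketch of a jobcache cleanup cron job -- not an official Hadoop tool.
# The function name and paths below are illustrative assumptions.

# clean_jobcache DIR DAYS: delete files under DIR not modified in the
# last DAYS days, then remove the now-empty job directories that the
# plain `rm` one-liner would leave behind.
clean_jobcache() {
    dir="$1"; days="$2"
    /usr/bin/find "$dir" -type f -mtime +"$days" -exec rm -f {} \;
    /usr/bin/find "$dir" -mindepth 1 -depth -type d -empty -exec rmdir {} \;
}

# Example invocation; JOB_CACHE_PATH should point at the tasktracker's
# local jobcache directory (its exact location depends on mapred.local.dir).
[ -d "${JOB_CACHE_PATH:-}" ] && clean_jobcache "$JOB_CACHE_PATH" 2 || true
```

A nightly crontab entry for a script like this (path illustrative) could be `0 3 * * * /usr/local/bin/clean_jobcache.sh`. As the replies in this thread note, the age cutoff should stay longer than your longest-running job.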

Cheers

//Marcus Herou


--
Marcus Herou CTO and co-founder Tailsweep AB
+46702561312
marcus.herou@tailsweep.com
http://www.tailsweep.com/

  • Vibhooti Verma at Feb 10, 2010 at 9:43 am
We faced the same issue, and we also use cron to delete the older
entries.
Be careful that the mtime cutoff for deletion is never shorter than the
longest job you will ever run.

    --
    cheers,
    Vibhooti
  • Allen Wittenauer at Feb 10, 2010 at 7:20 pm

    We do something similar, but wait until files are 7 days old in case we
    have a particularly long-running job.
  • Marcus Herou at Mar 9, 2010 at 4:51 pm
    OK. I needed to shorten it to just 1 day. Scary, I know, but the cache grows
    like crazy.

    /M

    --
    Marcus Herou CTO and co-founder Tailsweep AB
    +46702561312
    marcus.herou@tailsweep.com
    http://www.tailsweep.com/

Discussion Overview
group: common-user
categories: hadoop
posted: Feb 10, '10 at 8:16a
active: Mar 9, '10 at 4:51p
posts: 4
users: 3
website: hadoop.apache.org...
irc: #hadoop
