Limiting a program to a single running instance
I've got a Perl program that is run once an hour as a cron job.
Normally it finishes its task in about ten minutes, but occasionally
it takes more than an hour to complete. When this happens, a second
copy is started, the two copies step on each other's toes, and both
crash.

I'm looking for the second copy to know that another copy is running
and exit after logging a message to that effect. Any method needs to
work properly if the first copy crashes rather than exiting cleanly.

Thanks,
Mark Wagner


  • Chas. Owens at Oct 3, 2007 at 8:42 pm

    On 10/3/07, Mark Wagner wrote:
    I've got a Perl program that is run once an hour as a cron job.
    Normally it finishes its task in about ten minutes, but occasionally
    it takes more than an hour to complete. When this happens, a second
    copy is started, the two copies step on each others' toes, and both
    crash.

    I'm looking for the second copy to know that another copy is running
    and exit after logging a message to that effect. Any method needs to
    work properly if the first copy crashes rather than exiting cleanly.
    snip

    Create a gatekeeper script* that runs the script in question. This
    gatekeeper script should look at the currently running processes
    (either using ps or some module suitable for your environment) and,
    if the script is already running, not start it.

    * Or just add the functionality to your existing script; the benefit
    of having a general gatekeeper script is that if you need to do this
    again you can just pass it different arguments.
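
    A minimal sketch of such a gatekeeper, assuming a Unix-like system
    where `ps -axo command` lists full command lines; the file name
    gatekeeper.pl and the messages are illustrative, not from the thread:

        #!/usr/bin/perl
        # gatekeeper.pl - run a script only if no other copy is running
        use strict;
        use warnings;

        my $script = shift or die "usage: $0 /path/to/script.pl [args]\n";

        # Look for the script being *run* by perl, not merely mentioned,
        # so that an editor session on the file does not count.
        my @running = grep { /\bperl\s+\Q$script\E\b/ } `ps -axo command`;

        if (@running) {
            warn "$script is already running; not starting another copy\n";
            exit 0;
        }

        exec 'perl', $script, @ARGV or die "could not exec $script: $!\n";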
  • Elizabeth Cortell at Oct 3, 2007 at 8:48 pm
    Checking for instances by grepping the output of ps tends to also catch instances of

    vi myscript.pl

    So that innocently editing the file on that box keeps it from running! Not what you have in mind, I think.

    I suggest (untested, but we do something like this in the office):
    Upon startup, check for the existence of a lockfile with agreed-upon name in some tmp directory. If it exists, exit immediately; otherwise create the file. Remove the file at end of execution or upon death of script (in the END block).

    I haven't thought out all the ways that could go wrong. Comments welcome.
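
    A rough sketch of that lock-file idea; the path /tmp/myscript.lock
    and the messages are only placeholders, and note that the
    check-then-create step is not atomic:

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $lockfile  = '/tmp/myscript.lock';   # agreed-upon name
        my $have_lock = 0;

        if (-e $lockfile) {
            warn "another instance appears to be running; exiting\n";
            exit 0;
        }

        open my $fh, '>', $lockfile or die "cannot create $lockfile: $!";
        close $fh;
        $have_lock = 1;

        END {
            # Remove the lock only if this run created it; runs at normal
            # exit and on die(), but not if the process is killed outright.
            unlink $lockfile if $have_lock;
        }

        # ... the real work of the script goes here ...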


    Cheers,

    Liz

    On Wednesday, October 03, 2007, at 01:42PM, "Chas. Owens" wrote:
    On 10/3/07, Mark Wagner wrote:
    I've got a Perl program that is run once an hour as a cron job.
    Normally it finishes its task in about ten minutes, but occasionally
    it takes more than an hour to complete. When this happens, a second
    copy is started, the two copies step on each others' toes, and both
    crash.

    I'm looking for the second copy to know that another copy is running
    and exit after logging a message to that effect. Any method needs to
    work properly if the first copy crashes rather than exiting cleanly.
    snip

    Create a gatekeeper script* that runs the script in question. This
    gatekeeper script should look at the currently running processes
    (either using ps or some module suitable for your environment) and if
    the script is currently running don't execute the script.

    * or just add the functionality to your existing script, the benefit
    of having a general gatekeeper script is that if you need to do this
    again you can just pass different arguments in.



  • Bob McConnell at Oct 3, 2007 at 8:52 pm

    -----Original Message-----
    From: Elizabeth Cortell
    Sent: Wednesday, October 03, 2007 4:48 PM
    To: beginners@perl.org
    Subject: Re: Limiting a program to a single running instance

    Checking for instances by grepping the output of ps tends to
    also catch instances of

    vi myscript.pl

    So that innocently editing the file on that box keeps it from
    running! Not what you have in mind, I think.

    I suggest (untested, but we do something like this in the office):
    Upon startup, check for the existence of a lockfile with
    agreed-upon name in some tmp directory. If it exists, exit
    immediately; otherwise create the file. Remove the file at
    end of execution or upon death of script (in the END block).

    I haven't thought out all the ways that could go wrong.
    Comments welcome.


    Cheers,

    Liz

    If the process crashes, it won't always remove the file. Put the
    process ID in the file; then the next run can check whether that
    process is still alive and running the same application. If not,
    remove the file and continue.

    Bob McConnell
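
    A sketch of that refinement, assuming a Unix-ish system where
    `kill 0, $pid` can probe whether a process still exists; the path
    /tmp/myscript.pid is a placeholder, and a stricter version would
    also confirm (e.g. via ps) that the pid still belongs to this script:

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $pidfile   = '/tmp/myscript.pid';
        my $wrote_pid = 0;

        if (open my $in, '<', $pidfile) {
            my $oldpid = <$in>;
            close $in;
            chomp $oldpid if defined $oldpid;
            # Signal 0 delivers nothing; it only reports whether the
            # process exists (and we are allowed to signal it).
            if (defined $oldpid && $oldpid =~ /^\d+$/ && kill(0, $oldpid)) {
                warn "instance $oldpid is still running; exiting\n";
                exit 0;
            }
            unlink $pidfile;    # stale file left by a crashed run
        }

        open my $out, '>', $pidfile or die "cannot write $pidfile: $!";
        print {$out} "$$\n";
        close $out;
        $wrote_pid = 1;

        END { unlink $pidfile if $wrote_pid }

        # ... actual work ...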
  • Mark Wagner at Oct 3, 2007 at 10:45 pm

    On 10/3/07, Bob McConnell wrote:
    -----Original Message-----
    From: Elizabeth Cortell
    Sent: Wednesday, October 03, 2007 4:48 PM
    To: beginners@perl.org
    Subject: Re: Limiting a program to a single running instance

    Checking for instances by grepping the output of ps tends to
    also catch instances of

    vi myscript.pl

    So that innocently editing the file on that box keeps it from
    running! Not what you have in mind, I think.

    I suggest (untested, but we do something like this in the office):
    Upon startup, check for the existence of a lockfile with
    agreed-upon name in some tmp directory. If it exists, exit
    immediately; otherwise create the file. Remove the file at
    end of execution or upon death of script (in the END block).

    I haven't thought out all the ways that could go wrong.
    Comments welcome.
    If the process crashes, it won't always remove the file. Put the Process
    ID in the file, then the next run can check to see if that process is
    still alive and running the same application. If not, remove the file
    and continue.
    Thanks. I've set this up, and it seems to be working.

    --
    Mark Wagner
  • Mark Wagner at Oct 3, 2007 at 8:57 pm

    On 10/3/07, Elizabeth Cortell wrote:

    I suggest (untested, but we do something like this in the office):
    Upon startup, check for the existence of a lockfile with agreed-upon name in some tmp directory. If it exists, exit immediately; otherwise create the file. Remove the file at end of execution or upon death of script (in the END block).

    I haven't thought out all the ways that could go wrong. Comments welcome.
    Will an END block get executed even if the script exits through an
    error such as calling an undefined function?

    --
    Mark Wagner
  • Chas. Owens at Oct 3, 2007 at 9:08 pm

    On 10/3/07, Mark Wagner wrote:
    On 10/3/07, Elizabeth Cortell wrote:

    I suggest (untested, but we do something like this in the office):
    Upon startup, check for the existence of a lockfile with agreed-upon name in some tmp directory. If it exists, exit immediately; otherwise create the file. Remove the file at end of execution or upon death of script (in the END block).

    I haven't thought out all the ways that could go wrong. Comments welcome.
    Will an END block get executed even if the script exits through an
    error such as calling an undefined function?
    snip

    Yes, Perl always tries to run END blocks, but there are some errors
    under which they cannot run (exhausted memory, for instance). END
    also cannot run if the machine dies (for obvious reasons), so that is
    one corner case you might need to worry about. In general, inspecting
    the process list, or using a file holding the pid of the current
    process (and, if the file exists, checking whether that pid still
    belongs to the script in question), is the right solution.
  • Jeff Pang at Oct 4, 2007 at 3:41 am
    2007/10/4, Mark Wagner <carnildo@gmail.com>:
    Will an END block get executed even if the script exits through an
    error such as calling an undefined function?
    If the script dies on its own, END is executed.
    Signals like SIGINT and SIGTERM can also let the script die with END
    executed, but some won't: with SIGKILL, e.g. `kill -9 pid` from the
    terminal, END won't run at all.
    This is because signals like SIGINT and SIGTERM can be intercepted,
    so the script has a chance to exit gracefully and END can be
    executed. SIGKILL can't be intercepted; when that signal is sent,
    the script is forced to die immediately.
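
    A small illustration of that difference: installing handlers for the
    catchable signals turns them into an ordinary die, so END still runs;
    nothing can be done about SIGKILL. The handler body and messages here
    are just examples:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Catchable termination signals become a normal die, so END blocks
        # (and any lock-file cleanup in them) still run. SIGKILL cannot be
        # caught, so `kill -9` still bypasses all of this.
        $SIG{$_} = sub { die "caught SIG$_[0], shutting down\n" }
            for qw(INT TERM HUP);

        END { warn "END block ran, cleaning up\n" }

        sleep 60;    # stand-in for the script's real work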
  • Chas. Owens at Oct 3, 2007 at 8:58 pm

    On 10/3/07, Elizabeth Cortell wrote:
    Checking for instances by grepping the output of ps tends to also catch instances of

    vi myscript.pl

    So that innocently editing the file on that box keeps it from running! Not what you have in mind, I think.
    snip

    Which is why you don't grep for just the script name. You look for
    something like "^/usr/bin/perl myscript.pl$" in the output of your
    system's equivalent of "ps -axo command".
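
    In Perl that check might look something like the following; the
    interpreter path and script name are whatever applies on the machine
    in question:

        # Count processes whose *entire* command line is the interpreter
        # plus the script, so "vi myscript.pl" is not matched.
        my $running = grep { m{^/usr/bin/perl myscript\.pl$} }
                      `ps -axo command`;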
  • Ken Foskey at Oct 3, 2007 at 10:50 pm

    On Wed, 2007-10-03 at 13:34 -0700, Mark Wagner wrote:
    I've got a Perl program that is run once an hour as a cron job.
    Normally it finishes its task in about ten minutes, but occasionally
    it takes more than an hour to complete. When this happens, a second
    copy is started, the two copies step on each others' toes, and both
    crash.

    I'm looking for the second copy to know that another copy is running
    and exit after logging a message to that effect. Any method needs to
    work properly if the first copy crashes rather than exiting cleanly.
    http://search.cpan.org/~elizabeth/Sys-RunAlone-0.07/lib/Sys/RunAlone.pm

    Another design for this sort of thing is a mutex. There are a few
    modules for this. This means you can lock out smaller parts of code and
    keep two copies running concurrently doing different work.

    --
    Ken Foskey
    FOSS developer
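
    A core-Perl way to get that sort of mutex is flock with a
    non-blocking exclusive lock on an agreed file, held only around the
    critical section; the file name and the do_the_shared_work() routine
    below are made up for the example:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(:flock);

        sub do_the_shared_work { print "doing the shared work\n" }  # stand-in

        # Lock file guarding just one critical section; the rest of the
        # script can run in parallel with another copy.
        open my $lock, '>', '/tmp/myscript.section.lock'
            or die "cannot open lock file: $!";

        if (flock $lock, LOCK_EX | LOCK_NB) {
            do_the_shared_work();
            flock $lock, LOCK_UN;
        }
        else {
            warn "another instance holds the lock; skipping that section\n";
        }

        close $lock;   # any lock still held is released on close or exit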

Discussion Overview
group: beginners @ perl.org
category: perl
posted: Oct 3, 2007 at 8:34 pm
active: Oct 4, 2007 at 3:41 am
posts: 10
users: 6
