On Nov 9, 2005, at 3:24 PM, Vergara, Michael ((TEM)) wrote:

Hello Everyone:

I have a Perl script that connects to multiple Oracle databases. I
have the script write some status information to
a log file. There is one particular database that hangs in
processing, and I don't know why. The log file does not seem
to be written until the file is closed.

I am wondering, is there a Perl command or technique that will
force lines that are logically written to a log file to be
physically written to that file?

Here's my opening code:
# Open Log file
my $logFileName = "${HomeDir}/work/monitor/output/logs/db_status.log";
open( LOGF, ">>" . $logFileName ) or die "\nCannot open log file\n\n";
print LOGF "\n----------\n";
print LOGF sprintf( "db_status started at %02d-%02d-%04d %02d:%02d:%02d\n",
                    $Tm->mon+1, $Tm->mday, $Tm->year+1900,
                    $Tm->hour, $Tm->min, $Tm->sec );

The line that says 'db_status started...' does not get written for
the hanging system because I usually 'kill -9' or 'Ctrl-C'
the process. I'd like to force a write after each section so I can
see where the process is hanging.

Any help will be appreciated.

This is a perl issue, not a DBI issue, but try

autoflush LOGF 1;

after you open LOGF.

This should force a flush after every print statement. If it doesn't
work, I hope it at least points you in the right direction.
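As a minimal sketch of that technique (filename and log line are placeholders): calling autoflush on a filehandle needs IO::Handle loaded; Perls since 5.14 load it on demand when a method is called on a filehandle, but on older Perls you must say `use IO::Handle;` explicitly.

```perl
use strict;
use warnings;
use IO::Handle;   # supplies the autoflush() method for filehandles

# Open the log in append mode, as in the original script.
open( LOGF, '>>', 'db_status.log' ) or die "Cannot open log file: $!\n";

# Flush after every print so each line hits the file immediately,
# even if the process is later killed with kill -9 or Ctrl-C.
LOGF->autoflush(1);

print LOGF "checkpoint reached\n";   # visible in the file right away
close(LOGF);
```

The classic equivalent without IO::Handle is to make LOGF the currently selected filehandle and set `$|`: `my $old = select(LOGF); $| = 1; select($old);`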

Discussion Overview: group dbi-users — posted Nov 9, '05 at 8:24p; active Nov 9, '05 at 9:38p