FAQ
I'm getting errors when reading from/writing to pipes that are fairly
large. To work around this, I wanted to redirect output to a file in
the subprocess.Popen call, but couldn't get it to work (even after
setting shell=True). I tried adding ">","temp.sql" after the
password field, but mysqldump gave me an error.

the code:
p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"], shell=True)
p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout)
output = p2.communicate()[0]
file=open('test.sql.gz','w')
file.write(str(output))
file.close()

the output:
gzip: compressed data not written to a terminal. Use -f to force
compression.
For help, type: gzip -h
mysqldump: Got errno 32 on write

I'm using python rather than a shell script for this because I need to
upload the resulting file to a server as soon as it's done.

thanks
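[Editor's note: the ">" argument fails because items in a Popen argument list are passed literally to the program; redirection is the shell's job. The stdout parameter does the redirect instead. A minimal sketch on a POSIX system, with echo as a stand-in for mysqldump and hypothetical filenames:]

```python
import subprocess

# ">" in an argument list would be passed to the program as a literal
# argument; only a shell interprets it as redirection.  To send output
# to a file, hand Popen an open file object via the stdout parameter.
with open("temp.sql", "wb") as out:
    p = subprocess.Popen(["echo", "dump data"], stdout=out)  # stand-in for mysqldump
    p.wait()
```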


  • Gabriel Genellina at Apr 8, 2008 at 1:17 am
En Mon, 07 Apr 2008 20:52:54 -0300, skunkwerk <skunkwerk at gmail.com>
escribió:
    I'm getting errors when reading from/writing to pipes that are fairly
    large in size. To bypass this, I wanted to redirect output to a file
    in the subprocess.Popen function, but couldn't get it to work (even
    after setting Shell=True). I tried adding ">","temp.sql" after the
    password field but mysqldump gave me an error.

    the code:
p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"], shell=True)
    p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout)
    output = p2.communicate()[0]
    file=open('test.sql.gz','w')
    file.write(str(output))
    file.close()
    You need a pipe to chain subprocesses:

    import subprocess
p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"],
                      stdout=subprocess.PIPE)
    ofile = open("test.sql.gz", "wb")
    p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout, stdout=ofile)
    p1.wait()
    p2.wait()
    ofile.close()

    If you don't want the final file on disk:

p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--password=password"],
                      stdout=subprocess.PIPE)
p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
while True:
    chunk = p2.stdout.read(4192)
    if not chunk: break
    # do something with the chunk just read

    p1.wait()
    p2.wait()
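[Editor's note: the "Got errno 32" in the original post is EPIPE, i.e. mysqldump kept writing after its reader disappeared. The subprocess documentation recommends closing the parent's copy of p1.stdout after starting p2, so the first process receives SIGPIPE if the second exits early. A runnable sketch with stand-in commands, seq in place of mysqldump:]

```python
import subprocess

# seq stands in for mysqldump; the pipeline shape is the same.
p1 = subprocess.Popen(["seq", "1", "100"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 get SIGPIPE (errno 32) if p2 exits early
compressed = p2.communicate()[0]
p1.wait()
```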

    --
    Gabriel Genellina
  • Skunkwerk at Apr 10, 2008 at 12:54 am

On Apr 7, 6:17 pm, "Gabriel Genellina" wrote:
[quote of the message above trimmed]
    thanks Gabriel - tried the first one and it worked great!

Discussion Overview
group: python-list @ python.org
category: python
posted: Apr 7, '08 at 11:52p
active: Apr 10, '08 at 12:54a
posts: 3
users: 2

2 users in discussion

Skunkwerk: 2 posts
Gabriel Genellina: 1 post
