Hi,

Is it possible to import data from a CSV file into multiple tables
in a PostgreSQL database? Suppose I have two tables, students and
departments, and my CSV file contains data for both tables. Can I
import this data into the two tables?

Is there a Perl script available to do this?

Help me out with this!

thanks,
Deepblues


  • Andrew Hammond at Feb 28, 2005 at 2:28 am
    The brief answer is no: you cannot import from a single CSV file into
    multiple tables.

    If the CSV file consists of two distinct sections of data, then you
    could of course split it into two CSV files. If what you want to do is
    normalize existing data, then you should first import the existing data
    into a working table; you can then manipulate it within the database.
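    For example, here is a minimal sketch of that approach using the
    students and departments tables from the original question (the
    working-table layout and all column names are assumptions, and the
    CSV mode of COPY requires 8.0):

        -- Load the raw rows into a single working table first.
        CREATE TABLE import_raw (
            student_name text,
            dept_name    text
        );
        \copy import_raw FROM 'data.csv' WITH CSV

        -- Then normalize inside the database.
        INSERT INTO departments (name)
            SELECT DISTINCT dept_name FROM import_raw;

        INSERT INTO students (name, dept_id)
            SELECT r.student_name, d.id
              FROM import_raw r
              JOIN departments d ON d.name = r.dept_name;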

    It is unlikely that you will need Perl to do any of this.

    Please post a sample of the data and the schema into which you plan to
    import your data.

    Drew


  • Sean Davis at Feb 28, 2005 at 11:31 am
    Andrew Hammond wrote:

    It is unlikely that you will need Perl to do any of this.
    I use Perl a lot for stuff like this, but I have found that in most
    cases the easiest thing to do is to load the data into a single
    PostgreSQL table and then write SQL to do the selects and inserts
    that populate the multiple tables. This has the added advantage that
    you keep a copy of the original data available in case you don't put
    every column into the "working" database. If you end up doing this a
    lot, you can create a separate "loader" schema that holds all of
    these raw CSV tables in one place, not visible to most users, so as
    not to clutter the "working" schema.
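    A minimal sketch of that arrangement (the schema and table names
    here are hypothetical):

        -- Keep the raw CSV tables together, out of sight of ordinary users.
        CREATE SCHEMA loader;
        REVOKE ALL ON SCHEMA loader FROM PUBLIC;

        CREATE TABLE loader.students_csv (
            student_name text,
            dept_name    text
        );
        \copy loader.students_csv FROM 'students.csv' WITH CSV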

    Sean
  • Keith Worthington at Mar 1, 2005 at 5:04 pm

    I do nearly exactly as Sean has suggested all the time.

    We have a schema called data_transfer that contains all of the
    tables for accepting data. (As an aside, the data is written into
    these tables by a shell script using the COPY command.) Each of the
    tables has a trigger that fires and executes a function. The
    function is responsible for taking the data out of the data_transfer
    table and putting it into one, two, or three target tables. In
    addition to the advantage noted by Sean, you can also manipulate the
    data during this process, check for duplicates, switch dynamically
    between INSERT and UPDATE commands, and so on.
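    A minimal sketch of the pattern (all schema, table, column, and
    function names here are hypothetical; the dollar quoting and syntax
    shown assume 8.0):

        -- Interface table that receives raw rows via COPY.
        CREATE TABLE data_transfer.students_in (
            student_name text,
            dept_name    text
        );

        -- Move each incoming row into the target tables.
        CREATE FUNCTION data_transfer.route_student() RETURNS trigger AS $$
        BEGIN
            -- Check for duplicates before creating the department.
            IF NOT EXISTS (SELECT 1 FROM departments d
                            WHERE d.name = NEW.dept_name) THEN
                INSERT INTO departments (name) VALUES (NEW.dept_name);
            END IF;

            -- Attach the student to its department.
            INSERT INTO students (name, dept_id)
                SELECT NEW.student_name, d.id
                  FROM departments d
                 WHERE d.name = NEW.dept_name;

            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        CREATE TRIGGER students_in_route
            AFTER INSERT ON data_transfer.students_in
            FOR EACH ROW EXECUTE PROCEDURE data_transfer.route_student();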

    --
    Kind Regards,
    Keith
  • Deepblues at Mar 3, 2005 at 8:11 pm
    Thanks for all that information. My scenario looks like this: I need
    to import an Excel spreadsheet into the PostgreSQL database. I
    converted the Excel spreadsheet into a CSV file, and now I have
    three tables in my database into which I need to import this data.
    I'm new to both Perl and Postgres. Do you have a sample script I can
    look at that loads the CSV file into an interface table?

    Deepblues


    --
    Deepti Mandava
    Graduate Teaching Assistant,
    The University of Kansas
    Lawrence, KS 66045-7546
  • Sean Davis at Mar 3, 2005 at 8:48 pm

    On Mar 3, 2005, at 3:10 PM, Deepblues wrote:

    Do you have a sample script I can look at that loads the CSV file
    into an interface table?
    Just an example....

    It will pay to read the DBI manual if you are going to use Perl like
    this. You can probably just use the psql \copy command, though.
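    For instance, from the psql prompt (the file and table names here
    are hypothetical; CSV mode needs 8.0):

        \copy interface_table FROM 'yourdatafile.csv' WITH CSV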


    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connect to the database (fill in your own dbname/user/password).
    my $dbh = DBI->connect('dbi:Pg:dbname=qiagen2;host=localhost',
                           'username', 'password',
                           {AutoCommit => 1})
        or die $DBI::errstr;

    open(IN, '<', 'yourdatafile.txt') or die "cannot open data file: $!";

    # Create the interface table that will receive the raw rows.
    my $sql = qq{CREATE TABLE g_rif (
        g_rif_id serial primary key,
        gene_id  integer,
        rif      text not null,
        pmid     integer not null
    )};
    $dbh->do($sql);

    # Stream the file into the table with COPY ... FROM STDIN.
    $sql = qq{COPY g_rif (gene_id, rif, pmid) FROM STDIN};
    my $sth = $dbh->prepare($sql);
    $sth->execute() || die $sth->errstr;
    while (<IN>) {
        chomp;
        my @params = split /,/;
        # The next line does the insertion, with the columns
        # rearranged for convenience.
        $dbh->func(join("\t", $params[1], $params[4], $params[2]) . "\n",
                   'putline');
    }
    # putline/endcopy are the COPY interface in DBD::Pg of this era.
    $dbh->func('endcopy');
    close IN;


    Sean
  • Bruno Wolff III at Mar 4, 2005 at 2:56 pm

    If you are using 8.0, you should be able to use COPY (or \copy from
    psql) to read in the files, with one COPY command per file. If there
    are no quoted values in the CSV files, then you can do this on older
    versions of Postgres as well.
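    A short sketch of both cases (the table and file names here are
    hypothetical; server-side COPY reads the file as the server user):

        -- 8.0: CSV mode copes with quoted values.
        COPY interface_table FROM '/path/to/data.csv' WITH CSV;

        -- Pre-8.0 (7.4 syntax): plain COPY, workable only when no
        -- values are quoted.
        COPY interface_table FROM '/path/to/data.txt' WITH DELIMITER ',';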
  • Todd Lewis at Mar 1, 2005 at 7:24 am
    Standard method is to copy into an interface table. Once the data is
    loaded into the interface you can them import it into the tables that
    you want.

    If this is a process that is run several times, then perl could be
    used to script your import into an interface table and then
    subsequently moving the data to the tables you want.


Discussion Overview
group: pgsql-novice
categories: postgresql
posted: Feb 27, '05 at 7:15p
active: Mar 4, '05 at 2:56p
posts: 8
users: 6
website: postgresql.org
irc: #postgresql
