FAQ
I am writing an app where I have to load the contents of a file into a database.

The file looks like this,

epoch, name, orderid, quantity


There are about 100k rows like this. I am curious: is it best to
process each line one at a time, or to build a mass INSERT of all
100k rows and then execute it?


Another question,

Since the data is going to be large, are there any tricks or tips
for handling time series data in a database? Is PostgreSQL the right
tool, or is there something else out there?

TIA


  • Thom Brown at May 18, 2010 at 10:58 am

    On 18 May 2010 11:51, Mag Gam wrote:
    I am writing an app where I have to load the contents of a file into a database.

    The file looks like this,

    epoch, name, orderid, quantity


    There are about 100k rows like this. I am curious: is it best to
    process each line one at a time, or to build a mass INSERT of all
    100k rows and then execute it?
    Use COPY: http://www.postgresql.org/docs/current/static/sql-copy.html
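    For example, a minimal sketch (the table and file names are
    hypothetical; CSV HEADER tells COPY to expect comma-separated
    values with a header line):

        COPY orders (epoch, name, orderid, quantity)
        FROM '/tmp/orders.csv' CSV HEADER;

    Note that COPY FROM reads a file on the database server; from
    psql, the client-side \copy command takes the same options.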

    You may wish to import it into a temporary table first before
    converting the date into a usable format.
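    For example (all names here are illustrative), stage the raw epoch
    as a bigint, then convert it with to_timestamp() on the way into
    the final table:

        CREATE TEMP TABLE orders_staging (
            epoch    bigint,
            name     text,
            orderid  integer,
            quantity integer
        );

        -- after COPYing the file into orders_staging:
        INSERT INTO orders (order_time, name, orderid, quantity)
        SELECT to_timestamp(epoch), name, orderid, quantity
        FROM orders_staging;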
    Another question,

    Since the data is going to be large, are there any tricks or tips
    for handling time series data in a database? Is PostgreSQL the
    right tool, or is there something else out there?
    That depends on what you're doing with the data. If you're querying
    date ranges, you'll need an index on the date column.
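    For example, assuming the converted timestamp column is called
    order_time (a hypothetical name), a plain btree index supports
    range queries:

        CREATE INDEX orders_order_time_idx ON orders (order_time);

        -- a typical range query that can use the index:
        SELECT * FROM orders
        WHERE order_time >= '2010-05-01'
          AND order_time <  '2010-06-01';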

    Regards

    Thom
