On 5 August 2013 12:38, James David Smith wrote:

Bit of an abstract question, I appreciate, but I thought I'd see what
people think. I have an anonymised dataset of the travel behaviour of
some people in a major city (I'd rather not go into details, if that's
ok). What I intend to do is work out where each of them is for every
minute of the day. So ~80,000 people x 1,440 minutes = 115,200,000 rows
of data! A few questions:

1) Is PostgreSQL going to be able to cope with this, in terms of table
size? I think so...

2) My columns will be something like
person_id integer,
person_timestamp timestamp,
person_location_geom geometry
Any thoughts on those, or on the column formats?

3) I'll probably create a Primary Key which is a combination of
person_id and person_timestamp. Does this sound like a good idea?

4) Should I use some indexes to improve performance maybe?

Try it and see. It really depends on the queries you will run.
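
Purely as a sketch of what such a table might look like, assuming the
PostGIS extension supplies the geometry type; the table name, the
Point/SRID typmod and the NOT NULL constraints are placeholders rather
than anything specified above:

CREATE EXTENSION IF NOT EXISTS postgis;  -- provides the geometry type

CREATE TABLE person_location (
    person_id            integer               NOT NULL,
    person_timestamp     timestamp             NOT NULL,  -- timestamptz may be preferable if time zones matter
    person_location_geom geometry(Point, 4326),           -- SRID 4326 is only an assumption
    PRIMARY KEY (person_id, person_timestamp)             -- the composite key from question 3
);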
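
And as a sketch of the "depends on the queries" point: if queries filter
on location, a GiST index on the geometry column is the usual candidate,
and a plain btree on the timestamp helps time-slice queries that the
composite primary key cannot serve on its own. The index names and the
example coordinates below are made up:

-- Spatial index for queries that filter on location (ST_DWithin, ST_Intersects, ...)
CREATE INDEX person_location_geom_gist
    ON person_location USING gist (person_location_geom);

-- Btree on the timestamp for "where was everyone at minute T" queries;
-- the (person_id, person_timestamp) primary key cannot answer those by itself
CREATE INDEX person_location_ts_idx
    ON person_location (person_timestamp);

-- Example of the kind of query these indexes help with:
-- everyone within 500 m of a point at a given minute
SELECT person_id
FROM   person_location
WHERE  person_timestamp = '2013-08-05 12:00'
  AND  ST_DWithin(person_location_geom::geography,
                  ST_SetSRID(ST_MakePoint(-0.1276, 51.5072), 4326)::geography,
                  500);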

--
  Simon Riggs http://www.2ndQuadrant.com/
  PostgreSQL Development, 24x7 Support, Training & Services
