Hi,

We have a problem with some of our query plans. One of our tables is quite volatile, but Postgres always plans queries using the statistics snapshot from the last time the table was analyzed. Is there a way to tell Postgres that it should not trust the statistics for this table? Basically we want it to assume that there may be 0, 1 or 100,000 entries coming out of a query on that table at any time, and that it should not make any assumptions.

Thanks,
Brian
========================
Brian Herlihy
Trellian Pty Ltd
+65 67534396 (Office)
+65 92720492 (Handphone)
========================

  • Craig James at Jun 4, 2009 at 4:36 pm

    Brian Herlihy wrote:
    We have a problem with some of our query plans. One of our
    tables is quite volatile, but postgres always uses the last
    statistics snapshot from the last time it was analyzed for query
    planning. Is there a way to tell postgres that it should not
    trust the statistics for this table? Basically we want it to
    assume that there may be 0, 1 or 100,000 entries coming out from
    a query on that table at any time, and that it should not make
    any assumptions.
    I had a similar problem, and just changed my application to do an analyze either just before the query, or just after a major update to the table. Analyze is very fast, almost always orders of magnitude faster than the time lost to a poor query plan.

    Craig
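
    In case it helps to see that spelled out, Craig's approach amounts to something like the sketch below. The table and column names (page_hits, url) are placeholders, not anything from the original thread:

        -- hypothetical volatile table; name and columns are examples only
        CREATE TABLE page_hits (url text, fetched timestamptz);

        -- ... large insert/update/delete on page_hits happens here ...

        -- Refresh planner statistics for just this table. ANALYZE only
        -- samples the table, so it is cheap next to a misplanned query.
        ANALYZE page_hits;

        -- Statistics can also be limited to the columns the problem query uses.
        ANALYZE page_hits (url);

    Issuing the ANALYZE from the application immediately after the bulk change (or right before the sensitive query) keeps the window in which the planner sees stale row estimates as small as possible.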
