
I'm having a hard time optimizing queries on a very big table. Basically, all of them filter the result set by date:

SELECT * FROM bigtable WHERE date >= '2015-01-01' AND date <= '2016-01-01' ORDER BY date DESC;

Adding the following date index actually makes things worse:

CREATE INDEX CONCURRENTLY bigtable_date_index ON bigtable(date(date));

That is, without the index the query takes about 1s to run, and with it about 10s. With bigger ranges and more filtering it is very slow even without that index.

I'm using PostgreSQL 9.4, and I see that 9.5 has some sorting improvements that might help?

Would a BRIN index help in this case?

  • Please read wiki.postgresql.org/wiki/SlowQueryQuestions then edit your question and add the missing information. Commented May 3, 2016 at 14:40
  • And a strong advice to not name your columns date. Commented May 3, 2016 at 14:54
  • this is obviously an example @joop Commented May 3, 2016 at 23:08
  • @a_horse_with_no_name thanks, will add some more info Commented May 3, 2016 at 23:08

1 Answer


For an index to be effective, it needs to index the same expression you're filtering by. In this case, you're filtering by date, but you appear to have indexed date(date), so the index can't be used.

Either filter your table using date(date):

SELECT * FROM bigtable
WHERE date(date) >= '2015-01-01' AND date(date) <= '2016-01-01'
ORDER BY date(date) DESC;

Or index the naked date:

CREATE INDEX CONCURRENTLY bigtable_date_index ON bigtable(date);
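
To confirm the planner actually uses the plain-column index, and to touch on the BRIN question, here is a sketch (assuming the bigtable/date names from the question):

```sql
-- Verify the new index is chosen for the date-range query:
EXPLAIN ANALYZE
SELECT * FROM bigtable
WHERE date >= '2015-01-01' AND date <= '2016-01-01'
ORDER BY date DESC;

-- On BRIN: BRIN indexes only exist from PostgreSQL 9.5 onward, so they
-- are not an option on 9.4. On 9.5+, if rows are physically stored
-- roughly in date order, a BRIN index can serve range filters with a
-- much smaller index than a B-tree:
CREATE INDEX bigtable_date_brin ON bigtable USING brin(date);
```

Note that a BRIN index only helps the range filter; unlike a B-tree, it cannot provide the ORDER BY date DESC ordering, so a sort step will still be needed.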
