I am creating a web application to retrieve subsets of one large table (4 million rows). The rows change only once a year. The table has 200+ columns, all of boolean or numeric types; it has no text columns.
Users will query subsets of this table for download.
I am fairly familiar with PostgreSQL 9.1, and my plan is:
- The webapp will help the user create queries like "WHERE a=3 AND b=true AND c>300"
- The DB will have proper indexes on the most-used columns (see the sketch after this list)
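To illustrate, here is a minimal sketch of what I have in mind. The table and column names (`measurements`, `a`, `b`, `c`) are placeholders, not my real schema:

```sql
-- Hypothetical table: 200+ boolean/numeric columns, names are placeholders.
CREATE TABLE measurements (
    id  serial PRIMARY KEY,
    a   integer,
    b   boolean,
    c   numeric
    -- ... ~200 more boolean/numeric columns
);

-- Single-column indexes on the most frequently filtered columns.
CREATE INDEX idx_measurements_a ON measurements (a);
CREATE INDEX idx_measurements_c ON measurements (c);

-- The kind of query the webapp would generate:
SELECT *
FROM measurements
WHERE a = 3 AND b = true AND c > 300;
```

Running `EXPLAIN ANALYZE` on such a query would show whether the planner actually uses these indexes or falls back to a sequential scan.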
Now, I read this here: http://stackoverflow.com/questions/10053050/why-is-solr-so-much-faster-than-postgres:
> I recently switched from Postgres to Solr and saw a ~50x speed up in our queries. The queries we run involve multiple ranges, and our data is vehicle listings. For example: "Find all vehicles with mileage < 50,000, $5,000 < price < $10,000, make=Mazda..."
So now I wonder: will searches in Solr, Lucene, Elasticsearch, or Amazon CloudSearch be faster than in PostgreSQL even when no full-text search is involved?