During peak hours our users experience a lot of timeouts due to an overloaded database server. We are currently trying to make optimizations to cope with the high traffic. We read in the PostgreSQL documentation that asynchronous commits might be an option for boosting performance when a small amount of data loss in case of a server crash is acceptable. We run the backend for an iPhone multiplayer game that handles many simple requests per second. Since it's a game and not a financial system we are running, the risk of data loss seems acceptable.
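If we read the documentation correctly, the change boils down to the synchronous_commit setting, which can either be turned off globally in postgresql.conf or be scoped to individual transactions. Roughly what we have in mind (the table and column names here are just a made-up example):

    -- Only this transaction skips waiting for the WAL flush at commit time;
    -- transactions that don't set this keep the default, fully durable behaviour.
    BEGIN;
    SET LOCAL synchronous_commit TO OFF;
    UPDATE scores SET points = points + 10 WHERE player_id = 42;
    COMMIT;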
Does anyone have experience with the performance difference between synchronous and asynchronous commits? Are there any risks with using this option, apart from data loss in case of a server crash? Is there a risk that the queue of commits waiting to be flushed to disk will keep growing as we serve more requests?
Our setup: PostgreSQL, Nginx, Ruby on Rails, Unicorn.