I want to fetch records in batches, but the Mongoid docs advise avoiding `skip` because it can be expensive. I wrote this method to iterate through a large number of records efficiently.

module Mongoid
  module Batches
    def find_each(batch_size = 1000)
      return to_enum(:find_each, batch_size) unless block_given?

      find_in_batches(batch_size) do |documents|
        documents.each { |document| yield document }
      end
    end

    def find_in_batches(batch_size = 1000)
      return to_enum(:find_in_batches, batch_size) unless block_given?

      documents = self.asc(:created_at).limit(batch_size).asc(:id).to_a
      while documents.any?
        documents_size = documents.size
        primary_key_offset = documents.last.id

        yield documents

        break if documents_size < batch_size
        documents = self.where(:id.gt => primary_key_offset).asc(:created_at).limit(batch_size).asc(:id).to_a
      end
    end
  end
end

Mongoid::Criteria.include(Mongoid::Batches)
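The keyset-pagination idea behind `find_in_batches` can be exercised without a database. The sketch below models it over an in-memory array; `Doc`, `ALL_DOCS`, and `fetch_batch` are illustrative stand-ins for the Mongoid document class and queries, not real Mongoid API.

```ruby
# Illustrative stand-in for a Mongoid document: only the fields the
# batching logic touches (a monotonically increasing id and a timestamp).
Doc = Struct.new(:id, :created_at)

ALL_DOCS = (1..10).map { |i| Doc.new(i, Time.at(i)) }

# Stand-in for `self.where(:id.gt => offset).asc(:created_at).limit(n).asc(:id)`.
def fetch_batch(offset, batch_size)
  ALL_DOCS.select { |d| d.id > offset }
          .sort_by { |d| [d.created_at, d.id] }
          .first(batch_size)
end

# Same loop shape as find_in_batches above: remember the last id seen
# and use it as the offset for the next query instead of `skip`.
def find_in_batches(batch_size)
  documents = fetch_batch(0, batch_size)
  while documents.any?
    primary_key_offset = documents.last.id
    yield documents
    break if documents.size < batch_size
    documents = fetch_batch(primary_key_offset, batch_size)
  end
end

batches = []
find_in_batches(4) { |docs| batches << docs.map(&:id) }
# batches == [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]
```

Note this approach only yields every record when ids increase in the same order as `created_at`; if they can disagree, the `:id.gt` filter may silently drop documents.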

Actually, there is no need for this approach: the MongoDB cursor already returns results batch-wise by default, so iterating over a criteria does not load the whole collection at once.
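To illustrate what the driver's cursor does under the hood, here is a plain-Ruby model: each server round trip ("getMore") returns at most one batch of documents, and the enumerator hands them out one at a time. `FakeCursor` and `round_trips` are illustrative, not real driver API.

```ruby
# Illustrative model of a MongoDB cursor: each simulated round trip
# fetches at most batch_size documents, yet callers see a flat stream
# of documents -- no skip/offset queries are involved.
class FakeCursor
  include Enumerable
  attr_reader :round_trips

  def initialize(documents, batch_size)
    @documents  = documents
    @batch_size = batch_size
    @round_trips = 0
  end

  def each
    @documents.each_slice(@batch_size) do |batch|
      @round_trips += 1              # one simulated getMore per slice
      batch.each { |doc| yield doc }
    end
  end
end

cursor = FakeCursor.new((1..10).to_a, 4)
seen = cursor.to_a
# seen == (1..10).to_a, fetched in 3 round trips of at most 4 documents
```

With the real driver you can tune the per-round-trip size via the criteria's `batch_size` option, but the streaming behavior is the default either way.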

