
I have more than 25,000 documents in a collection, so I have to optimize what I send to the front end.

First, with Node.js, I set up pagination for the REST API:

    app.get('/api', function(req, res, next) {
      // mongoose-paginate: paginate(criteria, pageNumber, resultsPerPage, callback)
      Image.paginate({}, req.query.page, req.query.limit, function(err, pageCount, articles, itemCount) {
        if (err) return next(err);

        res.format({
          html: function() {
            // send the current page of articles plus paging metadata
            res.status(200).send({
              articles: articles,
              pageCount: pageCount,
              itemCount: itemCount
            });
          }
        });
      });
    });

    app.listen(3000);

So I can access my data with:

localhost:3000/api?page=1&limit=5
localhost:3000/api?page=2&limit=5

...

But I would also like to be able to search by id across all the JSON I have:

localhost:3000/id=[search by id]

because in my JSON each document has an id like:

"_id":"542e65368a1cec1227ae2bac",
"_id":"542e66b78a1cec1643e0f032",
...

How can I do that?
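Would something like the following be the right direction? This is just a sketch reusing my Mongoose Image model; the /api/:id route path is only my guess at how to expose it.

    // Sketch: a second route that looks up a single document by its _id
    app.get('/api/:id', function(req, res, next) {
      Image.findById(req.params.id, function(err, image) {
        if (err) return next(err);
        if (!image) return res.status(404).send({ error: 'not found' });
        res.send(image);
      });
    });

    // e.g. localhost:3000/api/542e65368a1cec1227ae2bac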

Thanks for your help!

Comments:
  • I have too much data (almost 150,000+ ids) and I think it would take too long. Commented Oct 13, 2014 at 16:11
  • If you feel that MongoDB sorting is too slow, you can just hold the id and time in a data structure in memory and keep that sorted and globally accessible. Any insert into MongoDB would also trigger a corresponding insert into the in-memory sorted structure (a rough sketch of the idea follows below). This needs to be optimized and managed correctly for scalability, but I feel that for small websites this should suffice. EDIT: the key is that you only sort it once here, as opposed to on each MongoDB query. Commented Oct 13, 2014 at 16:41
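
A rough sketch of the in-memory structure that comment describes; all names here (sortedIds, rememberInsert) are hypothetical, not from any library:

    // Keep ids sorted by insertion time in memory, mirroring MongoDB inserts
    var sortedIds = []; // [{ id: '...', time: <Date> }], ascending by time

    function rememberInsert(id, time) {
      // binary search for the insertion point so the array stays sorted
      var lo = 0, hi = sortedIds.length;
      while (lo < hi) {
        var mid = (lo + hi) >> 1;
        if (sortedIds[mid].time < time) lo = mid + 1; else hi = mid;
      }
      sortedIds.splice(lo, 0, { id: id, time: time });
    }

    // Call rememberInsert(doc._id.toString(), doc.createdAt) after every insert,
    // then a page is just a slice: sortedIds.slice((page - 1) * limit, page * limit)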
