We have a Python application that runs a depth-first search (DFS) over data stored in a database. Right now that data is small, so I can hold everything in a Python container, but it is going to grow, and there may come a time when it no longer fits in RAM. On the other hand, querying the database on every iteration of the search doesn't seem like a smart solution either, does it?
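To make it concrete, here is roughly what the current approach looks like. The table, column, and function names are simplified stand-ins, not the real code:

```python
# Simplified sketch of the current approach: load everything up front,
# then run an iterative DFS entirely in memory.

def load_all_edges(connection):
    # Hypothetical schema: pull every (node, neighbor) pair into a dict
    # of adjacency lists, i.e. the "python container" mentioned above.
    graph = {}
    for node, neighbor in connection.execute("SELECT node, neighbor FROM edges"):
        graph.setdefault(node, []).append(neighbor)
    return graph

def dfs(graph, start):
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        # Each neighbor lookup is a cheap dict access because the whole
        # graph is already in RAM -- this is exactly what stops working
        # once the data no longer fits in memory.
        stack.extend(graph.get(node, []))
    return visited
```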
But the question is more general than my particular case. The answer most probably involves some kind of smart caching, but I don't have any experience with that.
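For instance, I imagine something along these lines (fetching neighbors from the database on demand but memoizing the lookups), though I don't know whether that is what people actually mean by caching here. Again, the schema, query, and names are made up:

```python
import sqlite3
from functools import lru_cache

# Placeholder database path for the sake of a self-contained example.
connection = sqlite3.connect("graph.db")

# Hypothetical "query on demand, but cache the lookups" approach:
# only nodes the DFS actually touches are fetched, and repeated
# lookups of the same node hit the in-memory cache instead of the DB.
@lru_cache(maxsize=100_000)
def neighbors(node):
    rows = connection.execute(
        "SELECT neighbor FROM edges WHERE node = ?", (node,)
    )
    return tuple(r[0] for r in rows)

def dfs(start):
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        stack.extend(neighbors(node))
    return visited
```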
What are the techniques for dealing with data that is too large for memory when you need to access it frequently (say, while running an algorithm over it)?