My problem: I have a large database of interconnected objects which I need to process with a combination of short- and long-lived workers. These objects are mostly read-only (i.e. any of them can be changed/marked-as-deleted, but that happens infrequently). The workers may or may not be within one Python process, or even on one system.
I've been doing this with a "classic" session-based SQLAlchemy ORM approach, but that turns out to be far too slow and memory-intensive, as each thread gets its own copy of every object.
My vision would be an object server. It would mediate write access to the database and then send change/invalidation notices to the workers. (Changes are infrequent enough that I don't care if a worker gets a notice it's not interested in.)
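To make that part of the vision concrete, here is a minimal in-process sketch of the notice fan-out I have in mind. The names (`ChangeNotifier`, `publish`, `subscribe`) are made up for illustration; across processes or systems this role would presumably be played by a message broker or pub/sub channel rather than plain queues.

```python
import queue
import threading

class ChangeNotifier:
    """Sketch of change/invalidation fan-out.

    The object server calls publish() after committing a write; each
    worker drains its own subscription queue. Workers may receive
    notices for objects they aren't interested in, which is fine here.
    """

    def __init__(self):
        self._lock = threading.Lock()
        self._subscribers = []

    def subscribe(self):
        # Each worker gets its own queue of notices.
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def publish(self, object_id):
        # Broadcast the changed/deleted object's id to every worker.
        with self._lock:
            for q in self._subscribers:
                q.put(object_id)
```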
Read access would be coordinated so that only one thread within a process fetches a given object, and I'd need an LRU cache to keep frequently-accessed and long-lived objects in memory.
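Something like this, roughly (a sketch, not a finished implementation: `fetch` is a hypothetical caller-supplied loader that reads an object from the backend by key, and the eviction policy is a plain size-bounded LRU):

```python
import threading
from collections import OrderedDict

class CoordinatedLRUCache:
    """LRU cache where only one thread per process fetches a missing object."""

    def __init__(self, fetch, maxsize=1024):
        self._fetch = fetch                  # hypothetical backend loader: key -> object
        self._maxsize = maxsize
        self._lock = threading.Lock()
        self._cache = OrderedDict()          # key -> object, in LRU order
        self._pending = {}                   # key -> Event for in-flight fetches

    def get(self, key):
        while True:
            with self._lock:
                if key in self._cache:
                    self._cache.move_to_end(key)     # mark as recently used
                    return self._cache[key]
                event = self._pending.get(key)
                if event is None:
                    # No one is fetching this key yet; we become the fetcher.
                    event = threading.Event()
                    self._pending[key] = event
                    break
            # Another thread is already fetching this key; wait, then re-check.
            event.wait()
        try:
            obj = self._fetch(key)
            with self._lock:
                self._cache[key] = obj
                while len(self._cache) > self._maxsize:
                    self._cache.popitem(last=False)  # evict least recently used
            return obj
        finally:
            with self._lock:
                del self._pending[key]
            event.set()

    def invalidate(self, key):
        # Called when a change/invalidation notice arrives for this key.
        with self._lock:
            self._cache.pop(key, None)
```

Waiting threads loop back and re-check the cache after the fetching thread signals, so a failed fetch just lets the next waiter retry rather than caching an error.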
I don't care whether updates are applied immediately or are only visible to the local process until committed. I also don't need fancy indexing or query abilities; if necessary I can go to the storage backend for that. (That should be SQL, though a NoSQL backend would be nice to have.)
Does something like this already exist somewhere out there, do I need to write it myself, or does somebody know of an alternative solution?