I have CPU-intensive code that uses a large dictionary as its data (around 250 MB). I have a multicore processor and want to utilize it so that I can run more than one task at a time. The dictionary is mostly read-only and may be updated once a day.
How can I write this in Python without duplicating the dictionary?
I understand that Python threads don't offer true concurrency for CPU-bound work because of the GIL. Can I use the multiprocessing module without the data being serialized between processes?
I come from the Java world, and my requirement would be something like Java threads, which can share data, run on multiple processors, and offer synchronization primitives.
You can share read-only data among processes simply with a fork: on Unix-like systems, a child process inherits the parent's memory copy-on-write, so the dictionary is not duplicated as long as no process modifies it. Alternatively, you could use Jython (or IronPython) to get a Python implementation with exactly the same multi-threading abilities as Java (or, respectively, C#), including multiple-processor usage by multiple simultaneous threads.
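A minimal sketch of the fork-based approach, assuming Unix (the "fork" start method is not available on Windows). The dictionary here is a hypothetical stand-in for the 250 MB one; the point is that the workers read it without it being pickled or copied per task:

```python
import multiprocessing as mp

# Hypothetical large, mostly read-only dictionary, built once in the parent
# before any workers are started.
BIG_DICT = {i: i * i for i in range(1_000_000)}

def worker(key):
    # Under the "fork" start method the child process inherits the parent's
    # address space copy-on-write, so BIG_DICT is visible here without being
    # serialized or duplicated (as long as nothing writes to it).
    return BIG_DICT[key]

if __name__ == "__main__":
    ctx = mp.get_context("fork")  # Unix only
    with ctx.Pool(processes=4) as pool:
        results = pool.map(worker, [10, 20, 30])
    print(results)
```

Note that CPython's reference counting touches the objects' memory pages on access, so some pages will still be copied over time; for a dict that is read but not mutated this is usually acceptable. The once-a-day update would mean restarting the workers after rebuilding the dictionary.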
Take a look at the multiprocessing module in the stdlib: http://docs.python.org/library/multiprocessing.html It has a number of features that make it easy to share data structures between processes.
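One such feature is a Manager, which holds the dictionary in a server process and hands out proxies to it; a sketch, with a toy dict standing in for the real data:

```python
from multiprocessing import Manager, Pool

def lookup(args):
    # The proxy reconnects to the manager's server process when unpickled in
    # the worker, so all processes see the same underlying dict.
    shared, key = args
    return shared.get(key)

if __name__ == "__main__":
    with Manager() as manager:
        # A DictProxy: the actual dict lives in the manager's server process.
        shared = manager.dict({"a": 1, "b": 2})
        with Pool(processes=2) as pool:
            print(pool.map(lookup, [(shared, "a"), (shared, "b")]))
```

The trade-off is that every access goes through inter-process communication, so lookups are much slower than on a plain dict; the upside is that writes (e.g. the once-a-day update) become visible to all processes immediately.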