
I have a short but (for me) very important question:

I would like to write variables from an active Python script that is already running to another Python script. So I don't want something like this:

$ cat first.py second.py
#first.py
def demo():
    some_list = []
    for i in 'string':
        some_list.append(i)
    return some_list

#second.py
from first import demo

some_list = demo()
print some_list

$ python second.py
['s', 't', 'r', 'i', 'n', 'g']

I want my running script, e.g. "sent.py", to constantly write variables to some kind of "workspace", and then to access those variables from another script, e.g. "get.py", without having to start both scripts together from a bash script.

So I am probably looking for a solution that first passes data from Python to bash and then to Python again? I am sorry, I am not very familiar with the terminology.

I hope it is clear what I mean; I did my best to explain it. I am kind of desperate and hope you can help. I have tried and googled all kinds of things, but nothing worked.

Why not write through a socket? docs.python.org/2/library/socket.html – Reut Sharabani Jun 26 '15 at 23:13

You realise Python scripts can run other Python scripts, right? – Thedudxo Jun 26 '15 at 23:18

You need some sort of IPC. The parent environment from which the two Python processes were spawned cannot act as shared memory, because Unix prohibits child-to-parent updates. – Eugeniu Rosca Jun 26 '15 at 23:32

I'm not quite sure what you're asking here. Are you trying to extract values from a long-lived process? Generate values from one script for use in another? – Alex Laties Jun 27 '15 at 0:22

How about exporting the variable as a Linux environment variable? – Liao Zhuodi Jun 27 '15 at 2:39

To get a variable workspace like this, you need some form of interprocess communication, shared memory, or shared storage.

It can be as simple as a file with a known format (such as JSON or pickle serialization) plus a locking mechanism to ensure that the reader waits until the file is completely written. The lock could be a separate file that the writer creates while writing and deletes when complete.

Interprocess communication is achievable through sockets, and TCP sockets are probably the easiest. Your long-running script would run a TCP listener, and the reading process would connect and read from it. This can get awkward because a simple blocking listener stalls the script while it waits for connections, so no other work could be performed in the meantime. Coordinating listening with doing the actual work is the tricky part; Twisted is an async framework that can solve this.
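A minimal sketch of the listener idea, collapsed into one file for demonstration (the writer runs in a background thread standing in for the long-running script; the host, port, and payload are illustrative assumptions):

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # illustrative; both scripts must agree
ready = threading.Event()

def serve_once(variables):
    """Writer side: accept one connection and send the variables as JSON."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                      # tell the reader it may connect
        conn, _ = srv.accept()
        with conn:
            conn.sendall(json.dumps(variables).encode())

def fetch():
    """Reader side: connect to the running script and read its variables."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        chunks = []
        while True:
            data = cli.recv(4096)
            if not data:                 # server closed: payload complete
                break
            chunks.append(data)
    return json.loads(b"".join(chunks).decode())

t = threading.Thread(target=serve_once, args=({"some_list": list("string")},))
t.start()
ready.wait()                             # avoid connecting before bind()
variables = fetch()
t.join()
print(variables)  # {'some_list': ['s', 't', 'r', 'i', 'n', 'g']}
```

In your real setup, `serve_once` would live in sent.py (serving its current variables on each connection) and `fetch` in get.py.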

Shared memory has mutual-exclusion complications similar to a file's, so it would need a dedicated location in the shared region where the writer marks its write lock.
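One way to sketch that (Python 3.8+, using `multiprocessing.shared_memory`; the layout is an assumption for illustration: byte 0 is the write-lock flag, bytes 1-2 hold the payload length, the rest holds JSON bytes):

```python
import json
from multiprocessing import shared_memory

# In a real setup both scripts would attach by an agreed-on segment name;
# here the name is auto-generated and both sides run in one process.
shm = shared_memory.SharedMemory(create=True, size=4096)

def write_variables(variables):
    shm.buf[0] = 1                                  # set write lock
    data = json.dumps(variables).encode()
    shm.buf[1:3] = len(data).to_bytes(2, "big")     # payload length
    shm.buf[3:3 + len(data)] = data                 # payload
    shm.buf[0] = 0                                  # release lock

def read_variables():
    if shm.buf[0] == 1:
        return None                                 # writer busy; retry later
    length = int.from_bytes(shm.buf[1:3], "big")
    return json.loads(bytes(shm.buf[3:3 + length]).decode())

write_variables({"some_list": list("string")})
result = read_variables()
shm.close()
shm.unlink()
print(result)  # {'some_list': ['s', 't', 'r', 'i', 'n', 'g']}
```

Note that a single flag byte is not a real lock (there is a window between checking and setting it); for anything beyond a sketch you would want a proper semaphore.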

I think the file is the easiest. The long-running process does its work, creates the lock file, serializes the variables to JSON (or whatever format you prefer), removes the lock file, then repeats. Independently, the reading process checks for the lock file; if it's absent, it reads the variable file and proceeds to use the variables as needed.
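The recipe above can be sketched as follows (file names are illustrative; both scripts just have to agree on them, and here both sides run in one script for demonstration):

```python
import json
import os

WORKSPACE = "workspace.json"               # shared variable file
LOCKFILE = WORKSPACE + ".lock"

def write_variables(variables):
    """sent.py side: lock, write, unlock -- repeat on every update."""
    open(LOCKFILE, "w").close()            # lock: a write is in progress
    try:
        with open(WORKSPACE, "w") as f:
            json.dump(variables, f)
    finally:
        os.remove(LOCKFILE)                # unlock: file is complete

def read_variables():
    """get.py side: skip the read while the writer holds the lock."""
    if os.path.exists(LOCKFILE):
        return None                        # writer busy; retry later
    with open(WORKSPACE) as f:
        return json.load(f)

write_variables({"some_list": list("string")})
print(read_variables())  # {'some_list': ['s', 't', 'r', 'i', 'n', 'g']}
```

A variation that avoids the lock file entirely: write to a temporary file and then `os.replace()` it over the workspace file, since the rename is atomic and the reader can never observe a half-written file.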

