
I am writing the back end of an online judge (code checker) in Python 2.7. It takes submissions from a MySQL database, evaluates them, and writes the results back to the database. I run multiple processes, and each process runs multiple threads.

For the time being I print the evaluation status and other information directly to STDOUT, without any lock around the prints (adding one is cumbersome because there are many print statements), although I know a lock is necessary in this situation. Initially my evaluator had only one process with a single thread, so I skipped the lock and simply redirected STDOUT and STDERR to a file from the command line. Now the situation is different. Also, such a log would be difficult to read, and hard to search for errors if my evaluator crashes. Is there a neat way of logging in this case?


1 Answer


You can keep a counter of running processes or threads, check it, and only take the lock when the counter is greater than 1.
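Setting the counter idea aside, the locking half of the suggestion looks roughly like this sketch (the `safe_write` helper and the `StringIO` stand-in for STDOUT are illustrative): a single shared `threading.Lock` wrapped around each write keeps whole lines atomic.

```python
import threading
try:
    from StringIO import StringIO  # Python 2
except ImportError:
    from io import StringIO       # Python 3

print_lock = threading.Lock()  # one lock shared by every thread in the process
out = StringIO()               # stand-in for STDOUT, so the result is checkable

def safe_write(msg):
    # The lock serializes whole lines, so concurrent threads cannot
    # interleave their output mid-line.
    with print_lock:
        out.write(msg + "\n")

threads = [threading.Thread(target=safe_write, args=("result %d" % i,))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Routing every print through one helper like this also means the lock lives in exactly one place instead of at every call site.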

I guess that won't work: I cannot allow threads/processes to access this counter simultaneously, so I would in turn have to put a lock on it. Same problem again. – sasha sami Jul 7 '13 at 17:29
Yep, you are right, you would have to put a lock on it; I don't think you can escape using a lock. I am wondering how to solve it. – hinzir Jul 7 '13 at 18:08
Yeah, I guess so. Thanks for replying :) – sasha sami Jul 7 '13 at 18:34
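The comments above converge on needing a lock somewhere. A common pattern that sidesteps locking every print is to funnel all messages through a single consumer via a queue; this sketch uses threads and `Queue.Queue` for brevity (the `listener`/`worker` names are illustrative), but the same shape works across processes by swapping in `multiprocessing.Queue`.

```python
import threading
try:
    import queue           # Python 3
except ImportError:
    import Queue as queue  # Python 2

log_queue = queue.Queue()  # swap in multiprocessing.Queue() to cross processes
records = []

def listener():
    # Single consumer: only this thread touches the shared output,
    # so the writes themselves need no lock.
    while True:
        msg = log_queue.get()
        if msg is None:  # sentinel: shut down
            break
        records.append(msg)

def worker(worker_id):
    # Producers never touch the output directly; they only enqueue,
    # and the queue handles its own synchronization.
    log_queue.put("worker %d: submission evaluated" % worker_id)

consumer = threading.Thread(target=listener)
consumer.start()

workers = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()

log_queue.put(None)  # sentinel stops the listener
consumer.join()
```

Because only the listener writes, log lines stay intact no matter how many workers produce them, and the evaluator code itself stays free of locking.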

