I'm getting the following error from a CGI script I'm running (copied from the Apache error log; the referer was http://mysite/cgi-bin/dev/testing.py):

Traceback (most recent call last):
  File "/var/www/cgi-bin/dev/testing.py", line 908, in <module>
    webpage()
  File "/var/www/cgi-bin/dev/testing.py", line 899, in webpage
    getResults(form)
  File "/var/www/cgi-bin/dev/testing.py", line 557, in getResults
    new_nums = processNums(nums)
  File "/var/www/cgi-bin/dev/testing.py", line 328, in processNums
    t.start()
  File "/usr/lib64/python2.6/threading.py", line 471, in start
    _start_new_thread(self.__bootstrap, ())
thread.error: can't start new thread
It could be a ulimit issue on my machine, but I wanted to check my code with you guys first. Here is the code I use for threading:
import Queue
import multiprocessing
from threading import Thread

def processNums(nums):
    new_nums = []
    queue = Queue.Queue()
    for num in nums:
        queue.put(num)
    thread_num = multiprocessing.cpu_count()
    for x in range(0, thread_num):
        t = Thread(target=multNum, args=(queue, new_nums))
        t.setDaemon(True)
        t.start()
    queue.join()
    return new_nums

def multNum(queue, new_nums):
    while True:
        try:
            num = queue.get()
        except:
            break
        # do something....
        new_num = num * 123456
        new_nums.append(new_num)
        queue.task_done()

print processNums([54, 12, 87, 3268, 2424, 148, 5, 9877])
which outputs: [6666624, 1481472, 10740672, 403454208, 299257344, 18271488, 617280, 1219374912]
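To check my suspicion that the threads pile up, I wrote this stripped-down sketch (`leaky_process` and `worker` are my own stand-in names, not the real code). Because the workers are daemon threads that block forever in `queue.get()` once the queue drains, every call to the function leaves `cpu_count()` live threads behind:

```python
import multiprocessing
import threading

try:
    import Queue as queue_mod   # Python 2 name
except ImportError:
    import queue as queue_mod   # Python 3 name

def leaky_process(nums):
    # Hypothetical stand-in for processNums: same daemon-worker pattern.
    q = queue_mod.Queue()
    out = []

    def worker():
        while True:
            n = q.get()          # blocks forever once the queue is drained
            out.append(n * 123456)
            q.task_done()

    for _ in range(multiprocessing.cpu_count()):
        t = threading.Thread(target=worker)
        t.daemon = True
        t.start()
    for n in nums:
        q.put(n)
    q.join()                     # all items processed, but workers still alive
    return out

before = threading.active_count()
for _ in range(5):
    leaky_process([54, 12, 87])
leaked = threading.active_count() - before
# leaked == 5 * multiprocessing.cpu_count(): each call strands its workers,
# and with a big enough data set that eventually hits the OS thread limit.
```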
This is a really REALLY watered-down version of my code (there is so much of it that I cannot copy all of it here), but I suspect my problem lies in this part. My question is: should I be closing these threads somehow? Doesn't Python do that automatically? Or is this a configuration issue with Apache or my Linux server? This is the first time I've seen this error, but it is also the first time I've run the application with the data set I'm using, which generates thousands of threads. Thanks.
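For what it's worth, here is the fix I'm considering (again a sketch with my own names, `process_nums`/`worker`; I've replaced the daemon-plus-`Queue.join()` pattern with one `None` sentinel per worker plus `Thread.join()`, so every thread exits before the function returns):

```python
import multiprocessing
from threading import Thread

try:
    import Queue as queue_mod   # Python 2 name
except ImportError:
    import queue as queue_mod   # Python 3 name

def process_nums(nums):
    # Hypothetical reworked processNums: fixed-size pool, sentinel shutdown.
    work = queue_mod.Queue()
    results = []

    def worker():
        while True:
            num = work.get()
            if num is None:       # sentinel: no more work for this thread
                break
            results.append(num * 123456)

    threads = [Thread(target=worker)
               for _ in range(multiprocessing.cpu_count())]
    for t in threads:
        t.start()
    for num in nums:
        work.put(num)
    for _ in threads:
        work.put(None)            # exactly one sentinel per worker
    for t in threads:
        t.join()                  # every worker has exited; nothing leaks
    return results
```

Results can come back in a different order than the input, since the workers append concurrently. With this shutdown the thread count returns to its baseline after every call, so repeated calls can't accumulate toward the per-user thread limit; whether Apache's own limits are also a factor would depend on the server configuration.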