I'm running a rather simple script with ArcPy, which extracts raster values at the points of a shapefile:
    import arcpy
    from arcpy.sa import ExtractMultiValuesToPoints

    arcpy.CheckOutExtension("Spatial")

    def raster_overlay(input_files):
        in_shp = "temp/shapefiles/points.shp"
        for in_raster in input_files:
            print in_raster
            # Appends one value column per raster to the point shapefile
            ExtractMultiValuesToPoints(in_shp, in_raster, "BILINEAR")

    raster_overlay(raster_files)
The variable raster_files is a list of strings pointing to a number of raster files on the local hard disk.
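For context, raster_files is just a list of path strings; it might be built with something like this (the folder name and .tif extension are hypothetical):

```python
import glob

# Hypothetical folder/extension; the point is simply that
# raster_files is a plain list of file-path strings.
raster_files = sorted(glob.glob("temp/rasters/*.tif"))
```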
The shapefile contains around 50 points, and each raster is around 15MB.
If the list of rasters is small (roughly fewer than 20), everything works as expected. However, I have around 50 raster images. Using the complete list, Python crashes ("python.exe has stopped working") after processing 21 images or so.
The values of the first 20 rasters are correctly appended to the shapefile (one column for each raster).
I monitored python.exe in the Windows 7 Resource Monitor. Memory usage of Python starts around 15 MB and then gradually increases to around 700 MB (and then Python crashes). So, memory leakage might be the reason for the crashes (although the PC has 8 GB of RAM).
I tried to include gc.collect() as well as del in_shp and del in_raster inside the loop, but nothing changed. What else could I try?
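Concretely, the loop with the attempted cleanup looked roughly like this (sketch only; ExtractMultiValuesToPoints comes from arcpy.sa, and neither del nor gc.collect() changed the memory profile):

```python
import gc

def raster_overlay(input_files):
    for raster in input_files:
        in_shp = "temp/shapefiles/points.shp"
        in_raster = raster
        ExtractMultiValuesToPoints(in_shp, in_raster, "BILINEAR")
        # Attempted cleanup -- made no difference to memory usage:
        del in_shp
        del in_raster
        gc.collect()
```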
PS: I just tried changing BILINEAR to NONE, and the script didn't crash. However, memory usage still increased in the same way (up to around 1.1 GB). And since I would like to use bilinear interpolation, the question is still relevant.
Edit: Cross-posted here to the ESRI support forum. I'll update this question if they can give a solution.
try: except: won't catch a hard crash/segfault/core dump like the OP describes. – Luke Sep 11 '13 at 1:14