
I need a multidimensional array in shared memory between two processes. I'm trying to make a simple working example: I send [1, 2, 3, 4, 5, 6, 7, 8, 9] to the other process, which reshapes it into [[1, 2, 3], [4, 5, 6], [7, 8, 9]] without using additional memory.

import multiprocessing
import ctypes
import numpy


def f(array):
    nmp = numpy.frombuffer(array.get_obj(), dtype=int)
    b = nmp.reshape((3, 3))


if __name__ == '__main__':
    time_array = []
    import common_lib
    common_lib.time_usage(time_array)
    arr = multiprocessing.Array(ctypes.c_int, [1,2,3,4,5,6,7,8,9])
    p = multiprocessing.Process(target=f, args=(arr,))
    p.start()
    p.join()

I did exactly as shown in the manuals, but the frombuffer call raises this error:

ValueError: buffer size must be a multiple of element size

1 Answer

The dtype for the numpy array needs to be explicitly set as a 32-bit integer:

nmp = numpy.frombuffer(array.get_obj(), dtype="int32")

If you are on a 64-bit machine, numpy most likely maps dtype=int to a 64-bit integer, so you were trying to interpret the 32-bit ctypes array as 64-bit elements: nine c_int values occupy 36 bytes, which is not a multiple of the 8-byte element size, hence the error.
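A minimal runnable sketch of the corrected example (same structure as the question's code, with the dtype fixed; the print inside f is just for illustration):

```python
import multiprocessing
import ctypes
import numpy


def f(array):
    # dtype=numpy.int32 matches ctypes.c_int on common platforms;
    # numpy also accepts ctypes.c_int itself as the dtype, which
    # is even more portable.
    nmp = numpy.frombuffer(array.get_obj(), dtype=numpy.int32)
    b = nmp.reshape((3, 3))  # a view on the shared buffer: no copy
    print(b[2][2])  # -> 9


if __name__ == '__main__':
    arr = multiprocessing.Array(ctypes.c_int, [1, 2, 3, 4, 5, 6, 7, 8, 9])
    p = multiprocessing.Process(target=f, args=(arr,))
    p.start()
    p.join()
```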

  • It worked flawlessly! Thank you very much! I have a 64-bit machine. Just in case, what type should it be if I use float or double? Commented Apr 11, 2014 at 4:14
    @soshial single precision float will be "float32" or np.float32 and double precision float will be "float64" or np.float64. Commented Apr 11, 2014 at 4:27

