
I implemented the flood fill algorithm (with a stack) in Objective-C. It is very slow because a few methods are called for each pixel (I ported an existing C implementation). Even in C it is quite slow, since comparing two colors (for each pixel) takes two function calls. Do you have any ideas for how I can optimize it?
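For reference, here is a minimal C sketch of the kind of stack-based flood fill being described, assuming a row-major RGBA buffer with 4 bytes per pixel. All names (`flood_fill`, `ff_colors_equal`, `FFNode`) are illustrative, not from the original code; the per-pixel comparison goes through a function call, which is the overhead the question complains about:

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical stack-based flood fill over a w*h RGBA buffer
   (4 bytes per pixel, row-major). Names are illustrative. */

typedef struct { int x, y; } FFNode;

/* Per-pixel comparison via a function call -- this is the kind of
   per-pixel overhead the question describes. */
static int ff_colors_equal(const uint8_t *a, const uint8_t *b) {
    return a[0] == b[0] && a[1] == b[1] && a[2] == b[2] && a[3] == b[3];
}

void flood_fill(uint8_t *pixels, int w, int h, int sx, int sy,
                const uint8_t fill[4]) {
    uint8_t target[4];
    const uint8_t *start = pixels + 4 * (sy * w + sx);
    for (int i = 0; i < 4; i++) target[i] = start[i];
    if (ff_colors_equal(target, fill)) return;   /* nothing to do */

    /* Worst case: each filled pixel pushes 4 neighbors, plus the seed. */
    FFNode *stack = malloc(sizeof(FFNode) * (4 * (size_t)w * h + 1));
    if (!stack) return;
    size_t top = 0;
    stack[top++] = (FFNode){sx, sy};

    while (top > 0) {
        FFNode n = stack[--top];
        if (n.x < 0 || n.x >= w || n.y < 0 || n.y >= h) continue;
        uint8_t *p = pixels + 4 * (n.y * w + n.x);
        if (!ff_colors_equal(p, target)) continue;
        for (int i = 0; i < 4; i++) p[i] = fill[i];
        stack[top++] = (FFNode){n.x + 1, n.y};
        stack[top++] = (FFNode){n.x - 1, n.y};
        stack[top++] = (FFNode){n.x, n.y + 1};
        stack[top++] = (FFNode){n.x, n.y - 1};
    }
    free(stack);
}
```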

One of my ideas is to convert the byte array of RGBA data into a 2-D array of color structs and check each element for equality (e.g. color.r, color.g, color.b and color.a). Blending colors would still be an open question, but I could call the blend function only when the alpha is in a certain range.
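A cheaper variant of that idea, assuming the RGBA bytes are contiguous: instead of a struct with four per-channel comparisons, reinterpret each 4-byte pixel as a single `uint32_t` so equality is one integer compare with no function-call overhead. The helper names below are hypothetical:

```c
#include <stdint.h>
#include <string.h>

/* Treat a 4-byte RGBA pixel as one 32-bit word so that comparing two
   pixels is a single integer comparison. memcpy avoids strict-aliasing
   problems and compiles down to a plain 32-bit load. */
static inline uint32_t pixel32(const uint8_t *p) {
    uint32_t v;
    memcpy(&v, p, 4);
    return v;
}

static inline int pixels_equal(const uint8_t *a, const uint8_t *b) {
    return pixel32(a) == pixel32(b);
}

/* Take the (expensive) blend path only for semi-transparent pixels,
   as suggested in the question. */
static inline int needs_blend(const uint8_t *p) {
    return p[3] > 0 && p[3] < 255;
}
```

Marking the helpers `static inline` lets the compiler fold them into the fill loop, so the "two function calls per comparison" cost disappears entirely.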

Thank you! P.S. I'll share the final result with acceptable performance on GitHub, so anyone will be able to use it in the future.


1 Answer

Your issue lies with the calls to malloc()/free().

Allocating memory is sloooooow. See here for Objective-C benchmarks.

Assuming you are filling a blank image, you will allocate one node for each pixel.

If you were doing this on an image for an iPhone 4, you might spend (based on the malloc/free benchmark) approximately 640 * 960 * 656 ns ≈ 403 ms per image, which is equivalent to about 2.5 fps.

You should debug the scanline algorithm rather than use the stack algorithm. Failing that, you could preallocate a second buffer of nodes to match your pixel count, and fill your list with pointers into that buffer rather than malloc'ing each node.
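The fallback suggestion could be sketched like this in C: carve all nodes out of a single upfront allocation sized to the pixel count, so the hot fill loop never calls malloc() or free(). All names (`NodePool`, `pool_push`, etc.) are hypothetical:

```c
#include <stdint.h>
#include <stdlib.h>

/* A node pool: one upfront allocation, reused as a simple stack,
   so the fill loop does no per-node malloc()/free(). */
typedef struct { int x, y; } PoolNode;

typedef struct {
    PoolNode *nodes;   /* backing storage, allocated once */
    size_t    cap;     /* capacity in nodes */
    size_t    top;     /* stack pointer */
} NodePool;

int pool_init(NodePool *p, size_t node_count) {
    p->nodes = malloc(node_count * sizeof(PoolNode));
    p->cap = p->nodes ? node_count : 0;
    p->top = 0;
    return p->nodes != NULL;
}

int pool_push(NodePool *p, int x, int y) {
    if (p->top == p->cap) return 0;          /* pool exhausted */
    p->nodes[p->top++] = (PoolNode){x, y};
    return 1;
}

int pool_pop(NodePool *p, PoolNode *out) {
    if (p->top == 0) return 0;               /* empty */
    *out = p->nodes[--p->top];
    return 1;
}

void pool_free(NodePool *p) {
    free(p->nodes);
    p->nodes = NULL;
    p->cap = p->top = 0;
}
```

The pool can be initialized once per image (or kept around between fills and reset by setting `top = 0`), amortizing the single allocation across every pixel instead of paying ~656 ns per node.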

