Stack Overflow is a question and answer site for professional and enthusiast programmers. It's 100% free.

I have a PHP array structure:

Array
(
    [car] => Array
        (
            [red] => 0.333
            [orange] => 0.333
            [blue] => 0.333
        )

    [truck] => Array
        (
            [white] => 0.333
            [green] => 0.333
            [blue] => 0.333
        )
)

I have been using serialize to save the array to a text file, and unserialize to get the array back. Unfortunately, the serialized output is getting very large, mostly due to the floating-point conversion (bug or by design) that happens during serialization. For example, instead of 0.333, serialization converts .333 into .3333333333333333333333333333333333333333333333333. This made me want to switch to json_encode to save the array. Comparing the two, the serialized file is 40MB in size vs 8MB for json_encode.
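The long float expansion comes from PHP's serialize_precision ini setting, which controls how many significant digits serialize() writes for floats. A minimal sketch (the exact digit counts depend on your PHP version's defaults):

```php
<?php
// Sketch: serialize() formats floats using the serialize_precision
// ini setting; a high precision expands 0.333 toward its full binary
// representation, while a lower one keeps the output compact.
$data = ['car' => ['red' => 0.333, 'orange' => 0.333, 'blue' => 0.333]];

ini_set('serialize_precision', '17');
$long = serialize($data);   // floats become e.g. 0.33300000000000002

ini_set('serialize_precision', '4');
$short = serialize($data);  // floats stay as 0.333

echo strlen($long) . " vs " . strlen($short) . "\n";
```

Lowering serialize_precision (or pre-rounding the values) may shrink the serialized file without switching formats; json_encode() comes out smaller here because it formats floats with fewer digits by default.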

That is great, except when I try to json_decode the file, I no longer get the array form back. I tried json_decode($array, true), but that does not work either.
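For reference, a minimal round trip that does come back as an array (assuming the file contains the raw json_encode() output); the key detail is that json_decode() takes the JSON string, not the original array, and needs true as its second argument:

```php
<?php
// Minimal sketch of a JSON round trip: json_decode() must receive
// the JSON *string* (e.g. the file contents) and true as the second
// argument to get associative arrays back instead of stdClass objects.
$data = ['car' => ['red' => 0.333, 'orange' => 0.333, 'blue' => 0.333]];

$json = json_encode($data);
// file_put_contents('colors.json', $json);   // save to disk... (filename illustrative)
// $json = file_get_contents('colors.json');  // ...and read it back

$decoded = json_decode($json, true);  // true => nested arrays

var_dump(is_array($decoded));      // bool(true)
var_dump($decoded['car']['red']);  // float(0.333)
```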

Any idea how to get json_encode to work for this example?

TIA

PS, my floating-point numbers were generated by rounding off the decimals. Another answer that I found on Stack Overflow suggested that instead of using round($part/$sum, 3);, I use sprintf('%.3f', $part/$sum);, which turns the floating-point number into a string. That alone cut the serialized file down from 40MB to 19MB, but it is still much larger than the json_encode file size of 8MB.
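Sketching the difference the PS describes: round() returns a float, which serialize() may expand to many digits, while sprintf() returns a string, which serialize() stores character-for-character (variable names here are illustrative):

```php
<?php
// round() yields a float; serialize() may expand it to full precision.
// sprintf() yields a string; serialize() stores it verbatim.
$part = 1;
$sum  = 3;

$asFloat  = round($part / $sum, 3);        // float(0.333)
$asString = sprintf('%.3f', $part / $sum); // string(5) "0.333"

echo serialize($asFloat) . "\n";   // digit count depends on serialize_precision
echo serialize($asString) . "\n";  // s:5:"0.333"; -- always 12 bytes
```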

When you decode the json_encoded data, what do you get? –  user1864610 Aug 24 '13 at 1:07
Working from your array here, this phpFiddle successfully encodes and decodes to array format. –  user1864610 Aug 24 '13 at 1:15
Thanks Mike W, your answers helped me figure out that json_decode does not work well with large files. The largest JSON file that I was able to json_decode is only about 0.5-0.6MB. –  Jamex Aug 24 '13 at 2:35

1 Answer


The 'problem' is due to json_decode's inability to read large json_encoded files. The largest JSON file that worked was only ~0.5MB. Tested on a 4GB RAM, 4-core Xeon server, and also on a 4GB localhost laptop. I had also set memory_limit in the php.ini file to 3GB for other PHP routines (yes, 3GB) and restarted Apache, so the memory_limit setting does not appear to be the problem.

The error message was not helpful; it stated:

Warning: array_slice() expects parameter 1 to be array, null given in /home/xxxxx/public_html/xxxx.php on line xx

Hopefully this error message will help someone in the future narrow down the bug.
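A null return from json_decode() (which then triggers the array_slice() warning above) can be diagnosed with json_last_error(); with very large or deeply nested arrays, one plausible culprit is the decoder's nesting-depth limit (JSON_ERROR_DEPTH, default depth 512). A hedged sketch, using deliberately invalid input to force a failure:

```php
<?php
// Sketch: json_decode() returns null on failure; json_last_error()
// (and json_last_error_msg() on PHP >= 5.5) reveal the cause, e.g.
// a syntax error in truncated input or JSON_ERROR_DEPTH for data
// nested deeper than the default limit.
$decoded = json_decode('{"car": {"red": 0.333', true);  // truncated JSON

if ($decoded === null && json_last_error() !== JSON_ERROR_NONE) {
    echo 'Decode failed: ' . json_last_error_msg() . "\n";
}
```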

Your maximum file size seems rather small as a limit. Your error message implies that your decode returned null, which will happen if the decode fails for some reason. The error can be determined by calling json_last_error_msg(). There are a number of possible errors that might have arisen, some of which could possibly be addressed. –  user1864610 Aug 25 '13 at 4:23
Thanks for the tip. I found others who said they have problems with json_decode's handling of large arrays too, and there are classes to work around the problem. I thought that 0.6MB was a little small to run into this problem, but there is no definitive comparison of how large the array has to be before json_decode fails; it probably has to do with the size and complexity of the array. As it turned out, the more data my array has, the less useful it is to my algorithm, so I had to revert to a smaller array. –  Jamex Aug 25 '13 at 22:27
