I have two Float32Arrays, each 1.6×10^7 elements long. Using JavaScript, I retrieve them from the server and add them element by element. My webpage stops responding, and I get the error message shown below. Is there a way to handle huge arrays like these on the client side, whether with JavaScript or by some other means?

[screenshot: the browser's warning about an unresponsive script]
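
The addition itself is just an element-by-element loop, something like this (a and b are the two fetched Float32Arrays):

    // add the two arrays element by element on the main thread;
    // together with the fetching, this is what the page is stuck doing
    const sum = new Float32Array(a.length);
    for (let i = 0; i < a.length; i++) {
        sum[i] = a[i] + b[i];
    }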

That's 128 freaking megabytes if my calculations are correct, and assuming you send them in perfectly packed binary rather than in JSON or something. It's debatable whether handling this much data on the client side is sensible, but sending them from the server to the client is insane. What do you need 32 million numbers for anyway? – delnan Aug 13 at 22:10
    
Why don't you retrieve it chunk by chunk from the server instead of all at once: get a chunk --> process it --> store the result on the client --> repeat (a sketch follows these comments). – David Aug 19 at 2:12
    
When you say add them element by element, do you mean using .push() on the array? It almost sounds like you mean adding to the DOM. – joshp Aug 21 at 10:47
    
What are you doing with these arrays once you retrieve them? I believe it is essential to know more details to guarantee a good answer. – kamoroso94 Aug 22 at 2:38
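
A minimal sketch of the chunked loop David suggests, assuming a hypothetical /data endpoint that serves raw little-endian float32 bytes and honors HTTP Range requests:

    // pull the data down a slice at a time, process each slice, and keep
    // only the results; chunkBytes must be a multiple of 4 so every
    // response splits cleanly into float32 values
    async function fetchInChunks(url, totalBytes, chunkBytes, process) {
        for (let start = 0; start < totalBytes; start += chunkBytes) {
            const end = Math.min(start + chunkBytes, totalBytes) - 1;
            const res = await fetch(url, {
                headers: { Range: 'bytes=' + start + '-' + end }
            });
            const floats = new Float32Array(await res.arrayBuffer());
            process(floats, start / 4); // offset of this slice in the array
        }
    }

    // usage: fetchInChunks('/data', 1.6e7 * 4, 1 << 20, function (floats, offset) { ... });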

I don't have the privileges to comment, so I am adding this as an answer here.

My browser (Firefox 47) was able to set values for 50,000,000 elements of a Float32Array, one by one, in under a second (setting existing elements, though, not inserting or appending). Beyond that, it ran out of memory.

I assume your bottleneck (the browser warning) has more to do with slowly fetching and processing the elements a few at a time, which keeps the main browser thread busy.

If you really need that much data to be fetched, processed, and inserted while keeping the browser responsive to the user, you may want to consider Web Workers, HTML5's multi-threading mechanism (there are also JavaScript libraries from Mozilla and others that wrap it). They let you offload the busy work onto a background thread.
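
For example, here is a minimal sketch of that idea with a plain dedicated Web Worker (the worker source is inlined via a Blob URL just to keep the snippet self-contained; a and b stand for your two fetched Float32Arrays):

    // run the addition in a background thread so the UI stays responsive
    const workerSrc = `
        onmessage = function (e) {
            const a = e.data[0], b = e.data[1];
            const sum = new Float32Array(a.length);
            for (let i = 0; i < a.length; i++) {
                sum[i] = a[i] + b[i];
            }
            postMessage(sum, [sum.buffer]); // transfer, don't copy
        };
    `;
    const blob = new Blob([workerSrc], { type: 'text/javascript' });
    const worker = new Worker(URL.createObjectURL(blob));
    worker.onmessage = function (e) {
        console.log('sum ready, length', e.data.length);
    };
    // transferring the buffers avoids copying 128 MB, but it makes
    // a and b unusable on the main thread afterwards
    worker.postMessage([a, b], [a.buffer, b.buffer]);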


The short answer is that you can't process all of that at once; it is simply too much data.

You should rethink what you are doing. Some possible solutions:

  • split the data into chunks
  • reduce the size with filters
  • pre-process on the server side (see the sketch below)
  • ... be creative
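
For example, a rough Node sketch of the last two ideas combined, where a.bin and b.bin are made-up files of raw little-endian float32 values: do the addition on the server, then thin the result out before sending, since no screen can show 16 million points anyway.

    const fs = require('fs');

    // view a file of raw float32 bytes as a typed array
    function loadFloats(path) {
        const buf = fs.readFileSync(path);
        return new Float32Array(buf.buffer, buf.byteOffset, buf.length / 4);
    }

    const a = loadFloats('a.bin');
    const b = loadFloats('b.bin');
    const sum = new Float32Array(a.length);
    for (let i = 0; i < a.length; i++) {
        sum[i] = a[i] + b[i];
    }

    // crude filter: keep every k-th value (min/max per bucket also works)
    const k = 4000;
    const small = new Float32Array(Math.ceil(sum.length / k));
    for (let i = 0; i < small.length; i++) {
        small[i] = sum[i * k];
    }
    // send `small` to the client: ~16 KB instead of 64 MB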

What you need is to process relatively big data in a low-memory, low-performance environment. The general solution to this is streaming: you keep only a single chunk (or a few) in memory at a time, process it, and free the memory. That way you don't need a lot of memory and processing power to do the job.
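
On the client, that could look roughly like this with the fetch/Streams API, where available ('/array-a' is a made-up endpoint serving raw float32 bytes):

    // read one array from the network and process it chunk by chunk,
    // never holding more than one network chunk in memory
    async function streamFloats(url, processChunk) {
        const response = await fetch(url);
        const reader = response.body.getReader();
        let leftover = new Uint8Array(0);
        for (;;) {
            const { done, value } = await reader.read();
            if (done) break;
            // stitch leftover bytes from the previous chunk onto this one
            const bytes = new Uint8Array(leftover.length + value.length);
            bytes.set(leftover);
            bytes.set(value, leftover.length);
            const usable = bytes.length - (bytes.length % 4); // whole floats only
            processChunk(new Float32Array(bytes.buffer, 0, usable / 4));
            leftover = bytes.slice(usable);
        }
    }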

You need to stream the data from the server, create a processing stream on the client, and display the data chunk by chunk, probably pixel by pixel. I guess you'll also need a zooming algorithm, which should be able to use the same streams.
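
For instance, the display step could consume those chunks like this (a rough sketch; the canvas width and the 0..1 normalization of the values are assumptions):

    const ctx = document.querySelector('canvas').getContext('2d');
    const width = 4000;
    let next = 0; // index of the next pixel to draw

    // paint one processed chunk as a run of grayscale pixels
    function drawChunk(floats) {
        for (let i = 0; i < floats.length; i++, next++) {
            const v = Math.min(255, Math.max(0, Math.round(floats[i] * 255)));
            ctx.fillStyle = 'rgb(' + v + ',' + v + ',' + v + ')';
            ctx.fillRect(next % width, Math.floor(next / width), 1, 1);
        }
    }

A fillRect per pixel is the simplest thing that works; a real implementation would batch each chunk into an ImageData and draw it with a single putImageData call.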

Once you have a working solution, you can try to make it faster, e.g. by using multiple WebSocket connections, multiple display streams, client-side storage to cache data, data preparation on the server side, Web Workers, etc. I think this will be a much longer project than you expected... Don't forget to share the solution with the JS community! :-)

