
The normal result() method described in the documentation appears to load all records immediately. My application needs to load about 30,000 rows and submit them, one at a time, to a third-party search index API. Loading everything into memory at once doesn't work: the script exhausts PHP's memory limit and errors out.

So my question is, how can I achieve the effect of the conventional MySQLi API method, in which you load one row at a time in a loop?
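For reference, this is roughly the plain-mysqli pattern I have in mind; an unbuffered query streams rows from the server one at a time (the credentials and table name below are just placeholders):

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb'); // placeholder credentials

// MYSQLI_USE_RESULT makes the query unbuffered: rows stay on the MySQL
// server until fetched, so PHP's memory use stays flat.
$result = $mysqli->query('SELECT * FROM items', MYSQLI_USE_RESULT);

while ($row = $result->fetch_object()) {
    // handle one row, then let it go out of scope
}

$result->free();
$mysqli->close();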

I don't think you can. Did you try loading multiple queries with LIMIT and OFFSET, maybe loading ~1,000 records at a time? – Raphael_ Dec 11 '12 at 18:23
Work back from what you can reasonably pass to the search API, whether that's 10, 100, or 1,000 records at a time. Put something in place for when a group submission fails and you have to resubmit it to the search API. – cartalot Dec 11 '12 at 18:49
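In code, the batching idea from these comments might look something like this sketch (the items table, batch size, and send_data_to_api() helper are assumptions):

$batchSize = 1000; // tune this to what the search API accepts
$offset = 0;

do {
    // Pull one chunk; only this chunk is ever held in memory
    $query = $this->db->query('SELECT * FROM items LIMIT ?, ?',
                              array($offset, $batchSize));
    $rows = $query->result();

    foreach ($rows as $row) {
        $this->send_data_to_api((array)$row); // hypothetical helper
    }

    $query->free_result(); // release the chunk before fetching the next one
    $offset += $batchSize;
} while (count($rows) === $batchSize);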

3 Answers


Here is something you can do.

// $result is a CodeIgniter query result, e.g. $result = $this->db->query($sql);
// _fetch_object() is the internal driver method that reads one row at a time.
while ($row = $result->_fetch_object()) {
  $data = array(
    'id'         => $row->id,
    'some_value' => $row->some_field_name,
  );
  // send row data to whatever api
  $this->send_data_to_api($data);
}

This will get one row at a time. If you check the CodeIgniter source code, you will see that result() does exactly this internally when you call it, except that it accumulates every row into an array.
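For reference, the relevant CodeIgniter internals boil down to something like this (a paraphrased sketch, not the exact source):

// Paraphrased from CodeIgniter's CI_DB_result::result_object(): every row
// returned by _fetch_object() is kept in an array, which is why result()
// runs out of memory on very large result sets.
function result_object()
{
    if (count($this->result_object) > 0)
    {
        return $this->result_object;
    }

    while ($row = $this->_fetch_object())
    {
        $this->result_object[] = $row;
    }

    return $this->result_object;
}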

Thank you, @elvispt. Even though this depends on an internal interface, it seems to be the only option. – Jonah Dec 14 '12 at 21:00
Thank you! Previously I've just used the regular old mysql functions when I've had to deal with datasets larger than PHP could hold in memory. This is waaaaay better. – Mala Feb 14 at 22:14

Well, there's the row() method, which returns just one row as an object, or the row_array() method, which does the same but returns an array (of course).

So you could do something like

$sql = "SELECT * FROM yourtable";
$resultSet = $this->db->query($sql);
$total = $resultSet->num_rows();

for ($i = 0; $i < $total; $i++) {
  // row_array($i) returns the row at index $i as an associative array
  $row = $resultSet->row_array($i);
}

This fetches each row of the whole result set in a loop. I believe it's about the same as fetching everything via $this->db->query($sql)->result() and looping over the returned array.

If you want a row at a time, either you make 30,000 queries, or you select all the results and fetch them one at a time, or you fetch everything and walk over the array. I can't see any way around it.

But the whole set can't be in memory at any given time. I want to select a row, push it off to the API, dispose of that data, and repeat. – Jonah Dec 11 '12 at 18:57

Well, the thing is that result() returns the entire result of the query, while row() simply fetches the first row and discards the rest. Either way, the query itself still fetches all 30,000 rows, regardless of which function you use.

One design that would fit your case would be:

// Resume from the row offset passed in the query string (0 on first load)
$offset = (int)@$_GET['offset'];

// Fetch exactly one row at the current offset
$query = $this->db->query("SELECT * FROM table LIMIT ?, 1", array($offset));
$row = $query->row();

if ($row) {

    /* Run api with values */

    // Reload the page, pointing at the next row
    redirect(current_url().'?offset='.($offset + 1));

}

This would take one row, send it to the API, reload the page, and move on to the next row. It also prevents the page from timing out. It would still most likely take a while with 30,000 records and as many refreshes, though, so you may want to raise LIMIT ?, 1 to something higher than 1 and use result() with a foreach() to make several API calls per page load, as sketched below.
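A batched variant of the same idea might look like this sketch (the batch size of 100 is an arbitrary assumption):

$batch = 100; // arbitrary batch size; tune to taste
$offset = (int)@$_GET['offset'];

// Fetch a whole batch of rows instead of a single one
$query = $this->db->query("SELECT * FROM table LIMIT ?, ?", array($offset, $batch));
$rows = $query->result();

if (count($rows) > 0) {

    foreach ($rows as $row) {
        /* Run api with each row's values */
    }

    // Reload the page, pointing at the next batch
    redirect(current_url().'?offset='.($offset + $batch));

}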

Unfortunately an HTTP redirect would not work in this case, as it's being called through the command line. It doesn't seem like a super-practical solution... I guess I might have to use the database adapter directly? – Jonah Dec 11 '12 at 18:58
You could run it every minute instead and check rows off through an extra MySQL column, so the query becomes WHERE api = 'FALSE' for you. – Robin Castlin Dec 11 '12 at 22:27
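A sketch of that column-based approach, which also works from a cron job instead of a browser redirect (the items table and the api status column are assumptions):

// Grab the next batch of rows that have not been sent to the API yet
$query = $this->db->query("SELECT * FROM items WHERE api = 'FALSE' LIMIT 100");

foreach ($query->result() as $row) {

    /* Run api with values */

    // Check the row off so the next run skips it
    $this->db->query("UPDATE items SET api = 'TRUE' WHERE id = ?",
                     array($row->id));
}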
