
I have an array with 30000 plus entries that need to go into a MySQL table.

What is the best practice here? Let's say [0], [1] and [2] map to the database columns 'title', 'type' and 'customer'.

Should I add key names matching the column names in the table and call some "magic" function, or should I build the query manually?

Array
(
    [0] => Array
        (
            [0] => 2140395946
            [1] => 1SAP
            [2] => 0041451463
        )

    [1] => Array
        (
            [0] => 2140411607
            [1] => 2SAP
            [2] => 0041411940
        )

    [2] => Array
        (
            [0] => 2140706194
            [1] => 4SAP
            [2] => 0041411943
        )
etc. etc.

UPDATE - based on answers

Thanks for the answers.

The usual solution is to build the SQL string manually, so that all rows can be inserted in one query:

INSERT INTO `tx_opengate_table` (`machine`, `customer`, `type`)
VALUES
  ('m123', 'dfkj45', 'A'),
  ('m137', 'kfkj49', 'A'),  -- repeat this line for each entry in the array
  ... ... ...
  ('m654321', '34dgf456', 'C4')  -- no comma after the last row
;

Special for TYPO3

I happen to be doing this in the TYPO3 CMS and just came across a function that was added not long ago:

// Insert new rows
$table  = 'tx_opengate_stuff';
$fields = array('machine', 'type', 'customer');
$lines  = $array; // the array as given above
$GLOBALS['TYPO3_DB']->exec_INSERTmultipleRows($table, $fields, $lines);

4 Answers

Accepted answer (score 3)

I would say just build it yourself. You can set it up like this:

$query = "INSERT INTO x (a,b,c) VALUES ";
foreach ($arr as $item) {
    $query .= "('" . $item[0] . "','" . $item[1] . "','" . $item[2] . "'),";
}
$query = rtrim($query, ","); // remove the trailing comma
// execute the query

Don't forget to escape quotes if it's necessary.

Also, be careful that there's not too much data being sent at once. You may have to execute it in chunks instead of all at once.
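The chunking advice can be sketched as follows. This is a minimal sketch, not the answerer's own code: `buildInsertQueries` is a hypothetical helper, and the escaping callback is assumed to come from an open connection (e.g. `mysqli::real_escape_string`).

```php
<?php
// Build one multi-row INSERT statement per chunk so that no single
// query grows too large for the server's buffer.
function buildInsertQueries(array $rows, int $chunkSize, callable $escape): array
{
    $queries = array();
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $values = array();
        foreach ($chunk as $row) {
            // Escape every field, then wrap the row in ('a','b','c') form.
            $escaped  = array_map($escape, $row);
            $values[] = "('" . implode("','", $escaped) . "')";
        }
        $queries[] = "INSERT INTO `tx_opengate_table` (`machine`,`customer`,`type`) VALUES "
                   . implode(',', $values);
    }
    return $queries;
}
```

Each returned string can then be passed to `$mysqli->query()` in turn; with a chunk size of 1000, the 30000-row array becomes roughly 30 queries instead of one enormous statement.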

    
Super. That is it. I guess it just should be $item[0], $item[1] and $item[2], and not the $arr array, that goes into the $query string. –  Tillebeck Oct 28 '10 at 7:02
    
Yeah, sorry. I fixed it. –  Richard Marskell - Drackir Oct 28 '10 at 15:54
    
It's always necessary, as you stated in the recent comment :) –  Your Common Sense Jan 27 '11 at 18:18
    
@Col. Shrapnel: Yeah, I meant if they hadn't done it already. :) I don't know how the $arr is populated. –  Richard Marskell - Drackir Jan 27 '11 at 19:56

Magic function? I'm guessing you mean some sort of DB abstraction layer? IMHO that would just double your work.

Just build the query manually, looping through the array and INSERTing values as you go.

$statement = "INSERT INTO table (title, type, customer) VALUES ";
$values = array();
foreach ($data as $row) {
    $values[] = '("' . implode('","', $row) . '")';
}
$statement .= implode(',', $values); // comma-separate the value groups

UPDATE: Changed explode to implode (I always get those confused).

    
Great. Something like this. I guess you would use "implode" to concatenate the $row array into a string, right? Otherwise I misunderstood ;-) –  Tillebeck Oct 28 '10 at 6:58
    
I find this one the best answer IF you change explode to implode... –  Daniel Hunter Mar 25 '12 at 23:25

You will have to build the query manually if you want the best performance for this operation. If you insert each row iteratively using PDO or some abstraction layer, you will end up with 30000+ insert queries.

Use foreach to iterate over the array, build one nested INSERT query that does all the work at once, and just send it to the server.
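The same single-query idea can be combined with bound parameters instead of string concatenation. This is a sketch under assumptions: `buildPlaceholders` is a hypothetical helper (not part of any library), and `$pdo` is assumed to be an open PDO connection.

```php
<?php
// Build the "(?,?,?),(?,?,?),..." placeholder list for a multi-row INSERT.
function buildPlaceholders(int $rowCount, int $colCount): string
{
    $row = '(' . implode(',', array_fill(0, $colCount, '?')) . ')';
    return implode(',', array_fill(0, $rowCount, $row));
}

// Usage sketch ($pdo and $rows assumed to exist):
// $sql  = "INSERT INTO `tx_opengate_table` (`machine`,`customer`,`type`) VALUES "
//       . buildPlaceholders(count($rows), 3);
// $flat = call_user_func_array('array_merge', $rows); // flatten [[a,b,c],...] into one list
// $pdo->prepare($sql)->execute($flat);
```

The driver then handles escaping, at the cost of one bound parameter per field, so very large arrays may still need to be split into chunks to stay under the server's placeholder and packet limits.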

    
Doing extended inserts here is a bad idea. Ever seen what happens when you fill up the MySQL buffer? Perhaps doing extended inserts for 10 at a time isn't a bad idea, but this depends on data length. –  Brad Oct 26 '10 at 21:21
    
You should always be aware of the buffer size while doing a bulk insert in MySQL. That doesn't make it a bad idea. I never had this issue doing it with 100 - 1000 records a time without setting it. You can configure the buffer's length by setting the bulk_insert_buffer_size value. It defaults to 8388608, so whether you are going to have a problem really depends on the sizes of the field values. Clearly, when I tried to insert 1000 rows at once, my average length of the values for one row was less than 830 chars. By looking at the question, I don't expect one line to be longer than 40 chars. –  Pelle ten Cate Oct 27 '10 at 8:23
