I'm passing a large dataset into a MySQL table via PHP using INSERT commands, and I'm wondering if it's possible to insert approximately 1000 rows at a time via a query other than appending each value to the end of a mile-long string and then executing it. I am using the CodeIgniter framework, so its functions are also available to me.
Assembling one INSERT statement with multiple rows is much faster in MySQL than one INSERT statement per row.

That said, it sounds like you might be running into string-handling problems in PHP, which is really an algorithm problem, not a language one. Basically, when working with large strings, you want to minimize unnecessary copying. Primarily, this means you want to avoid concatenation. The fastest and most memory-efficient way to build a large string, such as for inserting hundreds of rows at once, is to take advantage of the implode() function and array assignment.
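A sketch of this pattern (the table and column names `items`, `text`, and `category_id` are hypothetical, and addslashes() stands in for proper escaping such as mysqli_real_escape_string()):

```php
<?php
$data = [
    ['text' => 'first row',  'category_id' => 1],
    ['text' => 'second row', 'category_id' => 2],
];

// Collect one "(value, value)" group per row in an array...
$values = [];
foreach ($data as $row) {
    // addslashes() is only a stand-in; prefer mysqli_real_escape_string()
    $values[] = "('" . addslashes($row['text']) . "', " . (int) $row['category_id'] . ")";
}

// ...then join them with a single implode() call instead of
// repeatedly concatenating onto one ever-growing string.
$sql = 'INSERT INTO items (text, category_id) VALUES ' . implode(', ', $values);
echo $sql;
```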
The advantage of this approach is that you don't copy and re-copy the SQL statement you've so far assembled with each concatenation; instead, PHP does this once in the implode() statement.

If you have lots of columns to put together, and one or more are very long, you could also build an inner loop to do the same thing and use implode() to assign the values clause to the outer array.
Well, you don't want to execute 1000 query calls, but doing this is fine:
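For example, a minimal sketch assuming a hypothetical `my_table` with `id` and `name` columns:

```php
<?php
$rows = [
    [1, 'alpha'],
    [2, 'beta'],
    [3, 'gamma'],
];

// Build one multi-row INSERT instead of issuing one query per row.
$query = 'INSERT INTO my_table (id, name) VALUES ';
foreach ($rows as [$id, $name]) {
    // addslashes() is only a stand-in; escape properly in real code
    $query .= '(' . (int) $id . ", '" . addslashes($name) . "'),";
}
$query = rtrim($query, ','); // drop the trailing comma
echo $query;
```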
Depending on your data source, populating the array might be as easy as opening a file and dumping the contents into an array via file().
You could prepare the query for inserting one row using the mysqli_stmt class, and then iterate over the array of data. Something like:
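A sketch of that, assuming a hypothetical table `entries` whose columns match the bound types (connection credentials are placeholders, and this needs a live MySQL server to run):

```php
<?php
// Hypothetical connection details -- replace with your own.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// Prepare the single-row INSERT once...
$stmt = $mysqli->prepare(
    'INSERT INTO entries (id, score, name, data) VALUES (?, ?, ?, ?)'
);

// ...then re-execute it for each row of data.
foreach ($rows as $row) {
    // 'idsb' = int, double, string, blob -- one letter per placeholder
    $stmt->bind_param('idsb', $row['id'], $row['score'], $row['name'], $row['data']);
    $stmt->execute();
}
$stmt->close();
```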
Here 'idsb' describes the types of the data you're binding (int, double, string, blob), one character per placeholder.
You could always use MySQL's LOAD DATA INFILE statement to do bulk inserts rather than using a bunch of INSERT statements.
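For instance, a sketch issuing the statement from PHP (the file path, table name, and credentials are hypothetical; LOCAL also requires local-infile to be enabled on both client and server):

```php
<?php
// Hypothetical connection details -- replace with your own.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// Load a tab-delimited file into the table in one statement,
// which is typically much faster than many individual INSERTs.
$mysqli->query(
    "LOAD DATA LOCAL INFILE '/tmp/rows.tsv'
     INTO TABLE my_table
     FIELDS TERMINATED BY '\t'
     LINES TERMINATED BY '\n'
     (id, name)"
);
```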
Hi, bulk insert always uses all columns, so in this case you must edit your CSV or Excel file accordingly. Greetings, Andreas