I'm using:
- A local Postgres DB
- Laravel 5 with a MySQL DB, also local
- The Postgres DB contains the table structure from OTRS, with over 91,000 rows in the "tickets" table
I needed to move the data from one DB to the other. What I did was build one big query that gathered all the data I needed and run it from a PHP script I created inside the Laravel project, then iterate over the results with Eloquent to insert them into MySQL. I know it's terrible: I had to remove PHP's memory and time limits to get it to finish, and although it took a really long time, it worked!
Now I've realized I missed something: I need to rerun a similar query (it returns the same number of rows), but this time just to fill in one extra field on the MySQL side.
My question is: how can I optimize this process? I'm thinking of using chunks, but I don't know how to do that.
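Something like this is what I have in mind (a rough sketch only: the `pgsql` connection name in `config/database.php`, the `ticket_id` match column and the `new_field` column are placeholders, not necessarily what I actually have):

```php
<?php

use Illuminate\Support\Facades\DB;

// Read the Postgres tickets in chunks of 500 instead of loading all ~91k rows,
// and copy just the one missing field into the matching MySQL row.
DB::connection('pgsql')
    ->table('tickets')
    ->orderBy('id')                 // chunk() needs a stable ordering
    ->chunk(500, function ($rows) {
        foreach ($rows as $row) {
            DB::connection('mysql')
                ->table('tickets')
                ->where('ticket_id', $row->id)                 // placeholder match column
                ->update(['new_field' => $row->new_field]);    // placeholder column names
        }
    });
```

Updating row by row is still ~91k queries, so I could also batch the updates inside each chunk if that turns out too slow, but at least the memory use should stay flat.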
To clarify:
- MySQL's tickets table contains 91,397 rows and 5 columns
- The Postgres tickets table also has 91,397 rows, but 6 columns
- I created a migration in Laravel (MySQL) that added the extra column, though it's still empty (a rough sketch of it follows this list)
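For reference, the migration is roughly this (the column name `new_field` and the `string` type are just placeholders here):

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class AddNewFieldToTicketsTable extends Migration
{
    public function up()
    {
        Schema::table('tickets', function (Blueprint $table) {
            // Extra column to be filled from the Postgres tickets table;
            // nullable because it starts out empty on the MySQL side.
            $table->string('new_field')->nullable();
        });
    }

    public function down()
    {
        Schema::table('tickets', function (Blueprint $table) {
            $table->dropColumn('new_field');
        });
    }
}
```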
It's probably easier if I show you the code I have.