Unix & Linux Stack Exchange is a question and answer site for users of Linux, FreeBSD and other Un*x-like operating systems. It's 100% free, no registration required.

I need to do some kind of incremental import, again and again, from a SQLite 3 database to a MySQL database. Column names are the same in both databases. Is there a good way to do this with a script (i.e., is there a good way to get the column values of one row as some kind of array/list, so I can really work with them), or am I better off with a small C program?

Update: OK, there is the -line option, which gives one column value per line; result rows are then separated by an empty line. Still need to find a really simple way to parse this ;-)
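For reference, and assuming a hypothetical table `users` with columns `id` and `name`, the -line output looks like this (one `column = value` pair per line, rows separated by a blank line):

```
sqlite> select * from users;
  id = 1
name = alice

  id = 2
name = bob
```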

Should be quite easy with a high-level scripting language like Perl, Python, or Ruby. – Johan Sep 9 '12 at 15:12
This question is more suitable for Stack Overflow. – amphibient Nov 1 '12 at 14:21

2 Answers

Your problem is not exactly simple, so I don't think there is a really simple solution.

Most probably you will need to modify this, but in the hope that it helps you get started, here's a simple Perl script that converts the output of sqlite3 -line into INSERT statements:

chomp($_);
if (/^\s*(\w+) = (.*)/) {            # a "column = value" line
    $map{$1} = $2;
} elsif (%map) {                      # an empty line marks the end of a row
    my $keys   = join(", ", keys %map);
    my $values = join(", ", map { s/'/\\'/g; "'$_'"; } values %map);
    printf "insert into TABLE (%s) values (%s);\n", $keys, $values;
    %map = ();                        # reset for the next row
}

Save the above Perl script in a file named parse.pl, and pipe the output of sqlite3 -line to it like this:

echo 'select * from TABLE;' | sqlite3 -line sqlite3.db | perl -n parse.pl

The result is SQL INSERT statements, which can be piped to mysql after you have carefully checked that they look correct. The script builds the column names and values dynamically; you may want to adjust that, especially the values, so that non-string values are not quoted.

WARNING! Do not use this script "as is"! Automating inserts from one database to another is NOT a trivial task, and if not done carefully it can have all kinds of side effects depending on the type of data you have. For example, I haven't tested the above script with multi-line data or binary-valued columns.

Another solution is to save the output of sqlite3 in a file, and import that into MySQL using the LOAD DATA statement. Look up the syntax and the details in the MySQL documentation; it has many interesting options. If I remember correctly, it has a REPLACE option, which I have used on a few occasions for incremental imports similar to your need.
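As a rough sketch of that approach (the file name, table name, and separator here are assumptions, not part of the question): export the table as tab-separated text with something like `sqlite3 -list -separator "$(printf '\t')" sqlite3.db 'select * from TABLE;' > dump.tsv`, then load it with:

```sql
-- Sketch only: table and file names are assumptions. REPLACE overwrites
-- rows whose primary/unique key already exists, which is what makes this
-- usable for repeated incremental imports.
LOAD DATA LOCAL INFILE 'dump.tsv'
REPLACE INTO TABLE my_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
```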

I don't understand where the argument ($_) is coming from. It looks like you're missing a lot of parsing and quoting, what if a value contains a space or ' or other special character? – Gilles Sep 9 '12 at 21:18
When perl is invoked with the -n flag, the program will be wrapped in a while loop where $_ takes on the value of each line coming from stdin. I tested the example before posting, give it a try! You are right about escaping special characters. The script is meant as a proof of concept to begin with, and the basis for any necessary tuning for the actual use case. – janos Sep 10 '12 at 7:27

I've written such a thing in PHP using adapters. One thing we did was create a sys_parsed flag to mark what has been imported (or what failed): basically, keep track so you don't end up retrying a broken import forever.

It parses thousands of SQLite files and imports them into a single MySQL database with one table per customer. But it's definitely not a 15-liner; there's a lot more to it if you want reliable imports. It's quite performant too, without much load.
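The tracking idea can be as simple as a status column on a bookkeeping table listing the source files (this schema is hypothetical, just to illustrate the flag):

```sql
-- Hypothetical bookkeeping table: one row per sqlite file to import.
-- sys_parsed: NULL = pending, 1 = imported, -1 = failed (do not retry).
CREATE TABLE import_queue (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    file_path  VARCHAR(255) NOT NULL,
    sys_parsed TINYINT NULL,
    parsed_at  DATETIME NULL
);
```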

If it interests you, I could take a look and see whether I can share it (it was written for a customer).

