I'm working with a remote database to import data into my Django project's database.
With the help of MySQLdb,
I've easily managed to create an import function like the following:
import MySQLdb
import MySQLdb.cursors

def connect_and_get_data(useful_string):
    connection = MySQLdb.connect(host=..., port=...,
                                 user=..., passwd=..., db=...,
                                 cursorclass=MySQLdb.cursors.DictCursor,
                                 charset="utf8")
    cursor = connection.cursor()
    # parameters go in a tuple; the driver handles the escaping
    cursor.execute("SELECT ... FROM ... WHERE ... AND some_field=%s",
                   (useful_string,))
    result = cursor.fetchall()
    cursor.close()
    connection.close()
    return result
I'm very happy with that; it works as expected.
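For example, a call looks like this (the value and the column name here are just placeholders):

rows = connect_and_get_data("some_value")
for row in rows:
    print(row["some_field"])  # DictCursor returns each row as a dict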
But moving on with the code, I noticed that I would sometimes need to connect to the database again in order to execute other queries.
My first idea seemed quite logical to me:
for every query I need, define a function that calls connect_and_get_data
with the given query as a parameter... something like this:
def get_data_about_first_amazing_topic(useful_string):
    query = "SELECT ... FROM ... WHERE ... AND some_field=%s" % (useful_string)
    connect_and_get_data(query)
    ...

def get_data_about_second_amazing_topic(other_useful_string):
    query = "SELECT ... FROM ... WHERE ... AND some_field=%s" % (other_useful_string)
    connect_and_get_data(query)
    ...
with these modifications to connect_and_get_data:
def connect_and_get_data(query):
    ...
    cursor.execute(query)
    ...
As you might already imagine, this solution fails.
Reading mluebke's answer to the question python mysql fetch query
("You are passing arguments to the execute function, not doing python string substitution"),
I immediately understood where I had gone wrong; but I still feel that something is missing. I've tried different solutions, and I'm definitely unhappy with all of them.
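For example, one of the variants I tried passes the query template and its parameters separately, so that execute still does the escaping (a rough sketch, reusing the placeholder connection values from above):

def connect_and_get_data(query, params):
    connection = MySQLdb.connect(host=..., port=..., user=..., passwd=..., db=...,
                                 cursorclass=MySQLdb.cursors.DictCursor,
                                 charset="utf8")
    cursor = connection.cursor()
    cursor.execute(query, params)  # the driver escapes params, not Python's %
    result = cursor.fetchall()
    cursor.close()
    connection.close()
    return result

def get_data_about_first_amazing_topic(useful_string):
    query = "SELECT ... FROM ... WHERE ... AND some_field=%s"
    return connect_and_get_data(query, (useful_string,))

This works, but every variation like it still feels clumsy, which is why I suspect I'm missing a cleaner abstraction.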
Is there a "good" way to encapsulate my connect_and_get_data(query)
function, in order to serve me the way I want, or I'm totally in the wrong path with this?
Which ones are considered "best practices" in this situation?