
Currently, my script downloads .txt files from an FTP site into a local directory (after checking whether the directory already contains each file) and converts each file into a separate .csv. I then need to import the data into a MySQL database.

Here's the part I'm having trouble with: in the second half of the script (the ballotreader loop), where rows from the file are imported into SQL, the script works and the SQL table populates when the path is hard-coded (i.e. R:\path\filename). When the path is built dynamically (i.e. 'R:\path\'+filename), nothing populates. Any suggestions?

for filename in filenames:
    local_filename = os.path.join('R:\\path', filename)
    if not os.path.isfile(local_filename):
        print 'New file found.'
        # download the new file from the FTP server
        file = open(local_filename, 'wb')
        ftp.retrbinary("RETR " + filename, file.write, 8*1024)
        file.close()
        print 'Downloaded ' + filename + ' file'
        # convert the pipe-delimited .txt into a .csv
        txt_file = filename
        csv_file = filename + ".csv"
        in_txt = csv.reader(open(txt_file, "rb"), delimiter='|')
        outcsv = csv.writer(open(csv_file, 'wb'))
        outcsv.writerows(in_txt)

        # read the .csv back and import rows into MySQL
        with open("R:\\path"+csv_file, 'rb') as csv_input:
            ballotreader = csv.reader(csv_input, delimiter=',', quotechar='|')
            for row in ballotreader:
                cursor = db.cursor()
                if row[1] > 0:
                    # ... (SQL insert elided)
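One aside on the last line above: csv.reader yields strings, so in Python 2 `row[1] > 0` compares a str to an int and is always true. A quick sketch of the conversion the numeric check presumably intends (the sample data is made up for illustration):

```python
import csv
import io

# made-up sample standing in for one downloaded, pipe-delimited file
sample = io.StringIO(u"alice|3\nbob|0\ncarol|12\n")

kept = []
for row in csv.reader(sample, delimiter='|'):
    if int(row[1]) > 0:   # convert before comparing; row values are strings
        kept.append(row)

print(kept)
```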

2 Answers


The problem is probably with the way you construct the path. Try using os.path.join:

import os

path = os.path.join("R:\\path", csv_file)
with open(path, 'rb') as csv_input:
    ...
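To illustrate why this matters, a small sketch (directory and file names are stand-ins, and the example is runnable on any OS): plain `+` concatenation drops the separator entirely, while os.path.join inserts the right one for the platform:

```python
import os

base = "data"          # stand-in for "R:\path"
name = "ballots.csv"   # hypothetical file name

joined = os.path.join(base, name)   # separator inserted: e.g. "data/ballots.csv"
concatenated = base + name          # no separator: "databallots.csv" -- a different path entirely

print(joined)
print(concatenated)
```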

Thanks @btel. That's the solution to my problem. My final code:

    with open(csv_file,'wb') as outcsv:
        out = csv.writer(outcsv, delimiter = '|')
        out.writerows(in_txt)
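For the database half (not shown in the final code), a minimal sketch of loading the CSV rows through a DB-API cursor. The table name `ballots` and its two-column layout are assumptions, and the placeholder style depends on the driver (MySQLdb uses `%s`, sqlite3 uses `?`), so it is passed in; note it also creates one cursor for the whole batch rather than one per row as in the question:

```python
import csv

def load_csv(conn, csv_path, paramstyle='%s'):
    """Insert every CSV row into a hypothetical 'ballots' table via DB-API."""
    sql = "INSERT INTO ballots (name, votes) VALUES ({0}, {0})".format(paramstyle)
    with open(csv_path, 'r', newline='') as f:
        rows = [tuple(row) for row in csv.reader(f)]
    cursor = conn.cursor()         # one cursor for the whole batch
    cursor.executemany(sql, rows)  # parameterized insert of all rows
    conn.commit()
    return len(rows)
```

With MySQLdb this would be called as `load_csv(db, path)`; the same function works against any DB-API-conforming driver given the right placeholder.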
