Recently I ran into an issue with updating my log file and CSV file from Python while working on a cluster. No matter how long I waited, the files were only updated once the script finished. To fix this I looked into what other people have done: by flushing every so often, the files are updated as the script runs. This solution worked for me; hopefully it will work for you too.
import csv
import sys

max_iters = 100  # placeholder: however many iterations your job runs

csvfile = open('results/test.csv', 'w', newline='')
csvout = csv.writer(csvfile, delimiter=',', quotechar='"')
sys.stdout = open('results/log.txt', 'w')  # redirect prints to the log file

for it in range(max_iters):
    # rest of the program
    csvout.writerow(['test1', 'test2'])
    csvfile.flush()     # push buffered csv rows out to the file
    sys.stdout.flush()  # likewise for anything printed to the log
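An alternative to calling flush() by hand is to open the file line-buffered (buffering=1), so Python flushes automatically whenever a complete line is written. Here is a minimal sketch of that approach; the temporary path is just for illustration, you would use your own results path:

```python
import csv
import os
import tempfile

# Line-buffered text file: every write containing a newline is flushed
# to the OS immediately, so no explicit flush() calls are needed.
path = os.path.join(tempfile.mkdtemp(), 'test.csv')
csvfile = open(path, 'w', newline='', buffering=1)
csvout = csv.writer(csvfile, delimiter=',', quotechar='"')

for it in range(3):
    # rest of the program
    csvout.writerow(['test1', 'test2'])
    # each row ends in a newline, which triggers a flush on its own

# A process tailing the file would already see all three rows here.
csvfile.close()
```

Note that flush() (and line buffering) only hands the data to the operating system; on a networked cluster filesystem you may additionally need os.fsync(csvfile.fileno()) after the flush to force the data to disk.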