Tuesday, October 20, 2015

git commit error due to large files

I couldn't commit, apparently because something was cached. I found this workaround:
git stash save --keep-index
git filter-branch -f --index-filter 'git rm --cached --ignore-unmatch  path2file' 
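The two commands can be sketched end to end in a throwaway repository. This is only an illustration, not the original setup: "big.bin" stands in for the post's "path2file", and the commit message is made up.

```shell
# Minimal sketch: remove a large file from history in a scratch repo.
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1   # newer git warns before filter-branch
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email you@example.com
git config user.name you

echo payload > big.bin
git add big.bin
git commit -qm "accidentally commit a large file"

# Stash any uncommitted work, keeping what is already staged.
git stash save --keep-index || true   # nothing to stash in this sketch

# Rewrite history so no commit tracks the file any more.
git filter-branch -f --index-filter \
    'git rm --cached --ignore-unmatch big.bin' -- --all

git log --oneline -- big.bin   # prints nothing: no commit touches big.bin
```

Note that filter-branch rewrites commit hashes, so a shared branch needs a force push afterwards.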

Friday, October 16, 2015

Recently I had an issue updating my log file and CSV file from Python while working on a cluster. Even though I waited a long time, the files were only updated once the script finished. To fix this I researched what other people have done: by flushing every so often, the files update while the script is still running. This solution worked for me; hopefully it will work for you too.


import csv
import sys

csvfile = open('results/test.csv', 'wb')
csvout = csv.writer(csvfile, delimiter=',', quotechar='"')
sys.stdout = open('results/log.txt', 'w')  # redirect prints to the log file

for it in xrange(max_iters):
    # rest of the program
    csvout.writerow(['test1', 'test2'])
    csvfile.flush()     # push buffered CSV rows to disk
    sys.stdout.flush()  # push buffered log output to disk
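A related approach, sketched here in Python 3 syntax (the snippet above is Python 2), is to open the files line-buffered so each completed line reaches disk without explicit flush() calls. The file names and the loop count are placeholders.

```python
import csv
import os

os.makedirs('results', exist_ok=True)

# buffering=1 gives line buffering for text files: every finished
# line is written through to disk as soon as it is completed.
with open('results/test.csv', 'w', newline='', buffering=1) as csvfile, \
     open('results/log.txt', 'w', buffering=1) as log:
    csvout = csv.writer(csvfile, delimiter=',', quotechar='"')
    for it in range(3):  # placeholder for the real iteration count
        csvout.writerow(['test1', 'test2'])
        print('iteration', it, file=log)
```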