My Slicehost machine ran out of disk space yesterday, so I was looking for logs to rotate, files to delete, etc., and I ran across a few large groups of files that I wanted to keep. I tried gzip -9 like always, but then found myself wondering if I could get these files any smaller. A quick Google search turned me on to rzip. But as it turns out, rzip is extremely memory intensive — it gets its extra compression by searching for matches across a window of hundreds of megabytes, rather than gzip's 32 KB — and it was completely consuming everything I had on this poor slice.
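For comparison, the baseline gzip pass is easy to measure locally. Here's a minimal sketch (the filename `sample.log` is hypothetical) that builds a repetitive log-like file and checks how much `gzip -9` saves:

```shell
# Build a log-like file with lots of repetition (hypothetical name).
for i in $(seq 1 2000); do
  echo "127.0.0.1 - - [01/Apr/2009] \"GET /index.html HTTP/1.1\" 200" >> sample.log
done

# Compress with maximum effort; -c writes to stdout so the original
# sticks around for a size comparison.
gzip -9 -c sample.log > sample.log.gz

echo "original: $(wc -c < sample.log) bytes"
echo "gzipped:  $(wc -c < sample.log.gz) bytes"
```

On highly repetitive data like this, gzip already does well; rzip's long-range matching mostly pays off when the redundancy is spread far apart in a large file.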
I could go through all of the files trying to decide which ones I really wanted, but that seemed like it would take too long. In the end, I fired up an EC2 instance on Amazon Web Services, moved the files over, rzipped them there, moved the results back, and reclaimed about a gig of disk space. It only took a few minutes, and cost me $0.14 from Amazon.
That's just crazy. I wonder how long we have before that just becomes common practice.