Fastest way to write huge data in file

Two major reasons for the observed "slowness":

  • Your while loop is slow: it runs about a million iterations.
  • You do not make proper use of I/O buffering, so you issue far too many system calls. Currently, you are calling write() about one million times (see the sketch below).
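
For illustration, a loop of the shape described above probably looks roughly like this (a sketch of the pattern only, not your code verbatim; file name and count are placeholders):

import random
import string

f = open("bla.txt", "wb")
count = 0
# One tiny write() per character: about a million calls into the OS,
# each paying system-call overhead instead of using one buffered write.
while count < 10**6:
    f.write(random.choice(string.ascii_lowercase))
    count += 1
f.close()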

Create your data in a Python data structure first and call write() only once.

This is faster:

import random, string, time

t0 = time.time()
# Build the whole string in memory first, then write it out with a single call.
open("bla.txt", "wb").write(''.join(random.choice(string.ascii_lowercase) for i in xrange(10**7)))
d = time.time() - t0
print "duration: %.2f s." % d

Output: duration: 7.30 s.

Now the program spends most of its time generating the data, i.e. in the calls into the random module. You can easily see that by replacing random.choice(string.ascii_lowercase) with e.g. "a": the measured time then drops to below one second on my machine.
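
As a concrete illustration, here is the same measurement with the random call swapped out (same structure as the snippet above; the exact numbers will of course vary by machine):

import time

t0 = time.time()
# Same join-then-write approach, but with a constant character; this
# isolates the cost of data generation from the cost of the write.
open("bla.txt", "wb").write(''.join("a" for i in xrange(10**7)))
d = time.time() - t0
print "duration: %.2f s." % d   # well below one second here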

And if you want to get even closer to seeing how fast your machine can really write to disk, use what is probably Python's fastest way to generate a largish chunk of data first and then write it out:

>>> import time
>>> t0 = time.time(); chunk = "a"*10**7; open("bla.txt", "wb").write(chunk); d = time.time() - t0; print "duration: %.2f s." % d
duration: 0.02 s.
