Err.... How many times per minute are you reading this data from Python?
Because in my system I could read such a file with 20 million records (~400MB) in well under a second.
Unless you are doing this on limited hardware, I'd say you are worrying about nothing.
>>> timeit("all(b.read(20) for x in xrange(0, 20000000,20) ) ", "b=open('data.dat')", number=1)
0.2856929302215576
>>> c = open("data.dat").read()
>>> len(c)
380000172
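For reference, here is a Python 3 sketch of the same benchmark (the session above is Python 2, hence `xrange`). It builds a small synthetic file of fixed-width 20-byte records — the record count is scaled down from the 20 million above so it runs quickly anywhere:

```python
import os
import tempfile
from timeit import timeit

# Assumed parameters: 20-byte records, count scaled down for the sketch.
NUM_RECORDS = 100_000
RECORD_SIZE = 20

# Create a throwaway data file filled with dummy bytes.
path = os.path.join(tempfile.mkdtemp(), "data.dat")
with open(path, "wb") as f:
    f.write(b"x" * (NUM_RECORDS * RECORD_SIZE))

def read_all(path):
    # Read every record sequentially, as in the timeit one-liner above.
    with open(path, "rb") as f:
        return all(f.read(RECORD_SIZE) for _ in range(NUM_RECORDS))

elapsed = timeit(lambda: read_all(path), number=1)
print(f"read {NUM_RECORDS} records in {elapsed:.4f}s")
```

Sequential reads like this hit the OS page cache after the first pass, so repeated timings will be even faster than the first.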