Question

I need to run several tests with a large file in MongoDB, but my test data is in .sql format and is only 2 MB.

I need a way to generate a 2 GB input file, and I can think of two approaches:

  • Generate a 2 GB file from the 2 MB one, using some Linux magic.
  • Repeat the input instructions 1000 times.

I don't know which OS MongoDB will run on, so a multiplatform solution is preferred. Which approach is better?

Thanks.


Solution

The following script reads inputfile.txt and appends it to outputfile.txt until the output file reaches a size of 2 GB.

#!/bin/bash

filename_in="inputfile.txt"
filename_out="outputfile.txt"
size="2000000000"       # target size: 2 GB

# Start from an empty output file.
: > "$filename_out"

# Append the input file until the output reaches the target size.
# Note: stat -c%s is GNU coreutils syntax and works on Linux.
while [ "$(stat -c%s "$filename_out")" -lt "$size" ]; do
    echo "file size = $(stat -c%s "$filename_out")"
    cat "$filename_in" >> "$filename_out"
done
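
The stat -c%s call above is GNU coreutils syntax, so the script is Linux-specific. Since the question asks for something multiplatform, below is a sketch of a more portable variant of the same loop; it uses the POSIX wc -c to measure the file size and assumes the same inputfile.txt / outputfile.txt names, so it should behave the same on Linux, macOS and other Unix-like systems.

#!/bin/sh

filename_in="inputfile.txt"
filename_out="outputfile.txt"
size="2000000000"       # target size: 2 GB

# Start from an empty output file.
: > "$filename_out"

# wc -c prints the byte count and is specified by POSIX.
# The command substitution is left unquoted so that any padding
# wc adds around the number is stripped by word splitting.
while [ $(wc -c < "$filename_out") -lt "$size" ]; do
    cat "$filename_in" >> "$filename_out"
done

With a 2 MB input, either loop appends roughly 1000 copies before stopping, and the output ends up slightly over 2 GB because the size check happens before each append rather than after.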