Question

I need to split a large text file (around 10 GB) into multiple text files (roughly 1 GB each), and later join those same files back into one file.


Solution

If you have the split command, try this:

Example:

split -b1024 your_large_file.txt sample_prefix

It will split the large file into a series of pieces of 1024 bytes each, named with the given prefix.
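Since you want pieces of roughly 1 GB, GNU split also accepts a size suffix on -b; a minimal sketch using the same placeholder names as above:

split -b 1G your_large_file.txt sample_prefix

Each piece will be 1 GiB except possibly the last, and the pieces are named sample_prefixaa, sample_prefixab, and so on, in order.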

Join:

cat sample_prefixaa sample_prefixab sample_prefixac > final_org_largefile.txt

It will concatenate the contents of the split files, in order, and produce a single file.
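Because split names the pieces in lexical order, you can also join them with a shell glob instead of listing each piece, and then check that the rejoined file matches the original; a sketch assuming the same prefix and GNU coreutils:

cat sample_prefix* > final_org_largefile.txt
md5sum your_large_file.txt final_org_largefile.txt

If the two checksums printed by md5sum match, the join reproduced the original file exactly.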

Note: Linux has the split command, but I'm not sure whether GNUwin32 provides it.
