Yesterday I had to upload a huge file (a 1.4GB tar.gz archive) to my web server and uncompress it once it was there. Even my new fiber optic (FiOS) internet connection couldn’t handle a file that big (the upload dropped out at around 900MB). So I figured I would split the file into manageable chunks, upload the pieces to the server, join them back together, then uncompress the archive. Since I’m no Unix shell guru, I googled “splitting and merging binary files unix”; nothing useful came up, and tweaking the search terms didn’t help either.
Using ManOpen I found the
split utility, which looked like exactly what I needed. I split the huge tar.gz file into 100MB chunks with the following command:
[code lang="Bash"]split -b 100m bigarchive.tar.gz[/code]
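By default, split names the pieces xaa, xab, xac, and so on. If you want something more recognizable, you can also pass a prefix as a second argument (the prefix below is just an illustration; any string works):

[code lang="Bash"]
# Default names: xaa, xab, xac, ...
split -b 100m bigarchive.tar.gz

# Same thing, but with a custom prefix:
# bigarchive.tar.gz.part-aa, bigarchive.tar.gz.part-ab, ...
split -b 100m bigarchive.tar.gz bigarchive.tar.gz.part-
[/code]

A prefix makes it obvious later which chunks belong to which archive, which matters if the upload directory isn’t empty.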
This worked perfectly, so I went looking for a
join command to merge the files back together once they were on the server. There is indeed a join command, but its description is “relational database operator”; not exactly what I was looking for. After a little searching and experimentation I learned that the
cat command doesn’t just work with text files; it works with binary files too. So I used the following command to join the chunks and uncompress the resulting archive (I was able to use * instead of listing the chunk filenames since there weren’t any files in the directory besides the archive chunks):
[code lang="Bash"]cat * | gunzip | tar xf -[/code]
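If you’d rather be cautious before deleting the chunks, one sanity check is to compare checksums of the original archive and the reassembled one. A sketch, assuming the default xaa, xab, … chunk names from split (md5sum is the GNU coreutils tool; on macOS the equivalent command is md5):

[code lang="Bash"]
# Local machine: record a checksum before uploading the chunks
md5sum bigarchive.tar.gz

# Server: join the chunks into a file instead of piping straight to gunzip
cat x* > bigarchive.tar.gz

# Server: this hash should match the one from the local machine
md5sum bigarchive.tar.gz
[/code]

Writing the joined file to disk costs some space compared with piping cat straight into gunzip, but it lets you verify the transfer before unpacking.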
It actually ended up being a pretty simple operation, but it wasn’t documented anywhere on the web!