# Break a huge file into smaller pieces

How do I break a large (4+ GB) file into smaller files of about 500 MB each?

And how do I reassemble them again to get the original file back?

2019-05-04 02:48:28

You can also do this with Archive Manager if you prefer a GUI. Look under 'Save -> Other Options -> Split into volumes of'.

2019-05-08 04:01:55

You can use split and cat.

For example, something like:

    $ split --bytes=500M --numeric-suffixes --suffix-length=3 foo foo.

(where the input filename is foo and the last argument is the output prefix). This will create files like foo.000, foo.001, and so on. The same command with short options:

    $ split -b 500M -d -a 3 foo foo.

You can also specify --line-bytes if you want it to split on line boundaries rather than at an exact number of bytes.
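As a concrete, runnable sketch of the split step (scaled down to a small scratch file so it finishes instantly; the name foo and the 300K piece size are example values, not anything the question requires):

```shell
#!/bin/sh
# Demonstrate GNU split on a small scratch file (example names/sizes only).
set -e
cd "$(mktemp -d)"

# Create a 1 MiB test file.
dd if=/dev/zero of=foo bs=1024 count=1024 2>/dev/null

# Same flags as the answer, with a smaller piece size: pieces of at most
# 300 KiB each, with three-digit numeric suffixes (foo.000, foo.001, ...).
split --bytes=300K --numeric-suffixes --suffix-length=3 foo foo.

ls foo.*
```

A 1 MiB file at 300 KiB per piece yields three full pieces plus one partial one (foo.000 through foo.003).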

To reassemble the generated pieces, you can use, for example:

    $ cat foo.* > foo_2

(assuming that the shell sorts the results of glob expansion, and that the number of pieces does not exceed the system-dependent limit on the number of arguments). You can compare the result using:

    $ cmp foo foo_2
    $ echo $?

(which should output 0)
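Putting the split and reassembly steps together, a minimal end-to-end check might look like this (the random test data and scaled-down sizes are assumptions for the demo; foo_2 follows the naming above):

```shell
#!/bin/sh
# Round trip: split a file, reassemble with cat, verify with cmp.
set -e
cd "$(mktemp -d)"

head -c 1000000 /dev/urandom > foo   # ~1 MB of random test data
split -b 300k -d -a 3 foo foo.       # short-option form of the split command
cat foo.* > foo_2                    # glob expands in sorted order
cmp foo foo_2                        # exits non-zero if the files differ
echo "round trip OK"
```

If cmp finds any difference, set -e aborts the script before the final echo, so reaching "round trip OK" confirms the files are byte-identical.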

Alternatively, you can use a combination of find/sort/xargs to reassemble the pieces:

    $ find . -maxdepth 1 -type f -name 'foo.*' | sort | xargs cat > foo_3
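The advantage of this form is that xargs batches the file names, so it keeps working even when there are more pieces than fit on a single command line. A runnable sketch (small sizes assumed so the demo is quick):

```shell
#!/bin/sh
# Reassemble pieces with find/sort/xargs instead of a shell glob.
set -e
cd "$(mktemp -d)"

head -c 500000 /dev/urandom > foo
split -b 100k -d -a 3 foo foo.

# xargs may invoke cat several times; every invocation writes to the
# same open output file, so the pieces still land in sorted order.
find . -maxdepth 1 -type f -name 'foo.*' | sort | xargs cat > foo_3
cmp foo foo_3 && echo OK
```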
2019-05-07 17:53:12