

You may want to make sure your archive isn't corrupted. For this, we can use the -t flag with the unzip command. This option extracts each specified file in memory and compares the CRC (cyclic redundancy check, an enhanced checksum) of the expanded file with the original's stored CRC value.

You can also prevent specific files (or file types) from getting extracted; with unzip's -x option, the command below will unzip all files except excludedFile.txt.
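A minimal sketch of both unzip invocations, assuming a hypothetical archive named backups.zip:

unzip -t backups.zip                    # expand each file in memory and verify its CRC
unzip backups.zip -x excludedFile.txt   # extract everything except excludedFile.txt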
#Linux unzip gz archive#
A related case is a .tgz file, which you can unpack using the terminal into a specific folder (directory). To unpack and put the files in a different folder, say /tmp/data, enter: tar zxvf backups.tgz -C /tmp/data

For plain .gz files scattered through a directory tree, if you want, for each of those, to launch "gzip -d" on them: cd theparentdir && gzip -d $(find . -type f -name '*.gz'). And then, to gzip them back, run gzip over the same files (now without their .gz suffix) with a similar find, as sketched below. This works, but the command substitution will choke if filenames have some special characters (spaces, tabs, newline, etc.) in them.
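A sketch of the re-gzip direction; the negated -name test is an assumption that every regular file under theparentdir should end up compressed again:

cd theparentdir && gzip $(find . -type f ! -name '*.gz')   # recompress files that are not already gzipped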

Another (arguably better?) solution, if you have GNU find at your disposal: cd theparentdir && find . -type f -name '*.gz' -print0 | xargs -0 gzip -d. Because -print0 and xargs -0 pass NUL-separated filenames, this variant copes even with spaces, tabs, and newlines in the names.
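The same NUL-separated pipeline can handle the re-gzip direction; as before, the ! -name '*.gz' test is an assumption about which files should be recompressed:

cd theparentdir && find . -type f ! -name '*.gz' -print0 | xargs -0 gzip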
