Lzip vs xz

Nothing will happen to lzip within such a short timeframe. Although there is nothing that will happen in a mere 20 years, let alone 10, that will prevent you from running lzip, you are correct that, as there seems to be only a single developer, it may not get as much support in the future. Although it will still run just fine, there is a chance that a terrible security vulnerability may be found in the lzip tools and leveraged by attackers on the poor souls who use lzip. But considering that those users may not be the most numerous, I think it would be a poor target. And as you will be decompressing your own files, not files you got from a shady part of the internet, you would probably avoid that anyway.

Lzip claims the ability to recover from damage, which is why I liked it, but I would have to put that to the test myself. There are also plenty of ways to recover from damage and still use xz. I have never seen a tar.lz file myself, while tar.xz is seen widely, and 7-zip can handle xz but not lzip, even though it's basically the same algorithm. So I prefer xz: much as I like lzip's claims, xz seems to have gotten a better footing in general use.

Note the difference in archive layout: a zip archive is a collection of compressed files, whereas a gzipped tar is a compressed collection (of uncompressed files). These days I only use gzip for text, or when I want to be very quick or compatible with tools like 7-zip on Windows, although I tend to go for bzip2 over gzip anyway.

Of the four compressors tested here, xz is the only one alien to the Unix concept of "doing one thing and doing it well": it has a complex format, partially specialized in the compression of executables and designed to be extended by proprietary formats. OTOH, encrypting with another tool makes for four-plus commands (including key management and key storage), and four is larger than two. That either calls for a wrapper, or I personally can't be bothered to use that pipeline.
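The zip-versus-gzipped-tar distinction above can be made concrete with Python's standard library. This is just a sketch with made-up file names and contents: a zip member can be read on its own, while a member of a tar.gz sits inside one compressed stream.

```python
import io
import tarfile
import zipfile

data = {"a.txt": b"alpha " * 200, "b.txt": b"beta " * 200}

# A zip archive compresses each member independently, so one file
# can be decompressed without touching the rest of the archive.
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, payload in data.items():
        zf.writestr(name, payload)
with zipfile.ZipFile(zbuf) as zf:
    from_zip = zf.read("b.txt")  # decompresses only this member

# A gzipped tar is one tar stream compressed as a whole, so reading
# a late member means decompressing everything that precedes it.
tbuf = io.BytesIO()
with tarfile.open(fileobj=tbuf, mode="w:gz") as tf:
    for name, payload in data.items():
        info = tarfile.TarInfo(name)
        info.size = len(payload)
        tf.addfile(info, io.BytesIO(payload))
tbuf.seek(0)
with tarfile.open(fileobj=tbuf, mode="r:gz") as tf:
    from_tar = tf.extractfile("b.txt").read()

assert from_zip == from_tar == data["b.txt"]
```

This layout difference is also why damage behaves differently: a corrupted zip member usually leaves the other members readable, while damage early in a tar.gz stream can take out everything after it.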
Best compression ratio goes to xz/pxz, followed by lzip/plzip and then the various bzip2 implementations. Best compression speed goes to lbzip2, then Facebook's pzstd, followed by pigz. However, plzip2 was faster than pxz, so if you want speed plus compression ratio, plzip2 would have been better. But memory usage for plzip2 is much higher.

If I can remember correctly: zstd 0.2s, gzip 0.8s, 7zip (PPMd) 2.1s, bzip2 2.7s, and lzip, xz and 7zip (LZMA) around 15-16s. This is CPU time from memory, so it might not be fully accurate. I'd say zstd and gzip are better suited for general use, while bzip2 and 7zip (PPMd) are better suited for high compression of text files.

Speaking of long-term data storage, I'd go with segmented, encrypted 7-Zip with a couple of parity volumes added. The 7z format is stable enough for my tastes (15+ years no problem, and it warns you fairly when you do need that parity applied), open source (so you can easily find old versions if the need arises), encrypts (as long as you trust AES), compresses well (using quite some RAM depending on options), and decompresses fast enough (though zstd is faster). It's old, probably older than your smoothie-age githubs: heck, xz can be viewed as a fork of it, and in fact it uses one of its algorithms! There was even a version with Thread Building Blocks support which is still cached, but I am not aware of its commit history.

For parity, I'd go with the PAR 2.0 format because, well, is there any better? So sue both Solomon and his friend Reed for not coming up with the next algorithm! Can't recall off the top of my head if lzip segments and encrypts, but if it does, and does that multithreaded, well then, add some parity and you'll be fine.
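The ratio ranking claimed above (xz/LZMA strongest, then bzip2, with gzip and zstd trading ratio for speed) can be spot-checked with Python's standard library, which bundles gzip, bzip2 and xz codecs. Treat this as a sketch: rankings depend heavily on the input data, so only lossless round-trips and basic shrinkage are asserted.

```python
import bz2
import gzip
import lzma

# Repetitive text; real-world corpora will rank the codecs differently.
text = b"the quick brown fox jumps over the lazy dog\n" * 5000

compressed = {
    "gzip": gzip.compress(text, compresslevel=9),
    "bzip2": bz2.compress(text, compresslevel=9),
    "xz": lzma.compress(text, preset=9),
}

# Every codec should shrink this input.
for name, blob in compressed.items():
    assert len(blob) < len(text), name

# Round-trips must be lossless.
assert gzip.decompress(compressed["gzip"]) == text
assert bz2.decompress(compressed["bzip2"]) == text
assert lzma.decompress(compressed["xz"]) == text

# Print sizes, smallest first, to compare ratios on this input.
for name, blob in sorted(compressed.items(), key=lambda kv: len(kv[1])):
    print(f"{name}: {len(blob)} bytes of {len(text)}")
```

The parallel variants discussed above (pigz, lbzip2, pxz, plzip, pzstd) produce the same formats; they change throughput and memory use, not the ratio ranking by much.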
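On the "add some parity and you'll be fine" point: PAR 2.0 uses Reed-Solomon codes and can rebuild multiple lost blocks. The simplest special case of that idea is a single XOR parity block, which recovers any one lost segment. The toy below is not PAR2, just an illustration of why parity volumes let you survive a missing archive segment.

```python
from functools import reduce

def xor_parity(segments):
    """XOR equal-length segments together into one parity block."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), segments)

# Three archive segments plus one parity volume.
segments = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_parity(segments)

# Lose segment 1; XOR of the survivors and the parity block rebuilds it,
# because every other segment cancels out of the XOR.
survivors = [segments[0], segments[2], parity]
rebuilt = xor_parity(survivors)
assert rebuilt == segments[1]
```

One XOR block only covers one loss; Reed-Solomon (as in PAR2) generalizes this so that N recovery volumes can repair any N damaged or missing blocks.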