One might exist already: lzlib.
I admit I haven’t done a great deal of research, so maybe there are problems, but I’ve found that lzip tends to do better at compression than xz/lzma and, to paraphrase its manual, it’s designed to be a drop-in replacement for gzip and bzip2. It’s been around since at least 2009, according to the copyright messages.
That said, xz is going to receive a lot of scrutiny from now on, so maybe it doesn’t need replacing. Likewise, anything else that allows random binary blobs into the source repository is going to get the same sort of scrutiny. Is that data really random? Can it be generated by non-obfuscated plain-text source code instead? Etc. etc.
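To make the “can it be generated instead?” point concrete, here’s a toy sketch (hypothetical names, not from any real project) of replacing a committed opaque test blob with plain source code that generates the same bytes deterministically, so a reviewer can see exactly where every byte comes from:

```python
import random

def make_test_blob(seed=1234, size=1024):
    """Generate a reproducible pseudo-random test blob.

    A fixed seed means the same bytes every time, on every machine,
    so nothing opaque needs to live in the repository.
    """
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(size))
```

A test that needs “random” data can call this at runtime instead of reading a checked-in binary file, and the source stays fully auditable.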
The Robustness Principle may seem like little more than a suggestion, but it is the foundation on which many successful things are based.
To boil it down to meme-level old-school Torvaldsry: Assume everyone else is a f–king idiot who can barely do what they’re supposed to and expect to parse their files / behaviour / trash accordingly.
If you do not do this, you are, without doubt, one of those f–king idiots everyone else is having to deal with. If you do do this, it does not guarantee that you are not a f–king idiot. Awareness is key.
Examples where this works: Web browser quirks mode; Driving a car; Measure twice, cut once. This latter one is special because it reveals that often, the f–king idiot you’re trying to deal with is yourself.
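As a toy illustration of “parse their files / behaviour / trash accordingly” (the format and function names here are made up for the example): a parser that is liberal in what it accepts and conservative in what it emits.

```python
# Be liberal in what you accept: tolerate stray whitespace, blank
# lines, comments, junk lines, and inconsistent key casing.
def parse_config(text):
    settings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and padding
        if not line:
            continue  # ignore blank lines instead of erroring
        if "=" not in line:
            continue  # skip garbage lines rather than crashing
        key, _, value = line.partition("=")
        settings[key.strip().lower()] = value.strip()
    return settings

# Be conservative in what you emit: one canonical, sorted form.
def emit_config(settings):
    return "\n".join(f"{k}={v}" for k, v in sorted(settings.items()))
```

Feed it something sloppy like `"  Name = foo \n# comment\ngarbage\nMODE=fast"` and it still recovers `{"name": "foo", "mode": "fast"}`, while its own output is always tidy.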
Assume everyone else is worse.
Fun corollary: In altering his behaviour towards ~~f–king idiots~~ people who should know better, Linus has learned to apply the robustness principle to interpersonal communication.