if you could pick a standard format for a purpose what would it be and why?
e.g. flac for lossless audio because…
(yes you can add new categories)
summary:
- photos .jxl
- open domain image data .exr
- videos .av1
- lossless audio .flac
- lossy audio .opus
- subtitles srt/ass
- fonts .otf
- container mkv (doesn't contain .jxl)
- plain text utf-8 (many also say markup but disagree on the implementation)
- documents .odt
- archive files (this one is causing a bloodbath so i picked randomly) .tar.zst
- configuration files toml
- typesetting typst
- interchange format .ora
- models .gltf / .glb
- daw session files .dawproject
- otdr measurement results .xml
This is the kind of thing I think about all the time, so I have a few.
- Archive files: `.tar.zst`
  - Produces better compression ratios than the DEFLATE compression algorithm (used by `.zip` and gzip/`.gz`) and does so faster.
  - By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”
  - `.tar.xz` is also very good and seems more popular (probably since it was released 6 years earlier, in 2009), but, when tuned to its maximum compression level, `.tar.zst` can achieve a compression ratio pretty close to LZMA (used by `.tar.xz` and `.7z`) and do it faster[1] (see the command sketch after this list):

> zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.
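A minimal command-line sketch of the above, assuming GNU tar ≥ 1.31 (for the built-in `--zstd` flag) and the `zstd` CLI; `my_directory/` is a placeholder:

```sh
# Archive + compress in one step (tar hands compression off to zstd)
tar --zstd -cf archive.tar.zst my_directory/

# Equivalent explicit pipeline, tuned toward LZMA-like ratios:
# -19 is near zstd's maximum level, -T0 uses all CPU cores
tar -cf - my_directory/ | zstd -19 -T0 -o archive.tar.zst

# List and extract
tar --zstd -tf archive.tar.zst
tar --zstd -xf archive.tar.zst
```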
- Image files: JPEG XL / `.jxl`
  - “Why JPEG XL”
  - Free and open format.
  - Can handle lossy images, lossless images, images with transparency, images with layers, and animated images, giving it the potential of being a universal image format.
  - Much better quality and compression efficiency than current lossy and lossless image formats (`.jpeg`, `.png`, `.gif`).
  - Produces much smaller files for lossless images than AVIF[2] (encoder sketch after this list).
  - Supports much larger resolutions than AVIF’s 9-megapixel limit (important for lossless images).
  - Supports up to 24-bit color depth, much more than AVIF’s 12-bit color depth limit (which, to be fair, is probably good enough).
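For reference, encoding looks like this with `cjxl`, the reference encoder from libjxl; a minimal sketch with placeholder filenames, where `-d` sets the Butteraugli distance and `0` means mathematically lossless:

```sh
# Lossless re-encode of a PNG (typically much smaller than the source)
cjxl -d 0 input.png output.jxl

# Lossy encode; -d 1 is roughly "visually lossless"
cjxl -d 1 photo.png photo.jxl
```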
- Videos (Codec): AV1
  - Free and open format.
  - Much more efficient than x264 (used by `.mp4`) and VP9[3].
- Documents: OpenDocument / ODF / `.odt`
  - @raubarno@lemmy.ml says it best here. `.odt` is simply a better standard than `.docx`.

> it’s already a NATO standard for documents. Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit a .docx file because it breaks the layout easily. And some older .doc files cannot even work with Microsoft Word.
`.tar` is pretty bad as it lacks an index, making it impossible to quickly seek around in the file. The compression on top adds another layer of complication. It might still work great as a tape archiver, but for sending files around the Internet it is quite horrible. It’s really just getting dragged around for cargo-cult reasons, not because it’s good at the job it is doing.
In general I find the archive situation a little annoying, as archives are largely unnecessary; that’s what we have directories for. But directories don’t exist as far as HTML is concerned, and only single files can be downloaded easily. So everything has to get packed and unpacked again, for absolutely no reason. It’s a job computers should handle transparently in the background, not an explicit user action.
Many file managers try to add support for `.zip` files and let you browse into them like folders, but that abstraction is always quite leaky and never as smooth as it should be.
> .tar is pretty bad as it lacks an index, making it impossible to quickly seek around in the file.
`.tar.pixz`/`.tpxz` has an index and uses LZMA, and it permits parallel compression/decompression (increasingly important on modern processors).
It’s packaged in Debian, and I assume in other Linux distros as well.
The only downside is that GNU tar doesn’t have a single-letter shortcut for pixz the way it has “z” for gzip, “j” for bzip2, or “J” for xz (LZMA); you’ve got to use the more verbose “-Ipixz”.
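For illustration, a sketch of what that looks like in practice (assuming the `pixz` package is installed; filenames are placeholders):

```sh
# Create an indexed, parallel-compressed archive
tar -Ipixz -cf archive.tpxz my_directory/

# Extract it (pixz decompresses in parallel too)
tar -Ipixz -xf archive.tpxz

# List contents via the index, without decompressing the whole stream
pixz -l archive.tpxz
```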
Also, while I don’t recommend it: IIRC gzip limits how far the effects of compression can propagate, and there is software that leverages this to hack in random access, even when the stream wasn’t intentionally built to provide it. I don’t recall whether anyone has rigged that up with tar and indexing, but I suppose if someone were specifically determined to use gzip, they could go that route.
> By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”
The problem here being that GnuPG does nothing really well.
> Videos (Codec): AV1
>
> - Much more efficient than x264 (used by `.mp4`) and VP9[3].
AV1 is also much younger than H.264 (AV1 is a specification; x264 is an implementation of H.264), and only recently have software encoders become somewhat viable. A more apt comparison would have been AV1 to HEVC, though the latter is also somewhat old nowadays, if still a competitive codec. Unfortunately, there currently aren’t many options to use AV1 in a very meaningful way: you can encode your own media with it, but that’s about it. You can stream to YouTube, but YouTube will re-encode it to another codec.
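If you do want to encode your own media with it, something like this works; a minimal sketch assuming an ffmpeg build with the SVT-AV1 encoder (`libsvtav1`) enabled, with placeholder filenames and settings:

```sh
# Encode to AV1 with SVT-AV1 and Opus audio, muxed into MKV.
# -crf 35 trades quality for size; -preset 6 trades speed for efficiency.
ffmpeg -i input.mp4 -c:v libsvtav1 -crf 35 -preset 6 -c:a libopus output.mkv
```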
> The problem here being that GnuPG does nothing really well.
Could you elaborate? I’ve never had any issues with gpg before, and I’m curious what problems people are having.
> Unfortunately, there currently aren’t many options to use AV1 in a very meaningful way: you can encode your own media with it, but that’s about it. You can stream to YouTube, but YouTube will re-encode it to another codec.
AV1 has almost full browser support (IIRC), and companies like YouTube, Netflix, and Meta have started moving over to AV1 from VP9 (AV1 is the successor to VP9). But you’re right, it’s still working on adoption; this is more my dream world than a prediction of future standardization.
> Could you elaborate? I’ve never had any issues with gpg before, and I’m curious what problems people are having.
This article and the blog post linked within it summarize it very well.
> By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”
Wait, so does it do all of those things?
So there’s a tool called tar that creates an archive (a `.tar` file). Then there’s a tool called zstd that can be used to compress files, including `.tar` files, which then become `.tar.zst` files. And then you can encrypt your `.tar.zst` file using a tool called gpg, which would leave you with an encrypted, compressed `.tar.zst.gpg` archive.
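Spelled out as commands, a minimal sketch of the whole pipeline and its reverse (assuming GNU tar, zstd, and gpg are installed; filenames are placeholders):

```sh
tar -cf backup.tar my_directory/   # 1. archive   -> backup.tar
zstd backup.tar                    # 2. compress  -> backup.tar.zst
gpg --symmetric backup.tar.zst     # 3. encrypt   -> backup.tar.zst.gpg (prompts for a passphrase)

# And in reverse:
gpg --decrypt backup.tar.zst.gpg > backup.tar.zst
zstd -d backup.tar.zst             # decompress   -> backup.tar
tar -xf backup.tar                 # unpack
```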
Now, most people aren’t doing everything in the terminal, so the process for most people would be pretty much the same as creating a ZIP archive.