How in the world do people store images / photos nowadays?
Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.
JPEG is... old, and it shows. The filesizes are a bit bloated, which isn't really a huge problem with modern storage, but the quality isn't great.
JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
AVIF seems computationally expensive and the support is pretty spotty - 8bit yuv420 might work, but 10b or yuv444 often doesn't. Windows 10 also chokes pretty hard on it.
Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.
PNG is cheap and support is ubiquitous but filesizes become sky-high very quick.
So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them. Is jpeg still the only good option? Or is encoding everything in jpeg-xl or avif + praying things get better in the future a reasonable bet?
OneDeuxTriSeiGo 10 hours ago [-]
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
It's worth noting that Firefox is willing to adopt JPEG-XL[1] as soon as the Rust implementation[2] is mature, and that Rust impl is a direct port of the reference C++ implementation[3]. macOS and Safari already support JPEG-XL[4], and recently Windows picked up JPEG-XL support. The only blockers at this point are Firefox, Chromium, and Android. If/when Firefox adopts JPEG-XL, we'll probably see Google follow suit, if only out of pressure from downstream Chromium platforms wanting to adopt it to maintain parity.
So really if you want to see JPEG-XL get adopted, go throw some engineering hours at the rust implementation [2] to help get it up to feature parity with the reference impl.
g**gle is hellbent on killing JPEG-XL support in favor of WebP. assuming they'll capitulate to downstream pressure is a stretch. this article [0] sums it up nicely:
What this [removal of support for JPEG-XL in Chromium] really translates to is, “We’ve created WebP, a competing standard, and want to kill anything that might genuinely compete with it”. This would also partly explain why they adopted AVIF but not JPEG XL. AVIF wasn’t superior in every way and, as such, didn’t threaten to dethrone WebP.
I'm not assuming they'll capitulate under just pressure. Rather, I'm assuming they'll capitulate if a majority of, or even all of, the big third-party Chromium browsers push for adding it to mainline Chromium.
It's less about blind pressure and more about the risk that Google becomes seen as an untrustworthy custodian of Chromium, and that downstreams start supporting an alternate upstream outside of Google's control.
JXL is certainly a hill that Google seems intent on standing on, but I doubt it's one they'd choose to die on. Doubly so given the ammo it'd give in the ongoing Chrome antitrust lawsuits.
frollogaston 8 hours ago [-]
How is Google so intent on webp winning? They don't even support it in their own products besides Chrome.
Spooky23 6 hours ago [-]
Chrome is like a different company. They do weird shit.
ethbr1 4 hours ago [-]
You either die a hero or live long enough to become the MS Office team.
throw0101c 6 hours ago [-]
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
HEIC was developed by the MPEG folks and is an ISO standard, ISO/IEC 23008-12:2022.
OS support includes Windows 10 v1803, Android 10+, Ubuntu 20.04, Debian 10, Fedora 36. Lots of cameras and smartphones support it as well.
There's nothing Apple-specific about it. Apple went through the process of licensing H.265, so they got HEIC 'for free'† and use it as the default image format because, over JPEG, it supports HDR, >8-bit colour, etc.
†In the same way that WebP is essentially an image/frame from a VP8 video.
socalgal2 4 hours ago [-]
> HEIC was developed by the MPEG folks
And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!
jandrese 11 hours ago [-]
From what I've seen, WebP is probably the strongest contender for a JPEG replacement. It's pretty common in the indie game scene, for example, to re-encode a game's JPEG assets to WebP for better image quality and often a significant (25% or more) savings on installer size. Support is coming, albeit somewhat slowly. It was pretty bad in Ubuntu 22, but several apps have added support in Ubuntu 24. Windows 11 supports WebP in Photos and Paint, for another example.
0x20cowboy 10 hours ago [-]
I hate webp. Not for any legitimate technical reason; I often just want to download an image from the web to post on an image board, drop into a diagram or PPT, or use for a joke, and nothing works with that format. Nothing. Google image search is useless because of it.
Cmd+shift+4 is now the only way to grab an image out of a browser. Which is annoying.
It has made my life needlessly more complicated. I wish it would go away.
Maybe if browsers auto-converted when you dragged an image out of the browser window I wouldn’t care, but when I see webp… I hate.
socalgal2 4 hours ago [-]
That's true of any new format. Until everything supports it, it's not so great. iPhone saves .HEIC, which I have to convert to something else to be useful. It's not everywhere (not sure it ever will be).
Windows didn't use to show .jpgs in Windows Explorer. I know because I wrote a tool to generate thumbnail HTML pages to include on archive CDs of photos.
To solve this problem, some format has to "win" and get adopted everywhere. That format could be webp, but it will take 3-10 years before everything supports it. It's not just the OS showing it in its file viewer. It's its preview app supporting it. It's every web site that lets you upload an image (gmail/gmaps/gchat/facebook/discord/messenger/slack/your bank/apartment-rental-companies, etc..etc..etc..) It just takes forever to get everyone to upgrade.
bapak 3 hours ago [-]
When does a format stop being new? WebP was introduced fifteen years ago.
cosmic_cheese 10 hours ago [-]
Webp images are right up there with the fake transparent PNGs you come across in Google Images.
harry8 5 hours ago [-]
Total agreement from me, I use this:
bin/webp2png:
#!/bin/bash
# Strip the .webp extension and decode to a PNG next to the original.
dwebp "$1" -o "${1%%.webp}.png"
frollogaston 8 hours ago [-]
Even if webp got better support later, I want it deprecated just as revenge for previously wasting my time.
jandrese 9 hours ago [-]
Worst case you can open it up in Paint and save as JPEG.
Also, I just checked and Powerpoint has no problem dropping in a webp image. Gimp opens it just fine. You are right that web forums are often well behind the times on this.
Just drop the offending image onto the icon in the dock.
nyanpasu64 9 hours ago [-]
My working model is that WebP images are generally a lossy copy of a PNG or a generation-loss transcoding of a JPG image. I know that lossless WebP technically exists but nobody uses it when they're trying to save bandwidth at the cost of the user.
encom 9 hours ago [-]
Often (in my experience) WebP is served as a bait-and-switch even if the link ends with .jpg. So I use curl to fetch the file, and since curl doesn't send "Accept: image/webp" unless you tell it to, the server just gives you what you ask for.
I once edited Firefox config to make it pretend to not support WebP, and the only site that broke was YouTube.
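For what it's worth, you can check what you actually got by sniffing the file's magic bytes instead of trusting the URL's extension. A stdlib Python sketch (illustrative only, not any particular tool's API):

```python
# Detect the real container format of a downloaded "image.jpg", since servers
# may serve WebP regardless of the extension. WebP is a RIFF container:
# "RIFF" + 4 size bytes + "WEBP"; JPEG and PNG have fixed leading signatures.

def sniff_image_format(data: bytes) -> str:
    """Return 'webp', 'jpeg', 'png', or 'unknown' from the leading bytes."""
    if len(data) >= 12 and data[:4] == b"RIFF" and data[8:12] == b"WEBP":
        return "webp"
    if data[:3] == b"\xff\xd8\xff":
        return "jpeg"
    if data[:8] == b"\x89PNG\r\n\x1a\n":
        return "png"
    return "unknown"
```

Reading the first dozen bytes of the downloaded file through this tells you immediately whether the server pulled the switch on you.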
MiddleEndian 7 hours ago [-]
lol I installed the firefox extension "Don't Accept image/webp" but I assume a lot of sites just ignore it
adzm 8 hours ago [-]
Photoshop has native WebP support now too!
wang_li 10 hours ago [-]
On classic Macs there was a program called DropDisk. You could drag a disk image onto it and it auto-mounted it. That suggests a tool for you: make a desktop app that you can drag and drop images onto, which converts them to JPEG and saves them in a folder.
msephton 7 hours ago [-]
You can create this using Automator in a minute.
jamiek88 6 hours ago [-]
thumbsup app does exactly this.
Salgat 10 hours ago [-]
Exactly, part of being a "superior format" is adoption. Until then, it's just another in a sea of potential.
The last major browser to add support was Safari 16 and that was released on September 12, 2022. I see pretty much no one on browsers older than Safari 16.4 in metrics on websites I run.
userbinator 3 hours ago [-]
People want to use images outside of browsers too.
turnsout 8 hours ago [-]
Yeah, after seeing the logs I made the switch to webp earlier this year. As much as I hate to admit it (not a fan of Google), it’s a pretty big bandwidth savings for the same (or better) quality.
hombre_fatal 7 hours ago [-]
I switched to webp on my forum for avatars and other user image uploads.
With one format you get decent filesize, transparency, and animation which makes things much simpler than doing things like conditionally producing gifs vs jpegs.
glitchc 12 hours ago [-]
> Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.
Say what? A random scan across the internet will reveal more videos in MP4 and H.264 format than av1. Perhaps streaming services have switched, but that is not what regular consumers usually use to make and store movies.
CharlesW 11 hours ago [-]
New compressed media formats always travel a decade-long path from either (1) obscurity → contender → universal support or (2) obscurity → contender → forgotten curiosity. AV1 is on one path, WebP is on another.
Andrex 11 hours ago [-]
As someone who doesn't follow this stuff, which is on which path?
CharlesW 11 hours ago [-]
WebP remains fairly esoteric after 15 years, has always been a solution in search of a problem, and isn’t even universally supported in products by its owner.
AV1 was created and is backed by many companies via a non-profit industry consortium, solves real problems, and its momentum continues to grow. https://bitmovin.com/blog/av1-playback-support/
modeless 11 hours ago [-]
AV1 is on the path to universal support and WebP is on the path to obscurity.
tedunangst 10 hours ago [-]
Apple CPUs have AV1 support in hardware.
zimpenfish 1 hours ago [-]
Only support for decoding, and only from the A17 Pro and M3 onwards, I believe? Going to be a few years before that's commonly available (he says from his work M1 Pro.)
[edit: clarify that it's decoding only]
consp 8 hours ago [-]
So does every modern GPU. This is nothing special.
airstrike 4 hours ago [-]
I think you're arguing the same point—that there's plenty of support and it's arguably growing.
Izkata 12 hours ago [-]
Yeah, I think I only just found out about av1 a few weeks ago with a video file that wouldn't play. Thought it was corrupted at first; it's been that long since I saw something like that.
ksec 6 hours ago [-]
And H.264 is about to be patent free this year in many places.
userbinator 3 hours ago [-]
I suspect there are even more H.265 than av1.
zuminator 47 minutes ago [-]
People often forget that PNG images can be compressed in a lossy manner (e.g. palette quantization, as tools like pngquant do) to keep the filesize down. Not quite as well as jpegs, but still quite substantially.
Normal people use jpeg. It's good enough, much like mpeg-2 was good enough for DVDs. Compatibility always beats storage efficiency.
Photography nerds will religiously store raw images that they then never touch. They're like compliance records.
munchler 12 hours ago [-]
I think most photography nerds who want to save edited images to a lossless format will use TIFF, which is very different from the proprietary "raw" files that come straight out of the camera.
inferiorhuman 3 hours ago [-]
Most raw files are TIFF with proprietary tags.
IAmBroom 11 hours ago [-]
You'd be wrong in my experience.
No photog nerd wants EVEN MORE POSTPROCESSING.
munchler 9 hours ago [-]
I don't understand. You've got to save the edited result in a file somehow. What format do you use?
msephton 7 hours ago [-]
The file as it comes out of the camera, so-called raw, is a family of formats. Usually such files are kept untouched and any edits are saved in a lightweight settings file (in the format of your editing app) alongside the original.
ksec 6 hours ago [-]
And a lot of RAW formats are adopting, or considering adopting, lossless JPEG as a codec.
Gigachad 8 hours ago [-]
Normal people just use whatever the default on their phone is. Which for iPhone is HEIC, not sure about Android, AVIF?
> How in the world do people store images / photos nowadays?
Well, as JPEGs? Why not? Quality is just fine if you don't touch the quality slider in Photoshop or other software.
For "more" there's still lossless camera RAW formats and large image formats like PSD and whatnot.
JPEG is just fine.
afiori 12 hours ago [-]
I wonder how much of JPEG good quality is that we are quite accustomed to its artefacts.
Arainach 11 hours ago [-]
I've never seen JPEG artifacts on images modified/saved 5 or fewer times. Viewing on a monitor including at 100%, printing photos, whatever - in practice the artifacts don't matter.
BugsJustFindMe 12 hours ago [-]
At high quality, the artifacts are not visible unless you take a magnifying glass to the pixels, which is a practice anathema to enjoying the photo.
afiori 10 hours ago [-]
I am referring to highly compressed images or low-resolution ones; at high bitrates mostly all formats look the same.
What I mean is that jpeg's squarish artifacts look OK while av1's angular artifacts look distorted.
whaleofatw2022 12 hours ago [-]
There's something to be said about this. A high-quality JPEG after cleanup can sometimes be larger than an ARW (Sony RAW) on export, and it makes no sense to me.
djeastm 8 hours ago [-]
>are nigh-unworkable on desktops, support is very spotty
I use .webp often and I don't understand this. At least on Windows 10 I can go to a .webp and see a preview and double-click and it opens in my image editor. Is it not like this elsewhere?
hadlock 8 hours ago [-]
Try uploading one to any web service. Like imgur.
esafak 2 hours ago [-]
Assuming you are asking about archiving: Use the original format it came in. If you're going to transcode it should be to something lossless like J2K or PNG.
asielen 4 hours ago [-]
Tiff if you want to archive them and they started as raw or tiff, jpeg for everything else. If the file is already jpeg, there is no point in converting it to a new better-quality format; the quality won't get better than it already is.
It may be obsolete, but it is ubiquitous. I care less about cutting edge tech than I do about the probability of being able to open it in 20+ years. Storage is cheap.
Presentation is a different matter and often should be a different format than whatever your store the original files as.
acomjean 3 hours ago [-]
And jpg isn’t that bad when encoded at high quality, and not saved repeatedly.
I took a tiff and saved it high quality jpg. Loaded both into photoshop and “diffed” them (basically subtracted both layers). After some level adjustment you could see some difference but it was quite small.
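The same "diff" idea works outside Photoshop too. A stdlib sketch of the comparison step, assuming you've already decoded both images to raw buffers of identical size (the decoding itself needs an image library; the buffers below are placeholders):

```python
# Compare two decoded images byte-for-byte: subtract the buffers and report
# the average absolute difference. Zero means identical; a small value means
# the compression error is small, as in the TIFF-vs-high-quality-JPEG test.

def mean_abs_diff(a: bytes, b: bytes) -> float:
    """Average per-byte absolute difference between two equal-length buffers."""
    if len(a) != len(b):
        raise ValueError("buffers must be the same size")
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)
```

On the 8-bit scale of 0-255, a mean in the low single digits matches the "quite small after level adjustment" result described above.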
thisislife2 10 hours ago [-]
> How in the world do people store images / photos nowadays?
I had some high resolution graphic works in TIFF (BMP + LZW). To save space, I archived them using JPEG-2000 (lossless mode), using the J2k Photoshop plug-in ( https://www.fnord.com/ ). Saved tons of GBs. It has wide multi-platform support and is a recognized archival format, so its longevity is guaranteed for some time on our digital platforms. Recently explored using HEIF or even JPEG-XL for these but these formats still don't handle CMYK colour modes well.
theandrewbailey 7 hours ago [-]
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)
I've done several tests where I lowered the quality settings (and thus, the resulting file size) of JPEG-XL and AVIF encoders over a variety of images. In almost every image, JPEG-XL subjective quality fell faster than AVIF, which seemed mostly OK for web use at similar file sizes. Due to that last fact, I concede that Chrome's choice to drop JPEG-XL support is correct. If things change (JPEG-XL becomes more efficient at low file sizes, gains Chrome support), I have lossless PNG originals to re-encode from.
todotask2 4 hours ago [-]
At least there's JPEG-XL support in recent Windows 11 updates.
I've found that sometimes WebP with lossless compression (-lossless) results in smaller file sizes for graphics than JPEG-XL and sometimes it's the other way around.
nvch 7 hours ago [-]
RAW? Storage is becoming cheaper, why discard the originals?
When looking for a format to display HQ photos on my website, I settled on a combination of AVIF + JPG. Most photos are AVIF, but if AVIF is too magical compared to JPG (like 3x-10x smaller) I use a larger JPG instead. "Magic" means that fine details are discarded.
WebP discards gradients (like sunset, night sky or water) even at the highest quality, so I consider it useless for photography.
mrheosuper 3 hours ago [-]
not every storage is created equal. 1TB hdd is dirt cheap, 1TB of cloud storage is expensive af
alistairSH 6 hours ago [-]
HEIC for photos taken by my iPhone. Apple stuff seems to do a mostly ok job auto-converting to JPG when needed (I assume, since I haven’t manually converted one in ages).
And JPG for photos taken on a “real” camera (including scanned negatives). Sometimes RAW, but they’re pretty large so not often.
rr808 5 hours ago [-]
I found that if you plug an iPhone into a Windows PC and copy the photos off, it will convert them to jpg. However it makes copying very slow, and the quality is worse, so I'd advise turning off the setting on the phone (I think it's called compatibility mode or similar).
PaulHoule 11 hours ago [-]
I've done a few shootouts at various times in the last 10 years. I finally decided WebP was good for the web maybe two years ago; that is, I have 'set it and forget it' settings and get a good quality/size result consistently. (JPEG has the problem that you really need to turn the knob yourself, since a quality level good for one image may not be good for another.)
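For what it's worth, the JPEG knob-turning can be semi-automated by binary-searching the quality setting against a per-image file-size budget. A sketch, where `encoded_size` is a hypothetical stand-in for re-encoding the image at a given quality and measuring the result:

```python
# Binary-search the highest quality whose encoded size fits under a byte
# budget. `encoded_size` is assumed to be non-decreasing in quality (true in
# practice for JPEG encoders). Returns `lo` if even the minimum doesn't fit.

def pick_quality(encoded_size, budget: int, lo: int = 1, hi: int = 100) -> int:
    """Highest quality in [lo, hi] whose encoded size fits the budget."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if encoded_size(mid) <= budget:
            best = mid       # fits: try higher quality
            lo = mid + 1
        else:
            hi = mid - 1     # too big: try lower quality
    return best
```

A size budget is only a proxy for visual quality, of course, but it's one common way to get consistent per-image settings without eyeballing each one.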
I don't like AVIF, at least not for photos I want to share. I think AVIF is great for "a huge splash image for a web page that nobody is going to look at closely" but if you want something that looks like a pro photo I don't think it's better than WebP. People point out this example as "AVIF is great"
but I think it badly mangles the reflection on the left wing of the car and... it's those reflections that make sports cars look sexy. (I'll grant that the 'acceptable' JPEG has obvious artifacts whereas the 'acceptable' AVIF replaced a sexy reflection with a plausible but slightly dull replacement)
zeroq 11 hours ago [-]
For video it's not as easy as it takes way more compute and requires hardware support.
You can take any random device and it will be able to decode h264 at 4k. h265 not so much.
As for av1 - my Ryzen 5500GT, released in 2024, does not support it.
7speter 10 hours ago [-]
I think the only CPUs with AV1 support right now, whether encode, decode, or both, are the tile-era Meteor/Lunar/Arrow Lake CPUs from Intel.
Jap2-0 9 hours ago [-]
Actually, it goes back to Tiger lake (2020; 2021 for desktop). [0]
Addendum: AMD since RDNA2 (2020-2021-ish) [1], NVIDIA since 30 series (2020) [2], Apple since M3? (2023).
Note: GP's processor released in 2024 but is based on an architecture from 2020.
Apple M3 and newer CPUs and A17Pro and newer mobile CPUs also have hardware AV1 decode.
zmj 9 hours ago [-]
I recently reencoded my photography archive to webp. It's a static site hosted from S3. I was pretty happy with the size reduction.
SAI_Peregrinus 14 hours ago [-]
I store Raw + PSD with edits/history + whatever edited output format(s) I used.
MallocVoidstar 14 hours ago [-]
AV1 is not the clear winner for video. Currently-existing encoders are worse than x265 for high-bitrate encodes.
CharlesW 12 hours ago [-]
AV1's advantage narrows to ~5% over H.265 at very high data rates, in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps. But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate, and of course H.265 is heavily patent encumbered in comparison. https://chipsandcheese.com/p/codecs-for-the-4k-era-hevc-av1-...
ksec 6 hours ago [-]
>AV1's advantage narrows to ~5% over H.265 at very high data rates.... But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate,
There are whole discussions about how modern codecs, especially AV1, simply don't care about psy (psychovisual) image quality. Hence most torrents are still using x265, because AV1 simply doesn't match the quality offered by other encoders / x265. Nor does the AOM camp care about it, since their primary usage is YouTube.
>in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps.
It is not, and never will be. MP3 has an inherent disadvantage that requires substantially higher bitrate for quite a lot of samples, even at 320kbps. We went through this war for 10 years at Hydrogenaudio, with data to back it up; I don't know why in the past 2-3 years the topic has popped up once again.
MP3 is not better than AAC-LC in any shape or form, even at 25% higher bitrate. Just use AAC-LC, or specifically Apple's QuickTime AAC-LC encoder.
CharlesW 6 hours ago [-]
> There is a whole discussions that modern codec, or especially AV1 simply doesn't care about PSY image quality.
In early AV1 encoders, psychovisual tuning was minimal and so AV1 encodes often looked soft or "plastic-y". Today's AV1 encoders are really good at this when told to prioritize psy quality (SVT-AV1 with `--tune 3`, libaom with `--tune=psy`). I'd guess that there's still lots of headroom for improvements to AV1 encoding.
> And hence how most torrents are still using x265 because…
Today most torrents still use H.264, I assume because of its ubiquitous support and modest decode requirements. Over time, I'd expect H.265 (and then AV1) to become the dominant compressed format for video sharing. It seems like that community is pretty slow to adopt advancements — most lossy-compressed music <finger quotes>sharing</finger quotes> is still MP3, even though AAC is a far better (as you note!) and ubiquitous choice.
My point about MP3 vs. AAC was simply: As you reduce the amount of compression, the perceived quality advantages of better compressed media formats is reduced. My personal music library is AAC (not MP3), encoded from CD rips using afconvert.
shiroiuma 5 hours ago [-]
>Today most torrents still use H.264
That's not what I'm seeing for anything recent. x265 seems to be the dominant codec now. There's still a lot of support for h.264, but it's fading.
MallocVoidstar 10 hours ago [-]
I don't care about VMAF or PSNR, I care about looking with my eyes. With x265 on veryslow and AV1 on preset 0/1, and the source being a UHD BD I was downscaling to 1080p, AV1 looked worse even while using a higher bitrate than x265. Current AV1 encoders have issues with small details and have issues with dark scenes. People are trying to fix them (see svt-av1-psy, being merged into SVT-AV1 itself) but the problems aren't fixed yet.
eviks 37 minutes ago [-]
Well, did your "eyes" care more about fidelity or appeal?
>see svt-av1-psy, being merged into SVT-AV1 itself
Part of it being merged for now.
It is unfortunate this narrative hasn't caught on. Actual quality over VMAF and PSNR. And we haven't had further quality improvement since x265.
I do get frustrated every time the topic of codecs comes up on HN. But then the other day I came to realise I've spent ~20 years on Doom9 and Hydrogenaudio, so I guess I've accumulated more knowledge than most.
spookie 7 hours ago [-]
Yup, have had the same experience.
twotwotwo 6 hours ago [-]
I recognize it as beating a dead horse now, but JPEG XL did what was needed to be actually adopted. AVIF has not been widely adopted given the difficulty of a leap to a new format in general and the computational cost of encoding AVIF specifically.
One of JPEG XL's best ideas was incorporating Brunsli, lossless recompression for existing JPEGs (like Dropbox's Lepton which I think might've been talked about earlier). It's not as much of a space win as a whole new format, but it's computationally cheap and much easier to just roll out today. There was even an idea of supporting it as a Content-Encoding, so a right-click and save would get you an OG .jpg avoiding the whole "what the heck is a WebP?" problem. (You might still be able to do something like this in a ServiceWorker, but capped at wasm speeds of course.) Combine it with improved JPEG encoders like mozjpeg and you're not in a terrible place. There's also work that could potentially be done with deblocking/debanding/deringing in decoders to stretch the old approach even further.
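The Content-Encoding idea boils down to content negotiation: store one recompressed file, and pick the representation from the client's Accept header. A toy sketch (illustrative only; `pick_representation` and its deliberately naive header parsing are my own stand-ins, not any real server API, and a real server should handle q-values properly):

```python
# Decide whether to serve the recompressed JXL or fall back to the original
# JPEG, based on the media types the client advertises in its Accept header.
# q-values are ignored here for simplicity.

def pick_representation(accept: str) -> str:
    """Serve 'jxl' only to clients that advertise image/jxl support."""
    types = {t.split(";")[0].strip() for t in accept.split(",")}
    return "jxl" if "image/jxl" in types else "jpeg"
```

Because the JPEG can be reconstructed bit-exactly from the JXL, both responses represent the same image, which is what made the idea attractive as a transparent rollout path.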
And JXL's other modes also had their advantages. VarDCT was still faster than libaom AVIF, and was reasonable in its own way (AVIFs look smoother, JXL tended more to preserve traces of low-contrast detail). There was a progressive mode, which made less sense in AVIF because it was a format for video keyframes first. The lossless mode was the evolution of FUIF and put up good numbers.
At this point I have no particular predictions. JPEG never stopped being usable despite a series of more technically sophisticated successors. (MP3 too, though its successors seemed to get better adoption.) Perhaps it means things continue not to change for a while, or at least that I needn't rush to move to $other_format or get left behind. Doesn't mean I don't complain about the situation in comments on the Internet, though.
bilekas 10 hours ago [-]
You’re right for a lot of scenarios, which is exactly what a standard is there to do: encapsulate the broad strokes.
> Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.
Right again, and WebP is the enhancement that goes with the backend when dealing with the web. I wouldn’t knock it for not being compatible locally; it was designed for the web first and foremost. I think it’s in the name.
tristor 14 hours ago [-]
For my extensive collection of photography, I export to JPEG-XL and then convert to JPEG for use online. Most online services, like Flickr, Instagram, et al., don't support JPEG-XL, but there's almost no quality loss converting from JPEG-XL to JPEG vs exporting to JPEG directly from your digital asset management system, and storing locally in JPEG-XL works very well. Almost all desktop tools I use support JPEG-XL natively already; conversely, almost nothing supports WebP.
Zardoz84 14 hours ago [-]
There is NO quality loss when converting from JPEG XL to JPEG and vice versa. It was done by design. Not an accident.
eviks 25 minutes ago [-]
You're confusing jpg>jxl>jpg, which can be done losslessly via a special mode, and jxl > jpg, which can't (even ignoring all the extra features of jxl that jpg doesn't support)
adgjlsfhk1 12 hours ago [-]
this isn't true. there's no loss from jpeg to jpeg-xl (if you use the right mode), but the reverse is not true
Zardoz84 12 hours ago [-]
I'm sorry to say that you are wrong about this.
> Key features of the JPEG XL codec are:
> lossless JPEG transcoding,
> Moreover, JPEG XL includes several features that help transition from the legacy JPEG coding format. Existing JPEG files can be losslessly transcoded to JPEG XL files, significantly reducing their size (Fig. 1). These can be reconstructed to the exact same JPEG file, ensuring backward compatibility with legacy applications. Both transcoding and reconstruction are computationally efficient. Migrating to JPEG XL reduces storage costs because servers can store a single JPEG XL file to serve both JPEG and JPEG XL clients. This provides a smooth transition path from legacy JPEG platforms to the modern JPEG XL.
If you need more proof, you could transcode a JPEG to JPEG XL and convert it back to JPEG. The resulting image would be BINARY IDENTICAL to the original image.
However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24-bit, HDR, etc.) that obviously couldn't be converted in a lossless way to a JPEG.
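The round trip is easy to verify yourself. Assuming you've run something like `cjxl photo.jpg photo.jxl` followed by `djxl photo.jxl roundtrip.jpg` (cjxl should transcode JPEG input losslessly by default), a small stdlib check confirms the two files are bit-identical:

```python
# Hash both files and compare: bit-identical files produce identical digests,
# so matching SHA-256 values demonstrate the JPEG -> JXL -> JPEG round trip
# was truly lossless.

import hashlib

def file_digest(path: str) -> str:
    """SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (filenames are examples):
#   file_digest("photo.jpg") == file_digest("roundtrip.jpg")
```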
CorrectHorseBat 11 hours ago [-]
Yes, JPG to JPEG XL and back is lossless, but the reverse is nowhere mentioned.
Trying around with some jpg and jxl files I cannot convert jxl losslessly to jpg files even if they are only 8bit. The jxl files transcoded from jpg files show "JPEG bitstream reconstruction data available" with jxlinfo, so I think some extra metadata is stored when going from jpg to jxl to make the lossless transcoding possible. I can imagine not supporting the reverse (which is pretty useless anyway) allowed for more optimizations.
spider-mario 11 hours ago [-]
> However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24-bit, HDR, etc.) that obviously couldn't be converted in a lossless way to a JPEG.
A lot of those features (non-8×8 DCTs, Gaborish and EPF filters, XYB) are enabled by default when you compress a non-JPEG image to a lossy JXL. At the moment, you really do need to compress to JPEG first and then transcode that to JXL if you want the JXL→JPEG direction to be lossless.
gabrielhidasy 11 hours ago [-]
> However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24-bit, HDR, etc.) that obviously couldn't be converted in a lossless way to a JPEG.
So he was not wrong about this. You have perfect JPEG -> JPEG XL conversion, but not the other way around.
adgjlsfhk1 10 hours ago [-]
Default jpeg-xl uses a different color space (XYB), bigger transforms (up to 256x256), rectangular transforms, etc. If you go from jpg to jxl you can go back (but your jxl file will be less efficient), but if you compress directly to jxl, you can't losslessly go to jpg.
tristor 13 hours ago [-]
That's good to know. I'm not an image format expert, but I couldn't see any loss that was visually discernible at any rate.
redeeman 11 hours ago [-]
Just use JPEG XL. Works great on Linux. Pressure the software you use to support the proper formats.
gsich 9 hours ago [-]
>How in the world do people store images / photos nowadays?
With codecs built for that purpose, I hope. Intra-frame video-codec "formats" are a misconception and should stay that way. A curiosity.
dangus 12 hours ago [-]
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.
Not really true in my experience, I have no problems using it in Windows 11, Linux, or with my non-Apple non-Google cloud photos app.
The iPhone using it in an incredibly widespread way has made it a de facto standard.
If you're having trouble in Windows, I wonder if you're running Windows 11 or 10? Because 11 seems a lot better at supporting "modern things" considering that Microsoft has been neglecting Windows 10 for 3 years and is deprecating it this year.
munchler 12 hours ago [-]
My problem with HEIC is that if you convert it to another format, it looks different from the original, for reasons that I don't understand. I switched my iPhone back to JPEG to avoid that.
spookie 7 hours ago [-]
Only Gwenview on the Linux side is able to render them properly somehow
jorl17 11 hours ago [-]
Perhaps due to HDR handling?
heraldgeezer 10 hours ago [-]
>Just as there is a clear winner for video - av1
What?? Maybe I'm too much in aarrrgh circles but it's all H.264 / 265...
NoMoreNicksLeft 14 hours ago [-]
>JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome,
Why would I ever care about Chrome? I can't use adblockers on Chrome, which makes the internet even less usable than it currently is. I only start up chrome to bypass cross-origin restrictions when I need to monkey-patch javascript to download things websites try to keep me from downloading (or, like when I need to scrape from a Google website... javascript scraper bots seem to evade their bot detection perfectly, just finished downloading a few hundred gigabytes of magazines off of Google Books).
Seriously, fuck Chrome. We're less than 2 years away from things being as bad as they were in the IE6 years.
codazoda 14 hours ago [-]
I think we're there, not 2 years away.
I have software that won't work quite right in Safari or Firefox through a VPN every single day. Maybe it's the VPN and maybe it's the browser but it doesn't matter. We're at IE levels it's just ever so slightly more subtle this time. I'm still using alternatives but it's a battle.
NoMoreNicksLeft 13 hours ago [-]
VPNs are layer 2... I suppose it could be resizing packets in such a way as to make it glitch out, but that just seems improbable.
Some of the warez sites I download torrents from have captchas and other javascripticles that only work on Chrome, but I've yet to see it with mainstream sites.
Fight the good fight.
frollogaston 8 hours ago [-]
The VPN could be on an IP address with a bad reputation that's getting soft-blocked by some stuff
crazygringo 9 hours ago [-]
> I can't use adblockers on Chrome
Why does this myth persist?
uBlock Origin Lite works perfectly fine on Chrome, with the new Manifest v3. Blocks basically all the ads uBlock Origin did previously, including YouTube. But it uses less resources so pages load even faster.
There's an argument that adblocking could theoretically become less effective in the future but we haven't seen any evidence of that yet.
So you can very much use adblockers on Chrome.
frollogaston 8 hours ago [-]
If uBO Lite is really better, why does uBO exist?
crazygringo 8 hours ago [-]
Because uBO Lite uses a newer Chrome function call (declarativeNetRequest) that didn't exist previously (original uBO was based on webRequest).
webRequest is slower because it has to evaluate JavaScript for each request (as well as the overhead of interprocess communication), instead of the blocking being done by compiled C++ code in the same process like declarativeNetRequest does.
uBO also has a bunch of extra features like zapping that the creator explicitly chose not to include in uBO Lite, in the interests of making the Lite version as fast and resource-light as possible. For zapping, there are other extensions you can install instead if you need that.
They're two different products with two different philosophies based on two different underlying architectures. The older architecture has now gone away in Chrome, but the new one supports uBlock Origin Lite great.
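For the curious, the architectural difference is visible in what an extension ships. Under Manifest V3 the blocking rules are static JSON that the browser's own engine evaluates; a minimal sketch of a declarativeNetRequest rules file (the ads.example.com filter is a made-up example):

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image", "xmlhttprequest"]
    }
  }
]
```

Under the old webRequest API, the same block required the extension's JavaScript callback to be invoked for every single request; here the matching runs inside the browser's native code.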
frollogaston 7 hours ago [-]
Thanks, I'll try it out then.
bob1029 15 hours ago [-]
Beyond the compression (which is amazing), JPEG is also extremely fast when implemented well. I'm not aware of any other image format that can encode at 60fps+ @ 1080p on a single CPU core. Only mild SIMD usage is required to achieve this. With dedicated hardware, the encode/decode cost quickly goes to zero.
I struggle to understand the justification for other lossy image formats as our networks continue to get faster. From a computation standpoint, it is really hard to beat JPEG. I don't know if extra steps like intra-block spatial prediction are really worth it when we are now getting 100mbps to our smartphones on a typical day.
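As a rough sanity check on the 60fps @ 1080p claim (the 3 GHz clock is an assumed figure, not from the comment), the per-pixel cycle budget is easy to compute:

```python
# 1080p60 encoding on a single core: how many clock cycles per pixel?
# The 3 GHz clock speed below is an assumption for illustration.
width, height, fps = 1920, 1080, 60
clock_hz = 3_000_000_000

pixels_per_second = width * height * fps          # required pixel throughput
cycles_per_pixel = clock_hz / pixels_per_second   # cycle budget per pixel

print(pixels_per_second)            # 124416000
print(round(cycles_per_pixel, 1))   # ~24.1 cycles per pixel
```

A couple dozen cycles per pixel is tight but plausible for a DCT-based codec with some SIMD; formats that add intra prediction and larger transforms need far more.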
illiac786 11 hours ago [-]
That’s why I am confident LLMs won’t change as much as some may think: after 20+ years of search engines, some still can’t be bothered to do a simple search. (Either that or you’re trolling, I can’t decide, I have to say.) Hence, we can wait another 20 years and some will still not use LLMs for everyday questions.
To answer your (rhetorical?) question, there’s a long list of benefits, but I’d say HDR and storage efficiency are the two big ones I can think of. The storage efficiency in particular is massive, especially with large images.
Lammy 10 hours ago [-]
> I struggle to understand the justification for other lossy image formats as our networks continue to get faster.
Because Google's PageSpeed and Lighthouse both tell people to use WebP, and a large percentage of devs will do anything Google say in the hopes of winning better SERP placement:
You might be getting 100 Mbps to your smartphone; many people – yes, even within the United States – struggle to attain a quarter of that.
bob1029 15 hours ago [-]
What is the likelihood of experiencing precisely marginal network conditions wherein webp improves the user experience so dramatically over jpeg that the user is able to notice?
If jpeg is loading like ass, webp probably isn't going to arrive much faster.
MrDOS 14 hours ago [-]
I'm sorry, I misunderstood your doubt of the usefulness of other lossy formats as criticism of using lossy formats in general in the face of higher bandwidth. Reading too fast, never mind me... :)
tossaway0 2 hours ago [-]
25 Mbps is extremely fast relative to the benefit that better-than-JPEG image compression brings when browsing the web.
GuB-42 7 hours ago [-]
If you have slow internet on your smartphone, chances are that you also have a slow smartphone, and therefore decoding performance matters. It may also save you a bit of battery life for the same reason, which may be important in places with little internet coverage.
You have to find a balance, and unless (still) pictures are at the center of what you are doing, it is typically only a fraction of the bandwidth (and a fraction of the processing power too).
We are not talking about 100 Mbps; we downloaded JPEGs over dialup connections, you know. You don't even need to go into the Mbps unless you are streaming MJPEG (and why would you do that?).
wizardforhire 15 hours ago [-]
Exactly! It’s like asking why we still use wheels when hovercrafts exist.
If humans are still around in a thousand years they’ll be using jpegs, and they’ll still be using them a thousand years after that. When things work they have a pernicious tendency to stick around.
eviks 9 minutes ago [-]
Besides the awful wheel comparison, there are dozens of formats that worked and stuck around until they got replaced, so this also tells us nothing on such a huge timescale
dsr_ 10 hours ago [-]
Wheels continue to support a load without power.
Wheels are vastly superior to hover technologies in the crucial areas of steering and controlled braking. (For uncontrolled braking, you just cut the power to your hover fans and lift the skirts...)
It turns out to be remarkably difficult to get a hovercraft to go up an incline...
Wheels are both suspension and traction in one system.
There's no particular physical advantage to JPEG over the others mentioned; it's just currently ubiquitous.
pbhjpbhj 15 hours ago [-]
Can JPEG do 3D somehow (I'm thinking VR/AR)? DVDs lasted well, until the medium itself moved to cheap NAND flash and then various SSD technologies.
When/if simple screens get usurped then we'll likely move on from JPEG.
I'm sure you were being a little flippant but your last sentence shows good insight. Someone said "we just need it to work" to me the other day and the "if it works there will be little impetus to improve it"-flag went off in my brain.
adgjlsfhk1 12 hours ago [-]
depends what you mean by 3d. jpeg-xl does let you add arbitrary channels, so you could add a depth channel, but it's not going to do a good job for full 3d (e.g. light field/point cloud).
one place I think jxl will really shine is PBR and game textures. for cases like that, it's very common to have color+transparency+bump map+normal map, and potentially even more. bundling all of those into a single file allows for away better compression
wizardforhire 14 hours ago [-]
Thanks, thats a great insight!
Idk about 3d, but I’ll assume someone probably will tape something together out of necessity if they haven't already.
…and yes, very flippant! But not without good reason. If we are to extrapolate: the popularity of jpeg, love it or hate it, will invariably necessitate its continued compatibility, which supports my previous statement. That compatibility will in turn lead to plausible hypothetical circumstances where future developers, out of laziness, ignorance, or just plain conformity to norms, choose and use it, perpetuating the cycle. Short of a radical, extinction-level event brought about by widespread adoption of new technology such as what you describe, I don’t see it going away anytime soon. Not to say it couldn’t happen; I just feel it’s highly improbable because of the contributing human factors.
That jpeg gets so many complaints is, I feel, for two reasons. One, its ubiquity, and two, that we actually see it!
Some similar situations that don’t get nearly as much attention but are far more pervasive are tcp/ip, bash, ntpd, ad nauseam. All old pervasive protocols so embedded as to be taken for granted, and also not able to be seen.
I’ll leave with this engineering truism that I feel should be more widely adhered to in software development, especially by UI designers: if it ain’t broke don’t fix it!
tehjoker 12 hours ago [-]
Jp3d can do 3d, but it is not well supported. It is an extension to the JPEG2000 specification iirc.
redeeman 11 hours ago [-]
Transparency? HDR? Proper support for lossless? There are many things lacking in JPEG.
GuB-42 8 hours ago [-]
Because JPEG just works. It is not the best by far, but everybody supports it, and it is usually not worth saving a few bytes in exchange for worse support, added complexity and extra processing.
Lossy compressed images are usually not the most significant consumers of bandwidth and disk space. Videos are. That's why there has been a lot more focus on video formats than anything else; there is a lot to gain there, not so much with still images.
JPEG-XL is super-complicated because it supports plenty of things most people don't really need.
Webp is somewhat better supported because it is backed by Google, it is also what is essentially a single frame video, so if you did the hard work on video (where it matters), you get images almost for free, and it saves Google a tiny bit of bandwidth, and "a tiny bit" is huge at Google scale.
We are seeing the same thing with audio. MP3 (1991) is still extremely popular, the rest is mostly M4A/AAC (2001). We pretty much have the perfect audio format now, which is Opus (2012), and yet we don't even use it that much, because the others are good enough for what we make of them.
Gigachad 8 hours ago [-]
Opus does get use though behind the scenes. Iirc discord uses opus and YouTube uses some pretty exotic audio codecs. No one saves opus files to their computer though.
RadiozRadioz 8 hours ago [-]
Opus is too complex. IIRC there is only one full implementation.
reddalo 15 hours ago [-]
The article only briefly mentions the real problem: outside of browsers, proper support for .webp files is very, very low. That's why JPEG is still king and probably will be for a long time.
AshleysBrain 15 hours ago [-]
WebP seems pretty widely supported to me - on Windows at least, Explorer shows thumbnails for them, Paint can open them, other editors like Paint.NET have built-in support... I haven't come across software that doesn't support WebP for a while.
frollogaston 14 hours ago [-]
Google Docs, of all things, does not support webp. Preview on Mac can open it but not edit. Those are my two most common use cases.
coryrc 12 hours ago [-]
I celebrated the anniversary of the (internal) bug asking for SVG support in Google slides. I think it's up to 15 years now?
So, uh, don't get your hopes up.
frollogaston 12 hours ago [-]
Well SVGs I understand being harder to support, those aren't really images. And various anti-injection security rules treat it as untrusted HTML code.
pimlottc 9 hours ago [-]
There is a workaround for using SVG in Google Slides by using Google Drive to convert to EMF (a format I’ve never heard of anywhere else). It’s a pain, though.
Yep, on Gnome we have both eog and GIMP that support webp completely, and have for many years. I don't think I've even tried with other apps but haven't needed to. I didn't even realize this was a problem for some platforms
CM30 14 hours ago [-]
Case in point, DaVinci Resolve. Incredibly popular with people creating videos for YouTube and TikTok, still doesn't support webp in 2025.
This becomes an issue if you're creating content about trending topics, since lots of marketing sites love using webp for every image.
nemomarx 15 hours ago [-]
If I want to even download a webp and look at the file, I need to convert it. Barely functional in basic image galleries outside of mobile?
k__ 15 hours ago [-]
Sometimes you upload a jpeg, they convert it to webp, and then don't allow uploading webps.
palmfacehn 15 hours ago [-]
I had uploaded lossless webp images; the 3rd-party site cached the images from my server and re-encoded them in a lossy format with a higher file size and lower fidelity.
edflsafoiewq 14 hours ago [-]
They'll do that to large PNGs too.
Acrobatic_Road 15 hours ago [-]
Also missing from popular browsers is support for the new JPEG XL format.
Looks like a mixture of runtime and compiler flags is needed, except on Safari.
JacobiX 15 hours ago [-]
I loved the article, but it overlooks one important point: although the JPEG format is frozen, encoders are still evolving!
Advances such as smarter quantization, better perceptual models, and higher-precision maths enable us to achieve higher compression ratios while sticking to a format that's supported everywhere :)
cogman10 15 hours ago [-]
This is true, but there are limits. It's a little bit like DEFLATE. Sure, very advanced compressors like Zopfli exist which can get better compression ratios. But then, there's also just Zstd, which will get a better compression ratio and compression speed trivially.
edflsafoiewq 15 hours ago [-]
I guess you're thinking of jpegli? Do you know how big a difference this actually makes?
qingcharles 4 minutes ago [-]
Jpegli is designed from the ashes of JPEG-XL (same author), both from Google. IIRC he also had a hand in the PNG format?
ksec 14 hours ago [-]
Anywhere from 5-15% if I remember correctly, depending on source material. I was at one point thinking this would make JPEG-XL and AVIF moot because all of a sudden JPEG became good enough again. But the author of JPEG-XL suggests there is still a lot the JPEG-XL encoder can do to further optimise bits per unit of quality, especially in the bpp range below 1.0.
JacobiX 14 hours ago [-]
MozJPEG, Guetzli and also Jpegli
legitster 15 hours ago [-]
This takes me back to when the goal of a webpage was to be less than 1Mb. At the time, the only reason to use PNG (such luxury) was when you needed transparency.
The variable compression of JPEG was very important. In Photoshop you could just grab an image and choose the file size you needed and the JPEG quality would degrade to match your design constraints.
modeless 11 hours ago [-]
The amazing thing about JPEG is people are still squeezing out backwards compatible compression improvements 30 years later. Mozjpeg is well known by now, but Jpegli was just published last year and does even better at high bitrates[1]. It's hard to want to adopt newer formats when they keep getting squeezed by further improvements to good old JPEG.
EDIT: I was wrong, and had the PNG and JPEG formats backward. As others correctly pointed out, PNG is lossless whereas JPEG is lossy. PNG is better for marketing / UI / artistic imagery and JPEG for photographs, since photographs tolerate JPEG's lossy encoding well; that seems to be the generally accepted opinion now.
Regardless, since the picture tag[0] was introduced I’ve used that for most image media by default, with relevant fallbacks and WebP as the default. It also allows loading appropriately sized images based on media queries, which is a nice bonus.
JPEG is lossy, in ways that were initially optimized for photographs. The details it loses are often not details photographs are good at providing in the first place. As the upside for losing some data, it gets to pick the data it gets to compress, and it chooses it in such a way as to minimize the size of the compressed data.
PNG is a lossless format. It's practically mandatory when you need 100% fidelity, as with icons or other graphics that are intended to have high contrast in small areas. It's able to optimize large areas of the same color very well, but suffers when colors change rapidly. It's especially unsuitable for photographs, as sensor noise (or film grain, if the source was film) create subtle color variations that are very difficult for it to encode efficiently.
You basically never have a situation where both are equally appropriate. They are for different things, and should be used as such.
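The asymmetry described above is easy to demonstrate with Python's stdlib zlib, which implements the same DEFLATE compression PNG uses internally (a sketch only: real PNG also applies per-scanline filters before DEFLATE):

```python
import random
import zlib

random.seed(42)

# A big flat-color area, like a logo background: 100k RGB pixels of one color.
flat = bytes([200, 30, 30]) * 100_000

# Noise-like data, standing in for photographic sensor noise / film grain.
noisy = bytes(random.randrange(256) for _ in range(300_000))

flat_c = zlib.compress(flat, 9)
noisy_c = zlib.compress(noisy, 9)

print(len(flat), len(flat_c))    # the flat data shrinks to a tiny fraction
print(len(noisy), len(noisy_c))  # the noise barely compresses at all
```

Same input size, wildly different output: repetitive flat color is nearly free, while noise is essentially incompressible for a lossless codec, which is exactly why photos belong in JPEG and graphics in PNG.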
MrDOS 15 hours ago [-]
PNGs for line art and text, JPEGs for photorealistic images.
> When I came up in the IE era
In the IE era I recall, the battle was between GIF and JPEG because IE supported alphatransparent PNGs very poorly :)
> if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG
The other way around: JPEGs are “lossy” – they throw away visual information to save file size. PNGs, on the other hand, are “lossless”, and decode back to exactly the same pixels that were fed into the encoder.
no_wizard 15 hours ago [-]
It’s been so darn long since I looked into these formats I got it backward. Thank you!
Tobani 15 hours ago [-]
They're different beasts.
JPEGS are great for photographs, where lossy compression is acceptable.
PNGs can have transparency and lossless compression, so they're great for things like rasterized vector graphics, UI-element parts of a web page, etc.
kevingadd 15 hours ago [-]
PNGs are also ideal for color accuracy since one of the things you lose quickly when converting to JPEG is the ability to have an exact RGB value flow from input to output, even at a high quality level. So if you want i.e. a banner graphic to seamlessly blend in with your site's background color, JPEG is worse for that.
martin_a 14 hours ago [-]
Sorry, but that sounds more like a problem with your color management systems and workflows and not so much with JPEG itself.
kevingadd 10 hours ago [-]
JPEGs are intrinsically YCbCr not RGB, so I don't know how you would avoid some precision loss when you combine that colorspace conversion with things like the quality level and (potentially) subsampling.
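A quick way to see that precision loss (a sketch using the standard JFIF/BT.601 full-range conversion with 8-bit rounding, not any particular JPEG library):

```python
def clamp(v):
    return max(0, min(255, v))

def rgb_to_ycbcr(r, g, b):
    # JFIF full-range BT.601 conversion, quantized to 8 bits as JPEG stores it
    y  = clamp(round( 0.299    * r + 0.587    * g + 0.114    * b))
    cb = clamp(round(-0.168736 * r - 0.331264 * g + 0.5      * b + 128))
    cr = clamp(round( 0.5      * r - 0.418688 * g - 0.081312 * b + 128))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = clamp(round(y + 1.402    * (cr - 128)))
    g = clamp(round(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)))
    b = clamp(round(y + 1.772    * (cb - 128)))
    return r, g, b

# Round-trip a grid of RGB values and count the ones that come back changed,
# even though no DCT, quantization table, or subsampling is involved yet.
total = mismatches = 0
for r in range(0, 256, 17):
    for g in range(0, 256, 17):
        for b in range(0, 256, 17):
            total += 1
            if ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b):
                mismatches += 1

print(mismatches, total)  # a sizable fraction of colors fail to round-trip
```

And this is only the colorspace stage; actual JPEG encoding adds quantization and (usually) chroma subsampling on top, so exact RGB round-trips are essentially off the table.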
qingcharles 2 minutes ago [-]
Are they? I thought JPG could switch? I know jpegli uses XYB colorspace in JPGs.
roelschroeven 15 hours ago [-]
It depends on the use case.
For archival purposes, where not losing details is more important than image size, PNG is better (though often TIFF is used for that use case). For images with large blocks of solid colors and sharp edges (text, line drawings), PNG is arguably better (though JPEG can be acceptable if you're careful with quality settings). If you need alpha support, go for PNG, since JPEG doesn't support that.
For photograph-like images, where image size is important, JPEG is preferred over PNG.
uses 15 hours ago [-]
PNG should be used for some types of graphics, like whenever you have big areas of solid color (like logos) or any time you need translucency / transparency. Although, nowadays you can and should use SVG in most of those cases.
JPEG should be used for everything else.
adgjlsfhk1 12 hours ago [-]
if your software supports it, lossless jpeg-xl supports all of these and should give you better compression
ghssds 15 hours ago [-]
JPEG and PNG don't work the same way. JPEG is lossy - it removes information from the original image, and PNG is lossless. For distribution, like on a website, with JPEG you can compress the image much more than PNG without the users noticing IF the image is photographic. If the JPEG represents a drawing, a screenshot, or contains labels, the lossy compression will be noticeable and PNG is much more appropriate. To keep as master for further modifications and compressions, keeping an image in JPEG format is a bad idea, again because JPEG is lossy. You can't ever encode an image with higher fidelity in JPEG vs PNG but both are useful.
sapiogram 15 hours ago [-]
> (if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG, at least in some circumstances)~
PNG is a lossless format, so I don't think that's possible, unless there's some specific feature that is not available in PNG.
addaon 15 hours ago [-]
> if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG, at least in some circumstances
24-bit color PNGs are lossless, to the extent that the input image is encodable in 24 bits of RGB (which is pretty much everything that's not HDR). There's no higher fidelity available for normal input images. If file size limits would force palettized PNGs, it's quite possible for a JPEG at the same file size to have higher fidelity (since it makes a different set of trade-offs, keeping color resolution but giving up spatial resolution instead); but this isn't really a common or particularly valid comparison in the PNG era; it was more of an issue when comparing to GIFs.
tl;dr: Nope, PNG is perfect. JPEG can approach perfect, but never get there. Comparison is only interesting with external (e.g. file size) constraints.
The reason JPEGs still rule is because Google Chrome removed support for JPEG-XL, the actually better photo format, because the Google guys who did AVIF decided they don't want competition.
userbinator 3 hours ago [-]
I have written a JPEG, GIF, and PNG decoder. All easily weekend projects. As for the other two I've attempted the same for:
JPEG2000: insanely complex and nonintuitive, especially the edge-cases and overly flexible encoding decisions
WebP: also complex, and effectively Google-proprietary
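To back up the "weekend project" claim: PNG's container is simple enough that a valid file can be built and re-parsed with nothing but Python's stdlib. An illustrative sketch (truecolor only, filter type 0, no interlacing):

```python
import struct
import zlib

def chunk(ctype, data):
    # A PNG chunk is: length + type + data + CRC32 over (type + data)
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data) & 0xFFFFFFFF))

def make_png(width, height, rgb=(255, 0, 0)):
    sig = b"\x89PNG\r\n\x1a\n"
    # IHDR: width, height, bit depth 8, color type 2 (truecolor), then
    # compression 0, filter 0, interlace 0
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)
    # One filter byte (0 = None) per scanline, followed by raw RGB pixels
    raw = (b"\x00" + bytes(rgb) * width) * height
    return (sig + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raw))
            + chunk(b"IEND", b""))

def read_dimensions(png):
    assert png[:8] == b"\x89PNG\r\n\x1a\n"
    length, ctype = struct.unpack(">I4s", png[8:16])
    assert ctype == b"IHDR" and length == 13
    return struct.unpack(">II", png[16:24])  # (width, height)

png = make_png(3, 2)
print(read_dimensions(png))  # (3, 2)
```

A real decoder also needs the filter reconstruction and palette/interlace paths, but the container itself is exactly this tame; JPEG2000's tier-1 coder, by contrast, is where a weekend turns into months.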
panja 2 hours ago [-]
Thoughts on jxl?
NooneAtAll3 1 hours ago [-]
have you tried QOI? ;)
hungryhobbit 15 hours ago [-]
I love how the article implies there's something flawed about webp at the end ... but if you click the link the only "flaw" reported is that webp isn't ubiquitous enough yet, so some sites don't support it
Perfect logic: let's not switch to webp because it's bad. Why is it bad? Not everyone has switched to it yet.
Lammy 10 hours ago [-]
Being a much more complicated format than JPEG (WebP is based on the VP8 video codec) invites a huge attack surface, because it requires so much more code to support. On top of that, the fact that 99% of people (even Apple's ImageIO!) use Google's libwebp means any exploit can hit almost everyone all at once. This has actually happened.
The lack of support makes me suspicious of it. If even Google Docs finds it too difficult to prioritize webp support, idk if there's some hidden problem with ease of implementation.
msabalau 14 hours ago [-]
As an enduser, I hate, hate, hate webp, because I cant' easily use the images in a wide range of ways.
Maybe it's vaguely more flexible and compresses well. I don't care. If someone uses it, I despise them.
fastball 15 hours ago [-]
> These days, the format is similar to MP3 or ZIP files—two legacy formats too popular and widely used to kill.
While killing MP3 might be difficult, the vast majority of people aren't handling audio files themselves these days, so probably not hard to phase out fairly rapidly.
frollogaston 14 hours ago [-]
I do handle files myself, but almost nothing keeps me on MP3. The only time I've even seen an MP3 in the past few years was when trying to get music into an old car system, which also sucked so much (e.g. shuffle feature with replacement) that I went back to just using the aux.
fastball 4 hours ago [-]
Yeah, most audiophiles/people I know who haven't moved entirely to Spotify/YouTube/Apple Music/etc are not storing files in mp3. FLAC or something else lossless is the obvious preference.
Krasnol 15 hours ago [-]
The Western World perspective on this platform generates quite funny statements sometimes. Outside of it, mp3 is still quite popular and normal.
Even within the Western World, there are many people who like to own their digital music.
geerlingguy 15 hours ago [-]
I still encode everything in MP3. The files work on my 20 year old SanDisk player, my original iPod, my iPhone, Mac, Chromebook, Windows laptop, MP3 CDs in our 08 minivan...
It's nice to have that consistent ubiquity, something very hard to find these days. Especially if your entire audio library (audio books, podcasts, songs) comes from some streaming service that requires an app!
ryandrake 11 hours ago [-]
MP3 is still "good enough." I wasn't smart enough to keep FLACs around, and I'm not going to go back through my hundreds of CDs in boxes in my attic somewhere and re-encode every single one to a "modern" format for slight but noticeable quality gains. As you say, they also work everywhere, including my 16 year old car.
frollogaston 14 hours ago [-]
The poor quality is a dealbreaker for me. Yes it's fine with a high enough bitrate, but there isn't ubiquitous support for that.
encom 8 hours ago [-]
What decoders don't support VBR MP3 at this point? You would have to go back at least 20 years to find software that breaks on VBR. Maybe some very very terrible hardware decoder chokes on it?
Incidentally, breakage on VBR bitstreams is buggy behaviour, because some lazy developers assumed frame sizes would never change. VBR is completely within spec, and decoders do not have to explicitly support it.
Lastly a note on bitrate: 320 kbps CBR (the max allowed in spec) is often wasteful and pointless. In many cases, an encoder will pad out frames to conform to the requested bitrate. Indeed tools exist that will losslessly re-encode a CBR file to VBR by removing the padding, producing a smaller file. MP3 (as good as it is) has certain problem samples that aren't fixed by throwing more bits at them. A competent encoder with proper settings, like lame which defaults to -V4 is transparent in most samples to most people. If you disagree you should double-blind test yourself.
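The "wasteful" part is easy to quantify, because CBR fixes the frame size regardless of how hard the audio is to encode. Using the MPEG-1 Layer III frame size formula (1152 samples per frame):

```python
def mp3_frame_bytes(bitrate_bps, sample_rate_hz, padding=0):
    # MPEG-1 Layer III: 1152 samples/frame -> 144 * bitrate / sample_rate bytes
    return 144 * bitrate_bps // sample_rate_hz + padding

# Every ~26 ms frame of a 320 kbps CBR file at 44.1 kHz costs this many
# bytes, even for near-silence that would fit in far fewer:
print(mp3_frame_bytes(320_000, 44_100))  # 1044
print(mp3_frame_bytes(128_000, 44_100))  # 417
```

VBR lets the encoder pick a different bitrate index per frame, which is why tools that losslessly repack CBR to VBR can shrink a file without touching the audio data at all.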
frollogaston 8 hours ago [-]
I'll have to check again, but my issue wasn't VBR, it was CBR above 128kbit/s I believe? 2012 Maserati GranTurismo, which has notoriously not-so-good electronics.
fastball 4 hours ago [-]
People that like to own their digital music in the Western World are generally not storing the files in mp3.
Where outside the West are you seeing many people that are specifically storing audio files in mp3 (vs just streaming/ storing in a better digital format/storing physical media)?
I live in SEA and most people here are not storing their own mp3s. Most people don't have computers at all – they have budget Android phones that don't have much built-in storage. What they do have is cheap internet, so they are either using Spotify (Free)/YouTube/etc. Many people still use CDs (mostly in cars) but those aren't mp3 either.
conradfr 15 hours ago [-]
Like when someone says spending $1000 to learn to code effectively using Claude or any other AI is nothing.
lucgommans 9 hours ago [-]
> It’s been difficult to remove [old JPEG] from its perch. [...] the formats AVIF and HEIC, each developed by standards bodies, have largely outpaced [JPEG]
- JPEG has two advantages on slow connections: the dimensions are either stored up front so the layout doesn't jump, or maybe the renderer is better; and it loads a less-sharp version first and progressively gets sharper
- JPEG was way faster when compressing and decompressing
- on the particular photo I wanted to optimise in this instance, JPEG was also simply the best quality for a given filesize which really surprised me after 32 years of potential innovation
Regarding AVIF, my n=1 experience was that it "makes smooth gradients where jpeg degrades to blotchy pixels, but at decent quality levels, jpeg preserves the grain that makes the photo look real". Gradients instead of ugliness at really small sizes can be perfect for your use-case, but note that it's also ~80 times slower at compression (80s vs. <1s)
JPEG XL isn't widely supported in browsers yet so I couldn't use it
> These days, the [JPEG] format is similar to MP3
The difference with mp3 is that Opus is either a bit better or much better, but it's always noticeably better.
You can save ~half the storage space. For speech (audio books) I use 40kbps, and for music maybe 128kbps which is probably overkill. And I delete the originals without even checking anymore if it really sounds the same, I noticed that I simply can't tell the original apart in a blind test, no matter what expensive headset setup I try
TFA attributes it to a simple "they were first" advantage, but I think the real answer to "Why JPEGs still rule the web" is that no file format is better than JPEG in the way Opus is better than MP3; with Opus you don't have to think about it anymore, and it's always a win in either filesize or quality
That said, Opus is also annoyingly hard to get into people's minds, but I've done it and you also see major platforms from compress-once-serve-continuously video (e.g. Youtube) to VoIP (e.g. Whatsapp) switching over for all their audio applications
ksec 15 hours ago [-]
I want to add a slightly off topic point here.
This submission was originally shown as [dead]. I have no idea why, I read some of the content and seems decent enough, especially in the current state of things when JPEG-XL is blocked because of AOM / Google Chrome. I vouched for it and upvoted, then somehow it is on the front page.
I wonder if [dead] means someone flagged it. If so, then why? If not, why is it dead?
carlosjobim 11 hours ago [-]
Almost every new submission to Hacker News gets [dead] and [flagged] for no reason, and somebody has to vouch for it. I don't know if it's an automated system or some kind of activists stalking the New section.
encom 8 hours ago [-]
Flagging seems to be used like a super downvote on both submissions and comments. Admins appear okay with it.
hoseja 37 minutes ago [-]
There is a photo example for quality levels throughout the article, but instead of classical JPEG compression artifacts it just seems to get progressively more posterized. What's up with that?
t1234s 11 hours ago [-]
I notice webp produces images about 20% smaller than mozilla jpeg while appearing slightly sharper. I use a <picture> element to offer webp versions first but always include jpeg versions for future-proofing.
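For anyone who hasn't used it, that pattern looks roughly like this (file names are made up for illustration): the browser takes the first <source> whose type it supports and falls back to the <img> otherwise.

```html
<picture>
  <source type="image/webp"
          srcset="photo-480.webp 480w, photo-1280.webp 1280w"
          sizes="(max-width: 600px) 480px, 1280px">
  <img src="photo-1280.jpg"
       srcset="photo-480.jpg 480w, photo-1280.jpg 1280w"
       sizes="(max-width: 600px) 480px, 1280px"
       alt="description of the photo">
</picture>
```

Browsers that don't know WebP (or, someday, a source with type="image/jxl") silently skip to the JPEG, so nothing breaks.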
karim79 8 hours ago [-]
I've not a single clue as to the future of image formats. All of them (or almost all) have merits and things they suck at. I've been building a service which makes no judgement about it but tries to provide all the choices. JXL will be added soon.
The actual issue is that this is not an issue for the users. Only global providers, so they were the ones pushing "obscure" solutions. Music fidelity was more of a consumer problem, so formats like FLAC found a foothold besides the go-to one.
MallocVoidstar 15 hours ago [-]
JPEG: No active patents, universal support, good enough.
HEIC: Have fun licensing this.
WebP: Slightly better than JPEG maybe, but only supports 1/4 chroma resolution in lossy mode so some JPEGs will always look better than the equivalent WebP.
AVIF: Better than JPEG probably, but encoders for AV1 are currently heavily biased towards blurring, even at very high bitrates. Non-Chrome browser support took a while.
dylan604 15 hours ago [-]
Sometimes a format is simply good enough. The gains that WebP may or may not have definitely do not get it over the hump of being so tied to a browser. AVIF is still very new, but other than the animation stuff, is it enough of an improvement to get people to switch? And that brings me to the entire corpus of existing files. A JPEG decoder will always be necessary based on the amount of preexisting files requiring it. If you have to support JPEG for some, you might as well continue making new files in the same format as well.
frollogaston 10 hours ago [-]
"Do I look like ah know what a web-P is? I just want a jaypeg of a gosh darn hotdog!" is my attitude on this
llm_nerd 15 hours ago [-]
JPEG XL: Better than JPEG in every way aside from legacy support, while being royalty free and open.
You can even "losslessly" compress existing JPEGs to JPEGXL.
JPEG XL is the natural replacement for JPEG and it is perverse that Google backtracked on supporting it.
addaon 15 hours ago [-]
> JPEG XL: Better than JPEG in every way aside from legacy support, while being royalty free and open.
And decoder complexity. A software JPEG decoder is a weekend project. A hardware JPEG decoder not much more. Doing the same for arbitrary JPEG XL files is much, much more complicated. In any world where any of development cost, implementation complexity, expected code quality (especially when using first-order assumptions like constant number of defects per line of code), or decoder resources (especially for hardware implementations) are important, JPEG has serious advantages.
llm_nerd 11 hours ago [-]
If we were talking about some abstract hypothetical format, this is entirely reasonable. Only there are a number of extremely high quality JPEG XL decoders. There is zero need for hardware assistance for decoding JPEG or JPEG XL, so that difference is a non-difference.
Every device in the world with iOS 17 or macOS 14 or better has JPEG XL support across the system.
This is a complete and utter non-issue. Google had even added JPEG XL support to Chromium, and then bizarrely removed it (not long before Apple fully supported JXL across all their platforms which invariably would have pushed it over the top), presumably to try to anoint WebP as the successor. Only WebP has so many disadvantages that all they did was entrench classic JPEG.
JPEG XL is unquestionably the best current next gen format for images.
addaon 9 hours ago [-]
> There is zero need for hardware assistance for decoding JPEG or JPEG XL, so that difference is a non-difference.
This depends on the system requirements, doesn't it? Suppose you're compositing a low-safety-impact video stream with (well, under) safety-impacting information in an avionics application, and you're currently using a direct GMSL link. There's an obvious opportunity to cost-down and weight-down the system by shifting to a lightly compressed stream over an existing shared Ethernet bus, and MJPEG is a reasonable option for this application (as is H.264, and other options -- trade study needed). When considering replacing JPEG with JPEG XL in this implementation, what's your plan for providing partitioning between the "extremely high quality" but QM software implementation and the DAL components? Are you going to dedicate a core to avoid timing interference? At that point you're spending more silicon than a dedicated JPEG decoder would take. You likely already have an FPGA in the system for doing the compositing itself, but what's the area trade-off between an existing "extremely high quality" JPEG XL hardware decoder and the JPEG one that you've been using for decades?
I don't doubt that in a world where everything is an iPhone (with a token nod to Android), "someone already wrote the code once and it's good enough" is sufficient. But there's a huge field of software engineering where complexity and quality drive decision making, and JPEG XL really is much more complex than JPEG Classic Flavor.
15 hours ago [-]
_kidlike 15 hours ago [-]
what about PNG?
kccqzy 14 hours ago [-]
Apples and oranges. PNGs aren't used for photographic images. It's good for line art, certain cartoons, pixel art and the like.
zargon 14 hours ago [-]
Entirely different category.
tsoukase 6 hours ago [-]
There are only two kinds of image file formats: the ones people complain about and the ones nobody uses.
greenavocado 15 hours ago [-]
DO NOT USE WEBP
JPEG-XL is the superior format. The only reason WebP exists is not because of natural selection, but because of nepotism (Google Chrome).
JPEG-XL is supported by exactly 1 browser. WebP and AVIF are supported by just about every browser.
I'd love to use JPEG-XL, but I'm guessing the only way to do that is also bringing along a WASM decoder everywhere I want to use it.
wpollock 11 hours ago [-]
This situation might change if Google is forced to divest itself from Chrome. This is currently in the US courts, but it might take awhile.
14 hours ago [-]
caycep 14 hours ago [-]
due to above said nepotism...
standards require some politicking and money I suppose
Arnt 14 hours ago [-]
Standards do require that. Someone has to show up at the right meetings and conferences, and it's often necessary to contribute code too.
"Build it and they will come" doesn't work for products, and it doesn't work for standards either.
Arnt 13 hours ago [-]
... and I want to add that there's no need to assume nepotism.
If you get code merged into something like Chrome, and it's big and goes unused for a few years, at some point some security-minded person will come along and argue that your code is an unused attack surface and should be removed again.
ericmcer 13 hours ago [-]
Standardization is the most important feature a technology can have. Look at JS.
frollogaston 9 hours ago [-]
Standardization is a big feature of JS, but it's also a surprisingly good language for its use cases. There was even good reason to port it over to the backend (NodeJS).
josefx 11 hours ago [-]
> Look at JS.
Given how much duct tape it took at times to get various browsers to behave, I would say JS is proof of the opposite. It succeeded in an environment where standards were a mere suggestion.
frollogaston 9 hours ago [-]
Don't they all run JS ES5 itself the same way? It's more that each has different feature sets (HTML5 stuff, webrtc, wasm, etc), which are callable from JS but would've been a problem regardless of the language.
9 hours ago [-]
NoMoreNicksLeft 14 hours ago [-]
There are only two browsers, Safari and Firefox.
Acrobatic_Road 14 hours ago [-]
only Nightly
whywhywhywhy 15 hours ago [-]
Upload a JPEG, gets converted to WEBP
next person downloads the WEBP and converts it to JPEG to actually edit it in software, because even things like Photoshop / macOS Preview can't edit one natively, saves to JPG and uploads, gets converted to WEBP
next person downloads the now JPEG's WEBP'd JPEG'd WEBP'd image
fast forward a decade and the original quality versions of the images no longer exist, just these degraded copies
blooalien 14 hours ago [-]
Even if you have image editing software that directly supports WEBP format input / output, you still have the exact same problem, because (like JPEG) it's a lossy format which will lose fidelity with each successive generation of load / save over time. If you intend to edit an image, then the original (if possible) and its edits should both be saved in a lossless format even if the final published output is saved in a lossy format. If no lossless version of the original is available, then the highest quality version of the original should be converted to a lossless format for "archival" before editing, and that copy should always be used for editing purposes, rather than piling loss upon loss by editing the lossy format and re-saving. It's kinda like copies of copies of copies of MP3 audio. Eventually it becomes a soupy mess not worth using.
whywhywhywhy 4 hours ago [-]
doubles the speed of the degradation, because the image is re-encoded on upload regardless of size, stacking the encoding artefacts twice
vunderba 12 hours ago [-]
Well... kinda. WebP is a bit of an unusual format in that it supports both lossy and lossless forms of compression.
Most decent image editing software (Photoshop, Pixelmator, etc) will let you choose what you want.
So, what you are saying is that JPEG-XL is superior, as long as your use case is insensitive to whether the majority of web users can view your content?
alwillis 14 hours ago [-]
I get it but 2+ billion Apple device users is not nothin’.
They can render JPEG-XL; everything else will render the fall back format like JPEG or WebP.
throw_m239339 14 hours ago [-]
Teams & developers will likely only chose a single format if they can, the one that most browsers support, because doing some content negotiation is more code, more work. It doesn't take away anything from the parent point though, if JPEG-XL is more performant it could reduce bandwidth requirements.
alwillis 11 hours ago [-]
> because doing some content negotiation is more code, more work
It's actually not more work. The user's browser automatically handles the content negotiation and only downloads the image format it understands:
macOS, iPadOS and iOS get the JPEG-XL image, devices that can handle WebP get that, and everything else gets JPEG.
There are several image delivery services that will provide the best image format depending on the device that's connecting.
Zardoz84 14 hours ago [-]
just fuking use a polyfill to add support for JPEG XL. Or store JPEG XL and convert on the fly to JPEG to supply browsers that don't support JPEG XL.
> just fuking use a polyfill to add support for JPEG XL. Or store JPEG XL and convert on the fly to JPEG to supply browsers that don't support JPEG XL.
Doesn't a polyfill imply more Javascript running on the device?
BugsJustFindMe 14 hours ago [-]
Only until the browser gets updated and then the polyfill stops being invoked automatically. It's self-healing.
Zardoz84 12 hours ago [-]
It's a WASM module, and it really isn't very big. So if you need to store and serve many image files, potentially big images, as in preservation & diffusion software (like the stuff I'm working on), I felt I could afford to pay the extra few KiBs for that WASM.
Spivak 15 hours ago [-]
I mean if you define superior not in terms of its technical merits but because of its blessed status by Google. It's not a terrible format but it is very much "what is the worst quality we can reasonably get away with to save on bandwidth." Such a thing does have its uses.
At some point you have to be pragmatic and meet users where they are but doesn't mean you have to like that Google did throw their weight around in a way that only they really can.
dragonwriter 14 hours ago [-]
> I mean if you define superior not in terms of its technical merits
“Technical merits” are rarely, for anything, the sole measurement of fitness for purposes.
Even for purely internal uses, internal social, cultural, and non-technical business constraints often have a real impact on what is the best choice, and when you get out into wider uses with external uses, the non-technical factors proliferate. That's just reality.
I understand the aesthetic preference to have decisions only require considering a narrow set of technical criteria which you think should be important, but you will make suboptimal decisions in the vast majority of real-world circumstances if you pretend that the actual decision before you conforms to that aesthetic ideal.
palmfacehn 15 hours ago [-]
I enjoy webp lossless mode
frollogaston 14 hours ago [-]
I'm not going to use webp or jpeg-xl.
RankingMember 12 hours ago [-]
Yeah they're both a pain in that there's friction at all to open/use them.
gjsman-1000 15 hours ago [-]
JPEG XL is superior... just like how Betamax was visually superior.
greenavocado 15 hours ago [-]
The main problem with your argument is that people want JPEG-XL. The main reason it is not in is purely due to Jim Bankoski's limited judgement, intelligence, and foresight.
Just like how some people wanted Betamax. Politics drives adoption; not technical superiority.
izacus 14 hours ago [-]
"People" here are a tiny minority of techies that have an emotional connection to the library they built, and a group of OSSers who will take anything that bashes Google as gospel.
Everyone else... is fine with JPEG. And occasionally shares HEIC pictures from their iPhones.
greenavocado 14 hours ago [-]
Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.
throw_m239339 14 hours ago [-]
> Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.
XHTML 2 is waiting on that one... Oh well...
gsich 9 hours ago [-]
>just like how Betamax was visually superior.
Only for a very brief period. (Beta 1)
77pt77 15 hours ago [-]
How come jpeg 2000 never became popular?
izacus 14 hours ago [-]
In 99% cases of "Why wasn't this (better) format adopted?" the answer is:
* It's not actually better.
* It's patented/requires license/is owned by someone who wants a lot of royalties.
JPEG2000 is of the second variety.
SimplyUnknown 15 hours ago [-]
Multiple reasons: while technically better, with more benign compression artifacts, it is computationally more expensive, offers limited quality improvements, is encumbered by patents, and has a poor metadata format and poor colorspace support... In the end, the benefits aren't great enough compared to JPEG to change the default format.
AshleysBrain 15 hours ago [-]
IIRC JPEG2000 was never supported by any browser other than Safari, and even Safari recently gave up and removed support (around the same time they added support for JPEG XL). As to why other browsers never supported it, I'm not sure.
Maken 15 hours ago [-]
JPEG 2000 used to be a patent minefield.
meindnoch 15 hours ago [-]
Greed. Wavelets used to be heavily patented.
llm_nerd 15 hours ago [-]
Orthogonal, but one fun thing about JPEG 2000 is that when you watch a movie at movie theatres now, odds are overwhelming that you are watching a sequence of JPEG 2000 encoded images.
77pt77 14 hours ago [-]
So like mjpeg but for jpeg 2000?
That doesn't even make much sense because you lose inter-frame compressibility
meindnoch 12 hours ago [-]
Every frame is a keyframe in digital cinema. It's literally a bunch of JPEG2000 files, wrapped in an MXF container. A typical 2hr movie is in the 300-400GB ballpark.
martin_a 14 hours ago [-]
As others have pointed out, JPEG is just fine. It's "enough" in the best sense of the word and gets its job done. It's supported on every device, in every browser, image viewer and whatnot. It. Just. Works.
Maybe there are formats that compress better or losslessly, but thanks to advancements in disk space and transfer rates (I know, not everywhere, but penetration and improvement will happen...) the disadvantages of JPEG can be handled and we can just enjoy a very simple file format.
In an era where enshittification lingers around every corner, I'm just happy that I don't need to think about whether I have to convert _every digital picture I've ever taken_ into some next-gen format because some license runs out or whatnot. It just works. Let's enjoy that and hope it sticks around for 30 more years.
politelemon 15 hours ago [-]
The last part of the article about jpeg being outpaced seems anecdotal or unsubstantiated, it needs context.
cadamsdotcom 6 hours ago [-]
“Because it has a 30 year head start. And there are better, new, formats coming all the time; but everyone supporting them hates each other.”
jmyeet 15 hours ago [-]
So there are three use cases for public image dissemination:
1. Lossy: JPEG fills this role;
2. Lossless: this was GIF but is now PNG; and
3. Animated: GIF.
So for a format to replace JPEG, it must bring something to the table for lossy compression. Now that JPEG is patent-free, any new format must be similarly unencumbered. And it's a real chicken-and-egg problem in getting support on platforms such that people will start using it.
I remember a similar thing happening with YouTube adding VP9 (IIRC) support as an alternative to H.264, which required an MPEG LA patent license. MPEG LA also tried to cloud VP9 by saying it infringed on their patents anyway. No idea if that's true or not, but nobody wants that uncertainty.
Anyway, without total support for VP9 (which Apple devices didn't have, for example), YouTube would need to double the storage space required for videos by keeping both codecs. That's really hard to justify.
Same goes for images. You then need to detect and use a supported image format... or just use JPEG.
shmerl 9 hours ago [-]
I'm trying to replace jpeg with avif in my use cases, but some sites still don't handle avif.
Even Github! Though the latter doesn't support IPv6 either.
echelon 15 hours ago [-]
Nothing supports WebP.
Most websites break with WebP. Desktop tools choke on WebP.
It sucks, because it's a good format.
JonnyReads 15 hours ago [-]
I've recently been battling with github not having support for WebP. Seems odd for a website not to support an image format when the browser does.
afavour 15 hours ago [-]
> Most websites break with WebP
That part at least isn't true.
edflsafoiewq 15 hours ago [-]
Probably they mean uploading WebPs where an image is expected often doesn't work.
echelon 14 hours ago [-]
Yes, I was referring to the backends.
A funny case in point: Sora.com produces WebP outputs, but you can't turn around and use them as inputs. (Maybe they've fixed that?)
Smaller websites almost always reject them. Even within big websites, support is fractured within the product surface area. You can't use them as Reddit profile icons, for instance.
One of the most apparent issues is that a lot of thumbnailing and CDN systems don't work natively with WebP, so you have to reject WebP outright until broader support is added.
Once the WebPs are in your systems, you have to make sure everything downstream can support them... it's infectious.
Really unfortunate that we haven't been able to move past this.
martin_a 14 hours ago [-]
You can think about WordPress what you want, but AFAIK it does not allow uploading WebP files to its media gallery without additional plugins, despite being said to run large parts of the internet...
celsoazevedo 4 hours ago [-]
Just tested on my self-hosted WordPress blog and it works without any plugins.
It's worth noting that Firefox is willing to adopt JPEG-XL[1] as soon as the Rust implementation[2] is mature, and that Rust impl is a direct port of the reference C++ implementation[3]. macOS and Safari already support JPEG-XL[4]. And recently Windows picked up JPEG-XL support[5]. The only blockers at this point are Firefox, Chromium, and Android. If/when Firefox adopts JPEG-XL, we'll probably see Google follow suit, if only out of pressure from downstream Chromium platforms wanting to adopt it to maintain parity.
So really if you want to see JPEG-XL get adopted, go throw some engineering hours at the rust implementation [2] to help get it up to feature parity with the reference impl.
-----
1. https://github.com/mozilla/standards-positions/pull/1064
2. https://github.com/libjxl/jxl-rs
3. https://github.com/libjxl/libjxl
4. https://www.theregister.com/2023/06/07/apple_safari_jpeg_xl/
5. https://www.windowslatest.com/2025/03/05/turn-on-jpeg-xl-jxl...
What this [removal of support for JPEG-XL in Chromium] really translates to is, “We’ve created WebP, a competing standard, and want to kill anything that might genuinely compete with it”. This would also partly explain why they adopted AVIF but not JPEG XL. AVIF wasn’t superior in every way and, as such, didn’t threaten to dethrone WebP.
[0] https://vale.rocks/posts/jpeg-xl-and-googles-war-against-it
This is less blind pressure and more the risk that Google becomes seen as an untrustworthy custodian of Chromium, and that downstreams start supporting an alternate upstream outside of Google's control.
JXL is certainly a hill that Google seems intent to stand on, but I doubt it's one they'd choose to die on. Doubly so given the ammo it'd give in the ongoing Chrome antitrust lawsuits.
HEIC was developed by the MPEG folks and is an ISO standard, ISO/IEC 23008-12:2022:
* https://www.iso.org/standard/83650.html
* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...
An HEIC image is generally a still frame from ITU-T H.265† (HEVC):
* https://www.geeky-gadgets.com/heif-avc-h-264-h-265-heic-and-...
OS support includes Windows 10 v1803, Android 10+, Ubuntu 20.04, Debian 10, Fedora 36. Lots of cameras and smartphones support it as well.
There's nothing Apple-specific about it. Apple went through the process of licensing H.265, so they got HEIC 'for free' and use it as the default image format because, unlike JPEG, it supports HDR, >8-bit colour, etc.
†Like WebP was similar to an image/frame from a VP8 video.
And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!
Cmd+shift+4 is now the only way to grab an image out of a browser. Which is annoying.
It has made my life needlessly more complicated. I wish it would go away.
Maybe if browsers auto-converted when you dragged an image out of the browser window I wouldn't care, but when I see webp… I hate it.
Windows didn't use to show .jpgs in Windows Explorer. I know because I wrote a tool to generate thumbnail HTML pages to include on archive CDs of photos.
To solve this problem, some format has to "win" and get adopted everywhere. That format could be WebP, but it will take 3-10 years before everything supports it. It's not just the OS showing it in its file viewer. It's its preview app supporting it. It's every web site that lets you upload an image (gmail/gmaps/gchat/facebook/discord/messenger/slack/your bank/apartment-rental-companies, etc..etc..etc..). It just takes forever to get everyone to upgrade.
bin/webp2png:
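The script body didn't survive the thread; a minimal hypothetical sketch of what such a wrapper might look like, assuming `dwebp` from Google's libwebp tools is on PATH:

```shell
# Hypothetical bin/webp2png wrapper -- not the original script.
# Assumes the `dwebp` decoder (libwebp package) is installed.
webp2png() {
    in="$1"
    out="${in%.webp}.png"     # same name, .png extension
    dwebp "$in" -o "$out"     # decode the WebP to a PNG next to it
}
```

Run as `webp2png photo.webp` to get `photo.png`; the same idea works with ImageMagick's `magick photo.webp photo.png`.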
Also, I just checked and Powerpoint has no problem dropping in a webp image. Gimp opens it just fine. You are right that web forums are often well behind the times on this.
Just drop the offending image onto the icon in the dock.
I once edited Firefox config to make it pretend to not support WebP, and the only site that broke was YouTube.
The last major browser to add support was Safari 16 and that was released on September 12, 2022. I see pretty much no one on browsers older than Safari 16.4 in metrics on websites I run.
With one format you get decent filesize, transparency, and animation which makes things much simpler than doing things like conditionally producing gifs vs jpegs.
Say what? A random scan across the internet will reveal more videos in MP4 / H.264 format than AV1. Perhaps streaming services have switched, but that is not what regular consumers usually use to make and store movies.
AV1 was created and is backed by many companies via a non-profit industry consortium, solves real problems, and its momentum continues to grow. https://bitmovin.com/blog/av1-playback-support/
[edit: clarify that it's decoding only]
https://pngmini.com/lossypng.html
https://pngquant.org/
https://css-ig.net/pinga
Photography nerds will religiously store raw images that they then never touch. They're like compliance records.
No photog nerd wants EVEN MORE POSTPROCESSING.
JPEG, or fancier jpeg: https://developer.android.com/media/platform/hdr-image-forma...
Well, as JPEGs? Why not? Quality is just fine if you don't touch the quality slider in Photoshop or other software.
For "more" there's still lossless camera RAW formats and large image formats like PSD and whatnot.
JPEG is just fine.
What I mean is that JPEG's squarish artifacts look OK, while AV1's angular artifacts look distorted.
I use .webp often and I don't understand this. At least on Windows 10 I can go to a .webp and see a preview and double-click and it opens in my image editor. Is it not like this elsewhere?
It may be obsolete, but it is ubiquitous. I care less about cutting edge tech than I do about the probability of being able to open it in 20+ years. Storage is cheap.
Presentation is a different matter and often should use a different format than whatever you store the original files as.
I took a TIFF and saved it as a high quality JPG, loaded both into Photoshop and "diffed" them (basically subtracted the two layers). After some level adjustment you could see some difference, but it was quite small.
I had some high resolution graphic works in TIFF (BMP + LZW). To save space, I archived them using JPEG-2000 (lossless mode), using the J2k Photoshop plug-in ( https://www.fnord.com/ ). Saved tons of GBs. It has wide multi-platform support and is a recognized archival format, so its longevity is guaranteed for some time on our digital platforms. Recently explored using HEIF or even JPEG-XL for these but these formats still don't handle CMYK colour modes well.
I've done several tests where I lowered the quality settings (and thus, the resulting file size) of JPEG-XL and AVIF encoders over a variety of images. In almost every image, JPEG-XL subjective quality fell faster than AVIF, which seemed mostly OK for web use at similar file sizes. Due to that last fact, I concede that Chrome's choice to drop JPEG-XL support is correct. If things change (JPEG-XL becomes more efficient at low file sizes, gains Chrome support), I have lossless PNG originals to re-encode from.
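A rough sketch of that kind of sweep, assuming the `cjxl` (libjxl) and `avifenc` (libavif) CLI tools are installed; the quality values and output names here are illustrative, not the original test setup:

```shell
# Encode the same lossless original at several quality levels in both
# formats, then compare file sizes and subjective quality by hand.
quality_sweep() {
    src="$1"                       # a lossless PNG original
    for q in 90 75 60 45 30; do
        cjxl "$src" "out-q$q.jxl" -q "$q"       # JPEG XL, lossy VarDCT mode
        avifenc -q "$q" "$src" "out-q$q.avif"   # AVIF via libavif
    done
    ls -l out-q*.jxl out-q*.avif   # eyeball sizes at each quality step
}
```

Viewing the outputs side by side at matched file sizes is what reveals the "quality falls faster" behavior described above.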
I've found that sometimes WebP with lossless compression (-lossless) results in smaller file sizes for graphics than JPEG-XL and sometimes it's the other way around.
When looking for a format to display HQ photos on my website I settled on a combination of AVIF + JPG. Most photos are AVIF, but if AVIF is too magical compared to JPG (like 3x-10x smaller) I use a larger JPG instead. "Magic" means that fine details are discarded.
WebP discards gradients (like sunset, night sky or water) even at the highest quality, so I consider it useless for photography.
And JPG for photos taken on a “real” camera (including scanned negatives). Sometimes RAW, but they’re pretty large so not often.
I don't like AVIF, at least not for photos I want to share. I think AVIF is great for "a huge splash image for a web page that nobody is going to look at closely" but if you want something that looks like a pro photo I don't think it's better than WebP. People point out this example as "AVIF is great"
https://jakearchibald.com/2020/avif-has-landed/demos/compare...
but I think it badly mangles the reflection on the left wing of the car and... it's those reflections that make sports cars look sexy. (I'll grant that the 'acceptable' JPEG has obvious artifacts, whereas the 'acceptable' AVIF replaced a sexy reflection with a plausible but slightly dull one.)
You can take any random device and it will be able to decode h264 at 4k. h265 not so much.
As for AV1 - my Ryzen 5 5500GT, released in 2024, does not support it.
Addendum: AMD since RDNA2 (2020-2021-ish) [1], NVIDIA since 30 series (2020) [2], Apple since M3? (2023).
Note: GP's processor released in 2024 but is based on an architecture from 2020.
[0] https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardwar...
[1] https://en.wikipedia.org/wiki/Video_Core_Next#Feature_set
[2] https://developer.nvidia.com/video-encode-and-decode-gpu-sup...
There are whole discussions about how modern codecs, especially AV1, simply don't care about psy (perceptual) image quality, and hence how most torrents are still using x265, because AV1 simply doesn't match the quality offered by other encoders / x265. Nor does the AOM camp care about it, since their primary use case is YouTube.
>in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps.
It is not, and never will be. MP3 has an inherent disadvantage and needs substantially higher bitrate for quite a lot of samples, even at 320kbps. We went through this war for 10 years at Hydrogenaudio, with data to back it up; I don't know why in the past 2-3 years the topic has popped up again.
MP3 is not better than AAC-LC in any shape or form, even at 25% higher bitrate. Just use AAC-LC, or specifically Apple's QuickTime AAC-LC encoder.
In early AV1 encoders, psychovisual tuning was minimal and so AV1 encodes often looked soft or "plastic-y". Today's AV1 encoders are really good at this when told to prioritize psy quality (SVT-AV1 with `--tune 3`, libaom with `--tune=psy`). I'd guess that there's still lots of headroom for improvements to AV1 encoding.
> And hence how most torrents are still using x265 because…
Today most torrents still use H.264, I assume because of its ubiquitous support and modest decode requirements. Over time, I'd expect H.265 (and then AV1) to become the dominant compressed format for video sharing. It seems like that community is pretty slow to adopt advancements — most lossy-compressed music <finger quotes>sharing</finger quotes> is still MP3, even though AAC is a far better (as you note!) and ubiquitous choice.
My point about MP3 vs. AAC was simply: As you reduce the amount of compression, the perceived quality advantages of better compressed media formats is reduced. My personal music library is AAC (not MP3), encoded from CD rips using afconvert.
That's not what I'm seeing for anything recent. x265 seems to be the dominant codec now. There's still a lot of support for h.264, but it's fading.
https://cloudinary.com/blog/what_to_focus_on_in_image_compre...
Part of it being merged for now.
It is unfortunate this narrative hasn't caught on. Actual quality over VMAF and PSNR. And we haven't had further quality improvement since x265.
I do get frustrated every time the topic of codecs comes up on HN. But then the other day I came to realise I've spent ~20 years on Doom9 and Hydrogenaudio, so I guess I've accumulated more knowledge than most.
One of JPEG XL's best ideas was incorporating Brunsli, lossless recompression for existing JPEGs (like Dropbox's Lepton which I think might've been talked about earlier). It's not as much of a space win as a whole new format, but it's computationally cheap and much easier to just roll out today. There was even an idea of supporting it as a Content-Encoding, so a right-click and save would get you an OG .jpg avoiding the whole "what the heck is a WebP?" problem. (You might still be able to do something like this in a ServiceWorker, but capped at wasm speeds of course.) Combine it with improved JPEG encoders like mozjpeg and you're not in a terrible place. There's also work that could potentially be done with deblocking/debanding/deringing in decoders to stretch the old approach even further.
And JXL's other modes also had their advantages. VarDCT was still faster than libaom AVIF, and was reasonable in its own way (AVIFs look smoother, JXL tended more to preserve traces of low-contrast detail). There was a progressive mode, which made less sense in AVIF because it was a format for video keyframes first. The lossless mode was the evolution of FUIF and put up good numbers.
At this point I have no particular predictions. JPEG never stopped being usable despite a series of more technically sophisticated successors. (MP3 too, though its successors seemed to get better adoption.) Perhaps it means things continue not to change for a while, or at least that I needn't rush to move to $other_format or get left behind. Doesn't mean I don't complain about the situation in comments on the Internet, though.
> Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.
Right again, and WebP fits naturally into a web backend pipeline. I wouldn't knock it for not being compatible with local tools; it was designed for the web first and foremost, I think it's in the name.
> Key features of the JPEG XL codec are: > lossless JPEG transcoding,
> Moreover, JPEG XL includes several features that help transition from the legacy JPEG coding format. Existing JPEG files can be losslessly transcoded to JPEG XL files, significantly reducing their size (Fig. 1). These can be reconstructed to the exact same JPEG file, ensuring backward compatibility with legacy applications. Both transcoding and reconstruction are computationally efficient. Migrating to JPEG XL reduces storage costs because servers can store a single JPEG XL file to serve both JPEG and JPEG XL clients. This provides a smooth transition path from legacy JPEG platforms to the modern JPEG XL.
https://ds.jpeg.org/whitepapers/jpeg-xl-whitepaper.pdf
If you need more proof, you could transcode a JPEG to JPEG XL and then convert it back to JPEG. The resulting image would be binary-identical to the original image.
However, perhaps you are talking about a JPEG XL image using features only available in JPEG XL (24-bit, HDR, etc.) that obviously couldn't be converted in a lossless way to a JPEG.
Trying around with some jpg and jxl files I cannot convert jxl losslessly to jpg files even if they are only 8bit. The jxl files transcoded from jpg files show "JPEG bitstream reconstruction data available" with jxlinfo, so I think some extra metadata is stored when going from jpg to jxl to make the lossless transcoding possible. I can imagine not supporting the reverse (which is pretty useless anyway) allowed for more optimizations.
A lot of those features (non-8×8 DCTs, Gaborish and EPF filters, XYB) are enabled by default when you compress a non-JPEG image to a lossy JXL. At the moment, you really do need to compress to JPEG first and then transcode that to JXL if you want the JXL→JPEG direction to be lossless.
So he was not wrong about this. You have perfect JPEG -> JPEG XL conversion, but not the other way around.
With codecs built for that purpose, I hope. Intra-frame-only "formats" should stay what they are: a curiosity.
Not really true in my experience, I have no problems using it in Windows 11, Linux, or with my non-Apple non-Google cloud photos app.
The iPhone using it in an incredibly widespread way has made it a de facto standard.
If you're having trouble in Windows, I wonder if you're running Windows 11 or 10? Because 11 seems a lot better at supporting "modern things" considering that Microsoft has been neglecting Windows 10 for 3 years and is deprecating it this year.
What?? Maybe I'm too much in aarrrgh circles but it's all H.264 / 265...
Why would I ever care about Chrome? I can't use adblockers on Chrome, which makes the internet even less usable than it currently is. I only start up chrome to bypass cross-origin restrictions when I need to monkey-patch javascript to download things websites try to keep me from downloading (or, like when I need to scrape from a Google website... javascript scraper bots seem to evade their bot detection perfectly, just finished downloading a few hundred gigabytes of magazines off of Google Books).
Seriously, fuck Chrome. We're less than 2 years away from things being as bad as they were in the IE6 years.
I have software that won't work quite right in Safari or Firefox through a VPN every single day. Maybe it's the VPN and maybe it's the browser but it doesn't matter. We're at IE levels it's just ever so slightly more subtle this time. I'm still using alternatives but it's a battle.
Some of the warez sites I download torrents from have captchas and other javascripticles that only work on Chrome, but I've yet to see it with mainstream sites.
Fight the good fight.
Why does this myth persist?
uBlock Origin Lite works perfectly fine on Chrome, with the new Manifest v3. Blocks basically all the ads uBlock Origin did previously, including YouTube. But it uses less resources so pages load even faster.
There's an argument that adblocking could theoretically become less effective in the future but we haven't seen any evidence of that yet.
So you can very much use adblockers on Chrome.
webRequest is slower because it has to evaluate JavaScript for each request (as well as the overhead of interprocess communication), instead of the blocking being done by compiled C++ code in the same process like declarativeNetRequest does.
uBO also has a bunch of extra features like zapping that the creator explicitly chose not to include in uBO Lite, in the interests of making the Lite version as fast and resource-light as possible. For zapping, there are other extensions you can install instead if you need that.
They're two different products with two different philosophies based on two different underlying architectures. The older architecture has now gone away in Chrome, but the new one supports uBlock Origin Lite great.
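For the curious, a Manifest v3 declarativeNetRequest rule is just static data the browser engine evaluates natively; no extension JavaScript runs per request. Here is a minimal block rule, sketched as a Python dict mirroring a `rules.json` entry (the ad hostname is made up; the field names follow Chrome's extension API):

```python
import json

# One static rule: block scripts and images from a (hypothetical) ad host.
block_rule = {
    "id": 1,
    "priority": 1,
    "action": {"type": "block"},
    "condition": {
        "urlFilter": "||ads.example.test^",
        "resourceTypes": ["script", "image"],
    },
}

# rules.json is simply a list of such objects, handed to the browser once
# at install time rather than evaluated in JS on every request.
rules_json = json.dumps([block_rule], indent=2)
print(rules_json)
```

The key architectural difference from the old webRequest API is visible here: the rule is data, not code, so matching can happen in the browser's compiled networking stack.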
I struggle to understand the justification for other lossy image formats as our networks continue to get faster. From a computation standpoint, it is really hard to beat JPEG. I don't know if extra steps like intra-block spatial prediction are really worth it when we are now getting 100mbps to our smartphones on a typical day.
To answer your (rhetorical?) question: there's a long list of benefits, but I'd say HDR and storage efficiency are the two big ones I can think of. The storage savings especially are massive with large images.
Because Google's PageSpeed and Lighthouse both tell people to use WebP, and a large percentage of devs will do anything Google say in the hopes of winning better SERP placement:
- https://web.dev/articles/serve-images-webp
- https://developer.chrome.com/docs/lighthouse/performance/use...
You might be getting 100 Mbps to your smartphone; many people – yes, even within the United States – struggle to attain a quarter of that.
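To put rough numbers on the bandwidth argument (illustrative file sizes for the same photo, ignoring latency and TCP slow-start):

```python
def transfer_ms(size_bytes: int, mbps: float) -> float:
    """Milliseconds to move size_bytes over an mbps link (bandwidth only)."""
    return size_bytes * 8 / (mbps * 1_000_000) * 1000

# Hypothetical sizes: the same photo as a 500 KB JPEG vs a 350 KB WebP.
jpeg_bytes, webp_bytes = 500_000, 350_000
for mbps in (100, 25, 5):
    print(mbps, "Mbps:",
          round(transfer_ms(jpeg_bytes, mbps)), "ms vs",
          round(transfer_ms(webp_bytes, mbps)), "ms")
```

At 100 Mbps the difference is tens of milliseconds; at 5 Mbps it's a few hundred, which is noticeable but hardly transformative for a single image.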
If jpeg is loading like ass, webp probably isn't going to arrive much faster.
You have to find a balance, and unless (still) pictures are at the center of what you are doing, it is typically only a fraction of the bandwidth (and a fraction of the processing power too).
We are not talking about 100 Mbps, we downloaded JPEGs from dialup connections you know. You don't even need to go into the Mbps unless you are streaming MJPEG (and why would you do that?).
If humans are still around in a thousand years they’ll be using jpegs and they’ll still be using them a thousand years after that. When things work they have pernicious tendency to stick around.
Wheels are vastly superior to hover technologies in the crucial areas of steering and controlled braking. (For uncontrolled braking, you just cut the power to your hover fans and lift the skirts...)
It turns out to be remarkably difficult to get a hovercraft to go up an incline...
Wheels are both suspension and traction in one system.
There's no particular physical advantage to JPEG over the others mentioned; it's just currently ubiquitous.
When/if simple screens get usurped then we'll likely move on from JPEG.
I'm sure you were being a little flippant but your last sentence shows good insight. Someone said "we just need it to work" to me the other day and the "if it works there will be little impetus to improve it"-flag went off in my brain.
one place I think jxl will really shine is PBR and game textures. for cases like that, it's very common to have color+transparency+bump map+normal map, and potentially even more. bundling all of those into a single file allows for way better compression
Idk about 3d, but I'll assume someone will probably tape something together out of necessity if they haven't already.
…and yes, very flippant! But not without good reason. If we extrapolate: the popularity of jpeg, love it or hate it, will invariably necessitate its continued compatibility, which supports my previous statement. That compatibility will invariably lead to plausible hypothetical circumstances where future developers, out of laziness, ignorance, or just plain conformity to norms, choose it again, perpetuating the cycle. Short of a radical, mass-extinction-level event brought about by widespread adoption of some new technology such as what you describe, I don't see it going away anytime soon. Not to say it couldn't happen, I just feel it's highly improbable because of the contributing human factors.
That jpeg gets so many complaints is, I feel, for two reasons. One, its ubiquity, and two, that we actually see it! Some similar situations that don't get nearly as much attention but are far more pervasive are tcp/ip, bash, ntpd, ad nauseam. All old pervasive protocols so embedded as to be taken for granted, and also not able to be seen.
I’ll leave with this engineering truism that I feel should be more widely adhered to in software development, especially by UI designers: if it ain’t broke don’t fix it!
Lossy compressed images are usually not the most significant consumers of bandwidth and disk space. Videos are. That's why there have been a lot more focus on video formats than anything else, there is a lot to gain here, not so much with still images.
JPEG-XL is super-complicated because it supports plenty of things most people don't really need.
Webp is somewhat better supported because it is backed by Google, it is also what is essentially a single frame video, so if you did the hard work on video (where it matters), you get images almost for free, and it saves Google a tiny bit of bandwidth, and "a tiny bit" is huge at Google scale.
We are seeing the same thing with audio. MP3 (1991) is still extremely popular, the rest is mostly M4A/AAC (2001). We pretty much have had the perfect audio format now, which is Opus (2012) and yet, we don't even use it that much, because the others are good enough for what we make of them.
So, uh, don't get your hopes up.
https://graphicdesign.stackexchange.com/questions/115814/how...
Is missing WebP support a meme?
This becomes an issue if you're creating content about trending topics, since lots of marketing sites love using webp for every image.
Looks like a mixture of runtime and compiler flags are needed except for Safari.
The variable compression of JPEG was very important. In Photoshop you could just grab an image and choose the file size you needed and the JPEG quality would degrade to match your design constraints.
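That "pick a file size, let quality float" workflow is essentially a binary search over the quality knob. A sketch with a stubbed encoder (real code would call e.g. Pillow's JPEG writer; the stub and its size curve are made up for illustration):

```python
def smallest_quality_fitting(encode, target_bytes: int, lo: int = 1, hi: int = 100) -> int:
    """Binary-search the highest quality whose encoded output fits target_bytes."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if len(encode(mid)) <= target_bytes:
            best, lo = mid, mid + 1   # fits: try higher quality
        else:
            hi = mid - 1              # too big: try lower quality
    return best

# Stub encoder: pretend output size grows linearly with quality.
fake_encode = lambda q: b"x" * (q * 1000)

print(smallest_quality_fitting(fake_encode, 50_000))  # 50
```

Because JPEG quality vs. size is (roughly) monotonic, about seven trial encodes are enough to hit any byte budget.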
[1] https://opensource.googleblog.com/2024/04/introducing-jpegli...
Regardless, since the picture tag[0] was introduced I’ve used that for most image media by default with relevant fallbacks, with WebP as default. Also allows loading relevant sized images based on media query which is a nice bonus
[0]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
JPEG is lossy, in ways that were initially optimized for photographs. The details it loses are often not details photographs are good at providing in the first place. As the upside for losing some data, it gets to pick the data it gets to compress, and it chooses it in such a way as to minimize the size of the compressed data.
PNG is a lossless format. It's practically mandatory when you need 100% fidelity, as with icons or other graphics that are intended to have high contrast in small areas. It's able to optimize large areas of the same color very well, but suffers when colors change rapidly. It's especially unsuitable for photographs, as sensor noise (or film grain, if the source was film) creates subtle color variations that are very difficult for it to encode efficiently.
You basically never have a situation where either one would do equally well. They are for different things, and should be used as such.
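The flat-areas-compress-well, noise-doesn't asymmetry comes straight from PNG's DEFLATE backend. zlib (the same codec) shows it directly, using uniform random bytes as a stand-in for sensor noise:

```python
import random
import zlib

flat = bytes([200]) * 100_000  # a large area of a single color

random.seed(0)
noise = bytes(random.randrange(256) for _ in range(100_000))  # noise-like data

print(len(zlib.compress(flat)))   # a few hundred bytes
print(len(zlib.compress(noise)))  # barely smaller than the 100 KB input
```

Real PNG encoding adds per-row filtering on top, but the outcome is the same: solid regions nearly vanish, noisy photographic data barely shrinks at all.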
> When I came up in the IE era
In the IE era I recall, the battle was between GIF and JPEG because IE supported alphatransparent PNGs very poorly :)
> if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG
The other way around: JPEGs are “lossy” – they throw away visual information to save file size. PNGs, on the other hand, are “lossless”, and decode back to exactly the same pixels that were fed into the encoder.
JPEGS are great for photographs, where lossy compression is acceptable.
PNGs can have transparency and lossless compression, so they're great for things like rasterized vector graphics, UI-element parts of a web page, etc.
For archival purposes, where not losing details is more important than image size, PNG is better (though often TIFF is used for that use case). For images with large blocks of solid colors and sharp edges (text, line drawings), PNG is arguably better (though JPEG can be acceptable if you're careful with quality settings). If you need alpha support, go for PNG since JPEG doesn't support that.
For photograph-like images, where image size is important, JPEG is preferred over PNG.
JPEG should be used for everything else.
PNG is a lossless format, so I don't think that's possible, unless there's some specific feature that is not available in PNG.
24-bit color PNGs are lossless, to the extent that the input image is encodable in 24 bits of RGB (which is pretty much everything that's not HDR). There's no higher fidelity available for normal input images. If file size limits would force palettized PNGs, it's quite possible for a JPEG at the same file size to have higher fidelity (since it makes a different set of trade-offs, keeping color resolution but giving up spatial resolution instead); but this isn't really a common or particularly valid comparison in the PNG era, was more of an issue when comparing to GIFs.
tl;dr: Nope, PNG is perfect. JPEG can approach perfect, but never get there. Comparison is only interesting with external (e.g. file size) constraints.
JPEG2000: insanely complex and nonintuitive, especially the edge-cases and overly flexible encoding decisions
WebP: also complex, and effectively Google-proprietary
Perfect logic: let's not switch to webp because it's bad. Why is it bad? Not everyone has switched to it yet.
- https://nvd.nist.gov/vuln/detail/CVE-2023-41064
- https://nvd.nist.gov/vuln/detail/CVE-2023-41061
- https://nvd.nist.gov/vuln/detail/CVE-2023-4863
- https://citizenlab.ca/2023/09/blastpass-nso-group-iphone-zer...
Maybe it's vaguely more flexible and compresses well. I don't care. If someone uses it, I despise them.
While killing MP3 might be difficult, the vast majority of people aren't handling audio files themselves these days, so probably not hard to phase out fairly rapidly.
Even within the Western World, there are many people who like to own their digital music.
It's nice to have that consistent ubiquity, something very hard to find these days. Especially if you're entire audio library (audio books, podcasts, songs) comes from some streaming service that requires an app!
Incidentally, breakage on VBR bitstreams is buggy behaviour caused by lazy developers assuming frame sizes would never change. VBR is completely within spec, and decoders shouldn't need to do anything special to support it.
Lastly, a note on bitrate: 320 kbps CBR (the max allowed in spec) is often wasteful and pointless. In many cases, an encoder will pad out frames to conform to the requested bitrate. Indeed, tools exist that will losslessly re-encode a CBR file to VBR by removing the padding, producing a smaller file. MP3 (as good as it is) has certain problem samples that aren't fixed by throwing more bits at them. A competent encoder with proper settings, like lame with its default -V4, is transparent on most samples for most people. If you disagree, you should double-blind test yourself.
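The padding exists because at CBR every frame has a fixed size determined by the bitrate, whether or not the audio in it needs that many bits. The MPEG-1 Layer III frame-size formula makes this concrete (a sketch of the frame header math, not an MP3 parser; ignores the free-format case):

```python
def mp3_frame_bytes(bitrate_kbps: int, sample_rate_hz: int, padding: bool = False) -> int:
    """Byte size of one MPEG-1 Layer III frame (1152 samples per frame)."""
    return 144 * bitrate_kbps * 1000 // sample_rate_hz + (1 if padding else 0)

# Every 320 kbps CBR frame at 44.1 kHz occupies this many bytes,
# regardless of whether the audio content needed them:
print(mp3_frame_bytes(320, 44100))  # 1044
print(mp3_frame_bytes(128, 44100))  # 417
```

A VBR stream simply picks a different bitrate (and thus frame size) per frame, which is why stripping CBR padding can shrink a file without touching the audio data.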
Where outside the West are you seeing many people that are specifically storing audio files in mp3 (vs just streaming/ storing in a better digital format/storing physical media)?
I live in SEA and most people here are not storing their own mp3s. Most people don't have computers at all – they have budget Android phones that don't have much built-in storage. What they do have is cheap internet, so they are either using Spotify (Free)/YouTube/etc. Many people still use CDs (mostly in cars) but those aren't mp3 either.
I'm currently sticking to JPEG because, last time I tried, JPEG came out as the best format. Referencing my memory at https://chaos.social/@luc/113615076328300784
- JPEG has two advantages on slow connections: the dimensions are either stored up front so the layout doesn't jump, or maybe the renderer is better; and it loads a less-sharp version first and progressively gets sharper
- JPEG was way faster when compressing and decompressing
- on the particular photo I wanted to optimise in this instance, JPEG was also simply the best quality for a given filesize which really surprised me after 32 years of potential innovation
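The "dimensions stored up front" point is literally true of the JPEG container: width and height live in the Start-of-Frame segment near the top of the file, so a renderer can reserve layout space after reading only a few hundred bytes. A minimal marker scan (handles SOFn segments only; not a full parser, and the synthetic header below is fabricated for the demo):

```python
def jpeg_dimensions(data: bytes):
    """Return (width, height) from a JPEG's Start-of-Frame marker, or None."""
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            return None
        marker = data[i + 1]
        seglen = int.from_bytes(data[i + 2:i + 4], "big")
        if 0xC0 <= marker <= 0xCF and marker not in (0xC4, 0xC8, 0xCC):
            # SOFn segment layout: length(2) precision(1) height(2) width(2) ...
            height = int.from_bytes(data[i + 5:i + 7], "big")
            width = int.from_bytes(data[i + 7:i + 9], "big")
            return width, height
        i += 2 + seglen  # skip this marker segment
    return None

# Synthetic 32x16 header: SOI + SOF0 segment, no image data needed.
hdr = (b"\xff\xd8\xff\xc0" + (17).to_bytes(2, "big") + bytes([8])
       + (16).to_bytes(2, "big") + (32).to_bytes(2, "big") + bytes([3]) + bytes(9))
print(jpeg_dimensions(hdr))  # (32, 16)
```

Progressive rendering works on the same principle: in a progressive JPEG, early scans contain a coarse version of the whole image, with later scans refining it.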
Regarding AVIF, my n=1 experience was that it "makes smooth gradients where jpeg degrades to blotchy pixels, but at decent quality levels, jpeg preserves the grain that makes the photo look real". Gradients instead of ugliness at really small sizes can be perfect for your use-case, but note that it's also ~80 times slower at compression (80s vs. <1s)
JpegXL isn't widely supported in browsers yet, so I couldn't use it
> These days, the [JPEG] format is similar to MP3
The difference with mp3 is that, depending on the material, Opus is either a bit better or much better, but it's always a win.
You can save ~half the storage space. For speech (audio books) I use 40kbps, and for music maybe 128kbps which is probably overkill. And I delete the originals without even checking anymore if it really sounds the same, I noticed that I simply can't tell the original apart in a blind test, no matter what expensive headset setup I try
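The "~half the storage space" claim is easy to make concrete with back-of-envelope arithmetic (library size and bitrates here are illustrative):

```python
def library_gb(hours: float, kbps: int) -> float:
    """Storage in GB for constant-bitrate audio."""
    return hours * 3600 * kbps * 1000 / 8 / 1e9

hours = 500  # a hypothetical music library
print(round(library_gb(hours, 320), 1))  # 320 kbps MP3:  72.0 GB
print(round(library_gb(hours, 128), 1))  # 128 kbps Opus: 28.8 GB
```

At the 40 kbps the comment uses for speech, 500 hours of audiobooks come in at about 9 GB.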
TFA attributes it to a simple "they were first" advantage, but I think this is the real answer to "why JPEGs still rule the web": no image format is better than JPEG in the way Opus is better than MP3, a choice you never have to think about because it's always a win in either filesize or quality
That said, Opus is also annoyingly hard to get into people's minds, but I've done it and you also see major platforms from compress-once-serve-continuously video (e.g. Youtube) to VoIP (e.g. Whatsapp) switching over for all their audio applications
This submission was originally shown as [dead]. I have no idea why, I read some of the content and seems decent enough, especially in the current state of things when JPEG-XL is blocked because of AOM / Google Chrome. I vouched for it and upvoted, then somehow it is on the front page.
I wonder if dead means someone flagged it. If so, then why? If not, why is it dead?
https://kraken.io
HEIC: Have fun licensing this.
WebP: Slightly better than JPEG maybe, but only supports 1/4 chroma resolution in lossy mode so some JPEGs will always look better than the equivalent WebP.
AVIF: Better than JPEG probably, but encoders for AV1 are currently heavily biased towards blurring, even at very high bitrates. Non-Chrome browser support took a while.
You can even "losslessly" compress existing JPEGs to JPEGXL.
JPEG XL is the natural replacement for JPEG and it is perverse that Google backtracked on supporting it.
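The "1/4 chroma resolution" point above means that in 4:2:0 each chroma plane is stored at half resolution in both dimensions, so the decoder effectively sees one chroma sample per 2x2 pixel block. A toy sketch of that downsampling (real encoders use better filters than a plain box average):

```python
def subsample_420(plane):
    """Average each 2x2 block: what 4:2:0 does to a chroma plane."""
    return [
        [(plane[y][x] + plane[y][x + 1]
          + plane[y + 1][x] + plane[y + 1][x + 1]) // 4
         for x in range(0, len(plane[0]), 2)]
        for y in range(0, len(plane), 2)
    ]

# A hard chroma edge that doesn't align with the 2x2 grid gets smeared
# into an in-between value, which is why saturated edges bleed.
edge = [
    [0, 0, 0, 255],
    [0, 0, 0, 255],
]
print(subsample_420(edge))  # [[0, 127]]
```

This is why red text and other sharp, saturated edges can look visibly worse in lossy WebP than in a JPEG saved with 4:4:4 chroma.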
And decoder complexity. A software JPEG decoder is a weekend project. A hardware JPEG decoder not much more. Doing the same for arbitrary JPEG XL files is much, much more complicated. In any world where any of development cost, implementation complexity, expected code quality (especially when using first-order assumptions like constant number of defects per line of code), or decoder resources (especially for hardware implementations) are important, JPEG has serious advantages.
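To give a sense of why a baseline JPEG decoder fits in a weekend: its mathematical core is just an 8-point inverse DCT applied to the rows and columns of each block, plus Huffman decoding and dequantization. A naive, unoptimized sketch of that transform pair (the sample row is arbitrary):

```python
import math

def dct_1d(samples):
    """Forward 8-point DCT-II with JPEG's normalization."""
    out = []
    for u in range(8):
        cu = 1 / math.sqrt(2) if u == 0 else 1.0
        s = sum(samples[x] * math.cos((2 * x + 1) * u * math.pi / 16)
                for x in range(8))
        out.append(cu * s / 2)
    return out

def idct_1d(coeffs):
    """Inverse 8-point DCT: the per-row/column step of a baseline decoder."""
    out = []
    for x in range(8):
        s = sum((1 / math.sqrt(2) if u == 0 else 1.0) * coeffs[u]
                * math.cos((2 * x + 1) * u * math.pi / 16)
                for u in range(8))
        out.append(s / 2)
    return out

row = [52, 55, 61, 66, 70, 61, 64, 73]  # one row of a pixel block
restored = idct_1d(dct_1d(row))
print(all(abs(a - b) < 1e-9 for a, b in zip(row, restored)))  # True
```

A real decoder applies this along rows then columns of each 8x8 block; everything else is bitstream plumbing. There is no comparably small core to extract from JPEG XL's VarDCT mode.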
Every device in the world with iOS 17 or macOS 14 or better has JPEG XL support across the system.
This is a complete and utter non-issue. Google had even added JPEG XL support to Chromium, and then bizarrely removed it (not long before Apple fully supported JXL across all their platforms which invariably would have pushed it over the top), presumably to try to anoint WebP as the successor. Only WebP has so many disadvantages that all they did was entrench classic JPEG.
JPEG XL is unquestionably the best current next gen format for images.
This depends on the system requirements, doesn't it? Suppose you're compositing a low-safety-impact video stream with (well, under) safety-impacting information in an avionics application, and you're currently using a direct GMSL link. There's an obvious opportunity to cost-down and weight-down the system by shifting to a lightly compressed stream over an existing shared Ethernet bus, and MJPEG is a reasonable option for this application (as is H.264, and other options -- trade study needed). When considering replacing JPEG with JPEG XL in this implementation, what's your plan for providing partitioning between the "extremely high quality" but QM software implementation and the DAL components? Are you going to dedicate a core to avoid timing interference? At that point you're spending more silicon than a dedicated JPEG decoder would take. You likely already have an FPGA in the system for doing the compositing itself, but what's the area trade-off between an existing "extremely high quality" JPEG XL hardware decoder and the JPEG one that you've been using for decades?
I don't doubt that in a world where everything is an iPhone (with a token nod to Android), "someone already wrote the code once and it's good enough" is sufficient. But there's a huge field of software engineering where complexity and quality drive decision making, and JPEG XL really is much more complex than JPEG Classic Flavor.
JPEG-XL is the superior format. The only reason WebP exists is not because of natural selection, but because of nepotism (Google Chrome).
https://www.reddit.com/r/AV1/comments/ju18pz/generation_loss...
I'd love to use JPEG-XL, but I'm guessing the only way to do that is also bringing along a WASM decoder everywhere I want to use it.
standards require some politicking and money I suppose
"Build it and they will come" doesn't work for products, and it doesn't work for standards either.
If you get code merged into something like Chrome, and it's big and goes unused for a few years, at some point some security-minded person will come along and argue that your code is an unused attack surface and should be removed again.
Given how much duct tape it took at times to get various browsers to behave, I would say JS is proof of the opposite. It succeeded in an environment where standards were a mere suggestion.
The next person downloads the now JPEG'd, WebP'd, JPEG'd, WebP'd image.
Fast forward a decade and the original-quality versions of the images no longer exist, just these degraded copies.
Most decent image editing software (Photoshop, Pixelmator, etc) will let you choose what you want.
https://www.adobe.com/creativecloud/file-types/image/raster/...
But if you're not a professional it would be easy to mix up the two and slowly end up with VHS level degradation.
They can render JPEG-XL; everything else will render the fallback format like JPEG or WebP.
It's actually not more work. The user's browser automatically handles the content negotiation and only downloads the image format it understands:
macOS, iPadOS and iOS get the JPEG-XL image, devices that can handle WebP get that, and everything else gets JPEG. There are several image delivery services that will provide the best image format depending on the device that's connecting.
In HTML, use `<picture>`: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
In CSS, use `image-set()`: https://developer.mozilla.org/en-US/docs/Web/CSS/image/image....
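Server-side, you can do the same negotiation yourself off the request's `Accept` header; the "auto format" feature of image CDNs boils down to something like this (a naive sketch: real code should parse q-values and cache with `Vary: Accept`):

```python
def pick_format(accept: str) -> str:
    """Choose the best image format the client advertises in its Accept header."""
    preference = (
        ("jxl", "image/jxl"),
        ("avif", "image/avif"),
        ("webp", "image/webp"),
    )
    for fmt, mime in preference:
        if mime in accept:
            return fmt
    return "jpeg"  # the universal fallback

# A Chrome-like Accept header vs. an old client:
print(pick_format("image/avif,image/webp,image/png,*/*"))  # avif
print(pick_format("image/png,*/*"))                        # jpeg
```

Either way, the server stores one master image and transcodes (or pre-generates) the variants; the client never downloads a format it can't decode.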
Doesn't a polyfill imply more Javascript running on the device?
At some point you have to be pragmatic and meet users where they are, but that doesn't mean you have to like that Google threw their weight around in a way that only they really can.
“Technical merits” are rarely, for anything, the sole measurement of fitness for purposes.
Even for purely internal uses, internal social, cultural, and non-technical business constraints often have a real impact on what is the best choice, and when you get out into wider uses with external uses, the non-technical factors proliferate. That's just reality.
I understand the aesthetic preference to have decisions only require considering a narrow set of technical criteria which you think should be important, but you will make suboptimal decisions in the vast majority of real-world circumstances if you pretend that the actual decision before you conforms to that aesthetic ideal.
https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...
Everyone else... is fine with JPEG. And occasionally shares HEIC pictures from their iPhones.
XHTML 2 is waiting on that one... Oh well...
Only for a very brief period. (Beta 1)
* It's not actually better.
* It's patented/requires license/is owned by someone who wants a lot of royalties.
JPEG2000 is of the second variety.
That doesn't even make much sense because you lose inter-frame compressibility
Maybe there are formats that compress better or lossless, but thanks to advancements in disk space and transfer rates (I know, not everywhere but penetration and improvement will happen...) the disadvantages of JPEG can be handled and we can just enjoy a very simple file format.
In an era where enshitification lingers around every corner I'm just happy that I don't need to think about whether I have to convert _every digital picture I've ever taken_ into some next-gen format because some license runs out or whatnot. It just works. Let's enjoy that and hope it sticks around for 30 more years.
1. Lossy: JPEG fills this role;
2. Lossless: this was GIF but is now PNG; and
3. Animated: GIF.
So for a format to replace JPEG, it must bring something to the table for lossy compression. Now that JPEG is patent-free, any new format must be similarly unencumbered. And it's a real chicken-and-egg problem in getting support on platforms such that people will start using it.
I remember a similar thing happening with Youtube adding VP9 (IIRC) support as an alternative to H264, which required an MPEG LA patent license. MPEG LA also tried to cloud VP9 by saying it infringed on their patents anyway. No idea if that's true or not, but nobody wants that uncertainty.
Anyway, without total support for VP9 (which Apple devices didn't have, for example) Youtube would need to double their storage space required for videos by having both codecs. That's really hard to justify.
Same goes for images. You then need to detect and use a supported image format... or just use JPEG.
Even Github! Though the latter doesn't support IPv6 either.
Most websites break with WebP. Desktop tools choke on WebP.
It sucks, because it's a good format.
That part at least isn't true.
A funny case in point: Sora.com produces WebP outputs, but you can't turn around and use them as inputs. (Maybe they've fixed that?)
Smaller websites almost always reject them. Even within big websites, support is fractured within the product surface area. You can't use them as Reddit profile icons, for instance.
One of the most apparent issues is that a lot of thumbnailing and CDN systems don't work natively with WebP, so you have to reject WebP outright until broader support is added.
Once the WebPs are in your systems, you have to make sure everything downstream can support them... it's infectious.
Really unfortunate that we haven't been able to move past this.
WebP support was added to WordPress in 2021 with version 5.8: https://wordpress.org/documentation/wordpress-version/versio...
AVIF support in 2024, with v6.5: https://make.wordpress.org/core/2024/02/23/wordpress-6-5-add...