
Use dithering #21

Open
Ser-Gen opened this issue Mar 27, 2018 · 5 comments

Comments

@Ser-Gen

Ser-Gen commented Mar 27, 2018

It's important for colorful images.

Source image [174285 B]
[image: firefox-512]

After upng, web interface [69927 B]
[image: firefox-512 upng]

Result of pngquant looks better. [69932 B]
[image: firefox-512 pngquant]

@photopea
Owner

I implemented Floyd-Steinberg dithering in the past, but it was not worth it.

I think we need some compression-friendly dithering. Do you know anybody who could help us?
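For reference, the classic Floyd-Steinberg error diffusion mentioned above can be sketched as follows. This is an illustrative grayscale version (0–255 values, a fixed number of evenly spaced levels), not the actual UPNG.js or pngquant code; the function name and simplified palette are assumptions.

```javascript
// Floyd-Steinberg dithering sketch on a grayscale image.
// Each pixel is snapped to the nearest palette level and the
// quantization error is diffused to not-yet-processed neighbours.
function floydSteinberg(pixels, width, height, levels) {
  const out = Float32Array.from(pixels);
  const step = 255 / (levels - 1);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = y * width + x;
      const old = out[i];
      const snapped = Math.round(old / step) * step; // nearest level
      out[i] = snapped;
      const err = old - snapped;
      // Standard weights: 7/16 right, 3/16 down-left, 5/16 down, 1/16 down-right.
      if (x + 1 < width) out[i + 1] += err * 7 / 16;
      if (y + 1 < height) {
        if (x > 0) out[i + width - 1] += err * 3 / 16;
        out[i + width] += err * 5 / 16;
        if (x + 1 < width) out[i + width + 1] += err * 1 / 16;
      }
    }
  }
  return Uint8ClampedArray.from(out, v => Math.max(0, Math.min(255, Math.round(v))));
}
```

Because the diffused error depends on every previously visited pixel, the output is effectively noise-like, which is what makes it hard for Deflate to compress.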

@Ser-Gen
Author

Ser-Gen commented Mar 27, 2018

pngquant uses Floyd-Steinberg modified for better color handling.

I believe that dithering will always increase the file size because of its random nature.
The only purpose of this feature is to please our eyes.
Dithering can be hidden behind a flag, just like in Ps. Users will decide.

I think we need some compression-friendly dithering. Do you know anybody who could help us?

I think, we may ask @kornelski.

@photopea
Owner

photopea commented Mar 27, 2018

I mean, I made three versions of image:

  • A: 50 colors: 15 kB
  • B: 50 colors + Dithering: 23 kB
  • C: 100 colors: 22 kB

B looked as nice as C, but was slightly larger, so I thought that allowing more colors is better than dithering (both increase the file size).

I think we need dithering that consists of repetitive patterns, i.e. it should be "friendly" to the Deflate algorithm and make B only 20 kB (so it is still as nice as C, but smaller).

BTW, I also think that pngquant performs better Deflate compression (which also takes about 100x more time than UPNG.js: e.g. 30 ms vs. 3000 ms), so it can make B only 20 kB while using the same dithering as I did.
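One well-known "repetitive pattern" approach is ordered (Bayer) dithering: a small threshold matrix tiles the image, so the same byte patterns recur and stay friendly to Deflate. The sketch below is a grayscale illustration of that general idea, not a proposal from this thread; names and the 4x4 matrix choice are assumptions.

```javascript
// Ordered dithering with a 4x4 Bayer threshold matrix.
// The matrix tiles the image, so the output pattern repeats every 4 pixels.
const BAYER4 = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
];

function orderedDither(pixels, width, height, levels) {
  const out = new Uint8ClampedArray(pixels.length);
  const step = 255 / (levels - 1);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // Threshold offset in (-0.5, 0.5), repeating every 4 pixels.
      const t = (BAYER4[y % 4][x % 4] + 0.5) / 16 - 0.5;
      const v = pixels[y * width + x] + t * step;
      out[y * width + x] = Math.round(Math.max(0, Math.min(255, v)) / step) * step;
    }
  }
  return out;
}
```

Unlike error diffusion, each output pixel depends only on its own value and its (x % 4, y % 4) position, which is what keeps the result periodic and compressible.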

@Ser-Gen
Author

Ser-Gen commented Mar 27, 2018

Oh, I see.
I don't know of a dithering algorithm that can handle this case.

pngquant computes the MSE (mean square error), has min and max quality settings, and doesn't write the file if its size is too big or the quality degrades dramatically.

Maybe you'll find this thread useful:
https://encode.ru/threads/1757-Lossy-DEFLATE-lossy-PNG

And this project in particular:
https://github.com/foobaz/lossypng

@kornelski

kornelski commented Mar 28, 2018

Yes, pngquant calculates mean square error, and applies dithering only in areas with high error. This way areas that don't need dithering don't get the extra noise.

pngquant also does edge detection (similar to the Prewitt algorithm) and disables dithering on the edges. This prevents anti-aliasing from looking like fur.

In pngquant, 90% of the time is spent on extra runs of K-means. If you use --speed 10, the whole recompression (on an i7 2.3 GHz) takes ~80 ms dithered, 50 ms undithered.

(BTW, TinyPNG doesn't have its own algorithm. It's just a GUI for pngquant).
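The error-gated part of what kornelski describes can be sketched roughly like this. This is only an illustration of the idea (skip error diffusion where the palette already fits well), not pngquant's actual implementation; the function, its parameters, and the simplified two-neighbour diffusion are all assumptions.

```javascript
// Selective dithering sketch: quantize every pixel, but diffuse the
// quantization error only where it exceeds a threshold, so smooth
// well-matched areas stay noise-free.
function selectiveDither(pixels, width, height, levels, threshold) {
  const out = Float32Array.from(pixels);
  const step = 255 / (levels - 1);
  const quant = v => Math.round(v / step) * step;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = y * width + x;
      const q = quant(out[i]);
      const err = out[i] - q;
      out[i] = q;
      // Skip diffusion where the palette already fits well
      // (a real implementation would also gate on detected edges).
      if (Math.abs(err) < threshold) continue;
      if (x + 1 < width) out[i + 1] += err * 7 / 16;
      if (y + 1 < height) out[i + width] += err * 5 / 16;
    }
  }
  return Uint8ClampedArray.from(out, v => Math.max(0, Math.min(255, Math.round(v))));
}
```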
