
It's not clear from the article: in their "comparison of 1500 JPEG images from Wikipedia," did they just re-run the entropy coding portion, or did they requantize? (I suspect they only redid the entropy coding, but it's hard to tell.)

Whether changing the quantization method yields better encoding can't be judged purely by file size; traditionally, PSNR measurements as well as visual quality come into play.
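For reference, PSNR is a simple function of the mean squared error between the original and reconstructed pixels, so a smaller file only "wins" if this number (and the visual result) holds up. A minimal sketch of the standard formula, for 8-bit samples:

```python
import math

def psnr(original, reconstructed, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel sequences.

    PSNR = 10 * log10(MAX^2 / MSE), where MSE is the mean squared error.
    Higher is better; identical images give infinity.
    """
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_val ** 2 / mse)

# Identical pixels -> infinite PSNR; off-by-one errors -> still a high PSNR.
print(psnr([100, 120, 130], [100, 120, 130]))            # inf
print(round(psnr([100, 120, 130], [101, 119, 131]), 2))  # 48.13
```

Note that a purely lossless re-entropy-coding pass leaves the pixels bit-identical (infinite PSNR), whereas any requantization changes them, which is why the distinction in the article matters.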

Good to see some work in the area, I will need to check out what is new and novel.

That said, a company I worked for many moons ago came up with a method whereby, by reorganizing coefficients post-quantization, you could easily get about a 20% improvement in encoding efficiency, but the result was not JPEG compatible.

There is a lot that can be played with.


