
Mozilla's advanced JPG encoder cuts file sizes by five per cent

by Mark Tyson on 16 July 2014, 10:15

Tags: Mozilla, Facebook

Quick Link: HEXUS.net/qacgo5


Mozilla has announced the availability of mozjpeg 2.0 via its research blog. We first heard about this project to refine a "production-quality JPEG encoder" to produce better-optimised images back in March this year. Now the v2.0 fruits of that labour have landed, and the encoder is headlined as delivering smaller file sizes, to the tune of 5 per cent on average, for both baseline and progressive JPEGs.

Back in March Mozilla told us that it had decided to start this project because "production JPEG encoders have largely been stagnant in terms of compression efficiency" since 1992, when this lossy image compression format became popular. It concluded, after much consultation with software engineers, that JPEG encoders had yet to reach their full compression potential and decided to develop a fork of libjpeg-turbo.

A typical Facebook gallery, hmmm tech and pizza...

mozjpeg 2.0 can save users, on average, about 5 per cent on their JPEG-encoded files while retaining compatibility with the vast majority of decoders in current software. Mozilla also said that, depending on the type of image, much greater compression improvements are possible.

Mozilla detailed the specific improvements implemented in mozjpeg 2.0 as follows, with a rough usage sketch after the list:

  • Trellis quantization, improving compression in both baseline and progressive JPG files with no reduction in compatibility
  • Encoder now accepts JPEG files for re-compression
  • Users can tune for PSNR, PSNR-HVS-M, SSIM, and MS-SSIM metrics
  • Can be made to generate single DC scan for best compatibility
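Because mozjpeg is a fork of libjpeg-turbo, it exposes the standard libjpeg compression API, so existing software can in principle pick up the new encoder simply by building against it. The snippet below is a minimal sketch of that standard API compressing a raw RGB buffer to a baseline JPEG; the buffer contents, image dimensions, quality setting and output filename are illustrative assumptions, and no mozjpeg-specific tuning (such as the metric options listed above) is shown.

```c
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>   /* provided by libjpeg-turbo or, assumed here, mozjpeg */

int main(void)
{
    const int width = 640, height = 480;
    /* Dummy all-black RGB image; a real application would supply pixel data. */
    unsigned char *rgb = calloc((size_t)width * height * 3, 1);
    if (!rgb) return 1;

    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    cinfo.err = jpeg_std_error(&jerr);   /* default error handling */
    jpeg_create_compress(&cinfo);

    FILE *out = fopen("out.jpg", "wb");  /* illustrative output path */
    if (!out) return 1;
    jpeg_stdio_dest(&cinfo, out);

    cinfo.image_width      = width;
    cinfo.image_height     = height;
    cinfo.input_components = 3;
    cinfo.in_color_space   = JCS_RGB;
    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, 75, TRUE);  /* assumed quality level, baseline tables */

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = &rgb[(size_t)cinfo.next_scanline * width * 3];
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);

    jpeg_destroy_compress(&cinfo);
    fclose(out);
    free(rgb);
    return 0;
}
```

The code itself is plain libjpeg usage; the assumption is that the roughly 5 per cent saving quoted above comes from mozjpeg's different encoding defaults (such as trellis quantization) rather than from any change to this API, which is why existing decoders and encoder integrations remain compatible.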

Facebook funding

We also heard that Facebook is already testing the new mozjpeg 2.0 encoding method. Stacy Kerkela, software engineering manager at Facebook, said that "Facebook supports the work Mozilla has done in building a JPEG encoder that can create smaller JPEGs without compromising the visual quality of photos". The social network's support was also evident in a contribution of $60,000 to help develop this JPG encoder beyond its current state and on to mozjpeg 3.0. Obviously, for a web company as big as Facebook, making page load times shorter and serving five per cent less data (in imagery) has the potential to reduce its operating costs.



HEXUS Forums :: 9 Comments

A whole 5% off of a small lossy file? and they aren't compatible with some decoders?

Hmmm, no wonder Firefox has hit rock bottom, they obviously don't know where to allot dev time!
shaithis
A whole 5% off of a small lossy file? and they aren't compatible with some decoders?

Hmmm, no wonder Firefox has hit rock bottom, they obviously don't know where to allot dev time!

The potential savings in image bandwidth make this a very worthwhile pursuit.
shaithis
A whole 5% off of a small lossy file? and they aren't compatible with some decoders?

Hmmm, no wonder Firefox has hit rock bottom, they obviously don't know where to allot dev time!

Facebook transfers 100s of terabytes of data per day, and most of that is images! 5% of just 100TB is 5TB, more than you probably have in your whole system. It's a lot of savings for big companies, imagine how much money Google could save by applying this to Google image search! So much saving that will indirectly benefit the users in the long run
Wozza365
Facebook transfers 100s of terabytes of data per day, and most of that is images! 5% of just 100TB is 5TB, more than you probably have in your whole system. It's a lot of savings for big companies, imagine how much money Google could save by applying this to Google image search! So much saving that will indirectly benefit the users in the long run

Valid point, but I think it's the size of the gain that is underwhelming. Google trial-runs a new codec, VP9, and introduces a large saving for high-resolution video. (I think, I can't find a link)
To be fair, if Facebook wanted to reduce their image overhead by 5%, they'd probably just dial down the JPEG quality slightly. I doubt most users would actually even notice.

Don't get me wrong, it's cool and all, but I'm not sure it's a problem that particularly needed to be solved - not least by an organisation that's struggling to stay relevant in the browser arena at the moment.