
Memory problem with large progressive jpegs (on Commons and other wikis)
Closed, Resolved · Public

Description

Author: test5555

Description:
With the current configuration, JPEGs saved as progressive JPEGs ([[JPEG#JPEG compression]]) sometimes don't display if the image is large.

Since progressive compression is probably most useful for large images, it might be worth finding a solution for larger images if we want to support the format at all.


Version: unspecified
Severity: normal
URL: http://commons.wikimedia.org/wiki/Commons:Village_pump/Archive/2010Jun#Rokeby_Venus_by_Diego_Vel.C3.A1zquez

Details

Reference
bz24228

Event Timeline

bzimport raised the priority of this task to Low. Nov 21 2014, 11:10 PM
bzimport set Reference to bz24228.
bzimport added a subscriber: Unknown Object (MLST).

This would require either patching ImageMagick to use less memory when scaling such images (which may ultimately require patching libjpeg) or switching to a different image scaler.

In principle, there's no reason why downscaling an image in any format couldn't be done using space (at most) proportional only to the size of the target image, not of the source. However, this requires being able to read and decode the source image data without buffering it all in memory. For progressive JPEGs this would mean scaling each pass as it is read, and only combining them after scaling.
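
To illustrate the idea (just a sketch, not an existing tool): a streaming box-filter downscale can keep only a target-sized accumulator in memory, assuming a hypothetical read_row(y) callback that decodes one source scanline at a time. Handling progressive passes would need more bookkeeping than this shows.

    # Minimal sketch of constant-memory downscaling (assumes dst <= src).
    # read_row(y) is a hypothetical callback returning one greyscale scanline.
    def streaming_downscale(read_row, src_w, src_h, dst_w, dst_h):
        acc = [[0.0] * dst_w for _ in range(dst_h)]   # running pixel sums
        cnt = [[0] * dst_w for _ in range(dst_h)]     # samples per target pixel
        for sy in range(src_h):
            row = read_row(sy)                        # one scanline, length src_w
            dy = sy * dst_h // src_h                  # target row for this scanline
            for sx in range(src_w):
                dx = sx * dst_w // src_w
                acc[dy][dx] += row[sx]
                cnt[dy][dx] += 1
        # average the accumulated samples (a simple box filter)
        return [[acc[y][x] / cnt[y][x] for x in range(dst_w)] for y in range(dst_h)]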

I'm not aware of any existing image scaler that would do this, but then, I haven't really looked, either. If anyone knows of a program that does this and otherwise fits our needs (i.e. open source, usable from the command line, actively maintained, etc.), please tell us so we can evaluate it.

It would also be even more useful to have such a scaler for PNG images.

btw, that's bug 9497.

test5555 wrote:

As the image quality for standard thumbnails needn't be perfect, maybe these could be created by decoding only part of the file.
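
One way to approximate that with existing tools (a sketch only, not what the Wikimedia scalers actually do) is Pillow's draft() hint, which asks libjpeg for a reduced-scale decode before the full-resolution image is materialised; how much it helps for progressive files depends on the decoder. Filenames here are placeholders:

    from PIL import Image

    im = Image.open("large_progressive.jpg")
    # Hint to the JPEG decoder that a rough, reduced-scale decode (1/2, 1/4, 1/8)
    # is acceptable; this shrinks the working set during decoding.
    im.draft("RGB", (1024, 1024))
    im.thumbnail((1024, 1024))            # final resample down to thumbnail size
    im.save("thumb.jpg", "JPEG", quality=80)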

test5555 wrote:

Maybe the fix for Bug 24978 solved this too. At least the sample image above works.

We should perhaps keep track of whether a JPEG is progressive, and not give progressive JPEGs the exemption from the "max image size to scale" limit that they currently enjoy.
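
For reference, detecting this doesn't need external tools: a progressive JPEG's frame starts with an SOF2 marker (0xFFC2), while a baseline one uses SOF0 (0xFFC0). A minimal marker-scanning sketch (ignoring the rare arithmetic-coded variants):

    def is_progressive_jpeg(path):
        # Walk the JPEG marker segments until a start-of-frame marker is found.
        with open(path, "rb") as f:
            data = f.read()
        i = 2                                    # skip the SOI marker (0xFFD8)
        while i + 4 <= len(data):
            if data[i] != 0xFF:
                break                            # lost marker sync; give up
            marker = data[i + 1]
            if marker == 0xFF:
                i += 1                           # fill byte, keep scanning
            elif marker == 0xC2:
                return True                      # SOF2: progressive DCT, Huffman coding
            elif marker in (0xC0, 0xC1):
                return False                     # SOF0/SOF1: baseline / extended sequential
            elif 0xD0 <= marker <= 0xD9:
                i += 2                           # standalone marker, no length field
            else:
                i += 2 + int.from_bytes(data[i + 2:i + 4], "big")  # skip segment payload
        return False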

  • Bug 37367 has been marked as a duplicate of this bug.

esby wrote:

Fixed the case by uploading a new version saved with a non-progressive JPEG setting.

Use exiftool to determine whether the JPEG is progressive:
exiftool myimage.jpg | grep Encoding
should display either
Encoding Process : Progressive DCT, Huffman coding <-- progressive; not working on Commons for now.
Encoding Process : Baseline DCT, Huffman coding <-- non-progressive; working on Commons.

@Yannf: Do not use this thread for reporting cases; just put them in the category and PM me over IRC if you feel they need to be re-encoded.

  • This bug has been marked as a duplicate of bug 17645.