
Systematic HTTP 503 timeouts by uploading big (<100MB) TIFF images via API action=upload, without stash, without async
Closed, Declined · Public · Bug Report

Description

MediaWiki systematically returns an HTTP 503 timeout, even though the upload itself completes successfully. For example, with this picture:
https://commons.wikimedia.org/wiki/File:Haubitzenbatterie_mittleren_Rheinbr%C3%BCcke_-_CH-BAR_-_3237358.tif


Version: 1.23.0
Severity: major

Details

Reference
bz57717

Event Timeline

bzimport raised the priority of this task to Medium. Nov 22 2014, 2:28 AM
bzimport set Reference to bz57717.
bzimport added a subscriber: Unknown Object (MLST).

For these types of bugs, please include the method of upload (i.e., whether chunked upload was used, and in particular whether the async option was used). If you're not sure, just mention which program was used to upload: UploadWizard, Special:Upload, or something else.

This is done using the API action=upload, without stash and without async. The simplest way.
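For context, a plain single-request upload of this kind can be sketched as follows. This is not code from the report; the helper name, placeholder file name, and token value are illustrative assumptions, and only the request fields are built here (the actual POST would attach the file bytes as a multipart part).

```python
# Sketch of a plain, synchronous upload via the MediaWiki action API,
# as the reporter describes: action=upload with no stash and no async
# or chunked options. Helper name and placeholder values are assumed.

def build_upload_request(filename, token):
    """Return the POST form fields for a simple action=upload call."""
    return {
        "action": "upload",
        "format": "json",
        "filename": filename,   # target file name on the wiki
        "token": token,         # CSRF token from action=query&meta=tokens
        "ignorewarnings": "1",
        # No "stash" and no "async" keys: the server processes the file
        # synchronously inside the HTTP request, which is where a large
        # TIFF can run past the front-end timeout and yield an HTTP 503.
    }

fields = build_upload_request("Example.tif", "dummy-token+\\")
# The file itself would be sent as a multipart part, e.g. with requests:
#   requests.post(API_URL, data=fields, files={"file": open(path, "rb")})
```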

Did you mean >100MB instead of <100MB in the bug summary?
If so, feel free to fix.

I mean <100MB. I think the bug is easy to reproduce with the example I have given.

(In reply to comment #4)

I mean <100MB.

Then what size range does "big TIFF images" refer to, if they are also smaller than 100MB?

I don't know; I haven't made systematic tests.

But I have uploaded thousands of TIFF files to Commons, and my feeling is that everything is OK up to ~95MB; above that limit you may get a timeout.

I think that the post-processing of the TIFF upload (checks, etc.) in some cases takes longer than the proxy/Apache timeout thresholds, and in those cases this error occurs.
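If that diagnosis is right, the chunked + async flow mentioned earlier in the thread would sidestep the timeout, since the final assembly and checks run in a background job rather than inside one HTTP request. Below is a hedged sketch of the request fields for that flow; the chunk size, helper name, and placeholder filekey are assumptions, and the real filekey would come from each server response.

```python
# Hypothetical sketch of a chunked upload finished with async=1:
# chunks are stashed one by one, then a final commit request asks the
# server to assemble and verify the file asynchronously.

CHUNK = 5 * 1024 * 1024  # 5 MiB per chunk (an assumed, not mandated, size)

def chunk_requests(filename, filesize, token):
    """Yield the form fields for each step of a chunked, async upload."""
    filekey = None
    offset = 0
    while offset < filesize:
        fields = {
            "action": "upload",
            "format": "json",
            "filename": filename,
            "token": token,
            "stash": "1",               # keep chunks in the stash until done
            "offset": str(offset),
            "filesize": str(filesize),
        }
        if filekey:
            fields["filekey"] = filekey  # returned by the previous chunk
        yield fields
        offset += CHUNK
        filekey = "<filekey from server response>"  # placeholder
    # Final commit: publish the stashed file, deferring checks to a job.
    yield {
        "action": "upload",
        "format": "json",
        "filename": filename,
        "token": token,
        "filekey": filekey,
        "async": "1",                   # assemble/verify in the background
    }
```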

Aklapper changed the subtype of this task from "Task" to "Bug Report". Feb 15 2022, 9:39 PM
Aklapper removed a subscriber: wikibugs-l-list.

This bug is too old and too lacking in details to be actionable. Possibly https://gerrit.wikimedia.org/r/c/mediawiki/core/+/1007983 may help with this, but who knows.

If the error is still happening, please file a new bug.