
Configure backend web servers to send css/js files gzip-compressed
Closed, Resolved (Public)

Details

Reference
bz15154

Event Timeline

bzimport raised the priority of this task to Medium. (Nov 21 2014, 10:20 PM)
bzimport set Reference to bz15154.

It's just a rule for the Apaches.
But I think some old browsers didn't cope well with gzipped JavaScript. That's an important point to check.
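For what it's worth, a minimal sketch of such a rule with mod_deflate (assuming Apache 2.x; the exact MIME types the cluster serves are a guess):

<IfModule mod_deflate.c>
    # Compress stylesheets and scripts on the fly; mod_deflate adds
    # "Vary: Accept-Encoding" to compressed responses by itself.
    AddOutputFilterByType DEFLATE text/css
    AddOutputFilterByType DEFLATE application/javascript
    AddOutputFilterByType DEFLATE application/x-javascript
</IfModule>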

We've sent gzipped JS from the wiki for years, so if these browsers exist they're already broken. :)

(In reply to comment #2)

> We've sent gzipped JS from the wiki for years, so if these browsers exist
> they're already broken. :)

So it would be OK to compress these?

ayg wrote:

Changing component back to Wikimedia. This is not an issue that's reliably fixable in MediaWiki; it's a web server configuration change. (We could package .htaccess files with MediaWiki, I guess, but that's not reliable anyway. We could also serve the CSS/JS from a script, like one that combines all the files into one or two requests, but that's a much bigger kettle of fish than this bug.)

Setting this up in Apache for extension files should be pretty easy; however, we serve a lot of the core .css/.js out of the upload cluster. River, how straightforward would it be to configure SWS for this?

river wrote:

It's already done, although it's disabled at the moment. Search for "http-compression" in /opt/webserver7/https-ms1/config/ms1-obj.conf.
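For reference, re-enabling it would presumably mean Output directives along these lines in that file (the filter name comes from the comment above; the MIME types and tuning parameters are assumptions based on the Sun Web Server docs):

Output fn="insert-filter" filter="http-compression" type="text/css" vary="on" compression-level="6"
Output fn="insert-filter" filter="http-compression" type="application/x-javascript" vary="on" compression-level="6"

With vary="on" the server adds a Vary: Accept-Encoding header so caches keep compressed and uncompressed copies apart.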

mike.lifeguard+bugs wrote:

(In reply to comment #6)

> It's already done, although it's disabled at the moment. Search for
> "http-compression" in /opt/webserver7/https-ms1/config/ms1-obj.conf.

Disabled for some important reason, or could it be re-enabled OK?

river wrote:

(In reply to comment #7)

> Disabled for some important reason, or could it be re-enabled OK?

No particular reason, but some testing should be done on dynamic compression vs. cached, and on how much CPU it actually uses.

http://docs.sun.com/app/docs/doc/820-6599/gedgd?a=view
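Once enabled, a quick sanity check from the command line could look like this (the URL is only a placeholder for one of the served .js files):

# Does the server answer gzip-capable clients with a compressed body?
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://example.org/skins/common/wikibits.js | grep -i "content-encoding"

# Rough size comparison: compressed vs. uncompressed transfer.
curl -s -H "Accept-Encoding: gzip" http://example.org/skins/common/wikibits.js | wc -c
curl -s http://example.org/skins/common/wikibits.js | wc -c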