
Improper use of WebRequest::getIntOrNull()
Closed, Resolved · Public

Description

Author: emil

Description:
includes/RawPage.php (line 16):
$smaxage = $this->mRequest->getIntOrNull( 'smaxage', $wgSquidMaxage );

This doesn't match the getIntOrNull() definition in includes/WebRequest.php, which takes only one parameter.

So if there is no 'smaxage' parameter in the web request, the $smaxage variable ends up null.
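
For context, here is a simplified sketch of the two WebRequest methods as they behaved around 1.12 (an illustration, not the actual MediaWiki source):

<?php
// Simplified sketch of the two WebRequest methods (1.12 era);
// an illustration, not the actual MediaWiki source.
class WebRequest {
    private $data;

    public function __construct( array $data ) {
        $this->data = $data;
    }

    // Takes only the parameter name. A second argument passed by a
    // caller is silently ignored, and an absent parameter yields null.
    public function getIntOrNull( $name ) {
        $val = isset( $this->data[$name] ) ? $this->data[$name] : null;
        return is_numeric( $val ) ? intval( $val ) : null;
    }

    // Accepts a default that is returned when the parameter is absent.
    public function getInt( $name, $default = 0 ) {
        $val = isset( $this->data[$name] ) ? $this->data[$name] : $default;
        return intval( $val );
    }
}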

Later in the code, if the raw page is not generated (the 'gen' request parameter is not set), $smaxage gets set to 0, which eventually causes a "Cache-Control: private" header to be sent.
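
The downstream logic looks roughly like this (a hypothetical simplification of the 1.12-era RawPage flow, not a verbatim quote of the source):

<?php
// Hypothetical simplification of the 1.12-era RawPage flow;
// not a verbatim quote of the MediaWiki source.
function sendRawCacheHeaders( $smaxage, $isGenerated ) {
    if ( !$isGenerated ) {
        // 'gen' is not set: a null $smaxage coerces to 0 here
        // instead of falling back to $wgSquidMaxage.
        $smaxage = intval( $smaxage );
    }
    if ( $smaxage == 0 ) {
        // An s-maxage of 0 becomes a header that keeps Squid (and
        // any other shared cache) from storing the response.
        header( 'Cache-Control: private' );
    } else {
        header( "Cache-Control: s-maxage=$smaxage, max-age=0" );
    }
}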

Is this the desired behavior? Wouldn't getInt( 'smaxage', $wgSquidMaxage ) be a better fit here?
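
For illustration, the suggested one-line change would look roughly like this (a sketch; the actual attached patch is not reproduced here):

// Before: getIntOrNull() ignores its second argument, so an absent
// 'smaxage' leaves $smaxage null.
$smaxage = $this->mRequest->getIntOrNull( 'smaxage', $wgSquidMaxage );

// After: getInt() honors the default, so an absent 'smaxage'
// falls back to $wgSquidMaxage.
$smaxage = $this->mRequest->getInt( 'smaxage', $wgSquidMaxage );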

Attached are a patch and the result of applying it on one wiki where I tested it.


Version: 1.12.x
Severity: trivial

Details

Reference
bz11330

Event Timeline

bzimport raised the priority of this task to Low. Nov 21 2014, 9:58 PM
bzimport added a project: MediaWiki-Parser.
bzimport set Reference to bz11330.
bzimport added a subscriber: Unknown Object (MLST).

emil wrote:

a patch

Attached:

emil wrote:

a cache hit ratio graph for an example wiki

After applying this patch, the cache hit ratio jumped from 65% to 80%.

Attached:

mini-graph.cgi.png (268×597 px, 29 KB)

It looks like the intention is to avoid a surprising s-maxage for things that aren't CSS or JS. Extra backend caching could surprise various bots and other tools trying to load pages via the raw interface.

r27456 adds some extra s-maxage forcing for CSS and JS pages that aren't requested via the 'gen' option, which may partially obviate this patch.
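
Something along these lines (an illustrative sketch of the idea, not the actual r27456 diff):

// Illustrative sketch of the idea behind r27456, not the actual diff:
// force a shared-cache lifetime for user CSS/JS served through the
// raw interface even when it wasn't requested via 'gen'.
global $wgForcedRawSMaxage;
if ( $smaxage === null &&
    ( $ctype === 'text/css' || $ctype === 'text/javascript' )
) {
    $smaxage = intval( $wgForcedRawSMaxage );
}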

Personally I think this whole thing is a big mess and probably needs a serious overhaul. :( :)

Reverted in r45251 -- this rev changed the behavior, forcing $smaxage to $wgSquidMaxage in cases where we would previously have ended up with $wgForcedRawSMaxage or 0.

Misread this. The "default" param has no effect here.

Removed in r45312.