
Editing very high-use templates results in a timeout in BacklinkCache::partition()
Closed, ResolvedPublic

Description

When someone edits a very high-use template (one with millions of transclusions), the edit does not finish cleanly. Instead of the template page reloading on save, the user is presented with a read timeout error message. This behavior has existed for a few years now.


Version: unspecified
Severity: normal

Details

Reference
bz37731

Event Timeline

bzimport raised the priority of this task from to Needs Triage. Nov 22 2014, 12:23 AM
bzimport set Reference to bz37731.

According to Tim Starling in #wikimedia-tech just now:

[20-Jun-2012 02:28:44] Fatal error: Maximum execution time of 180 seconds exceeded at /usr/local/apache/common-local/php-1.20wmf5/includes/db/DatabaseMysql.php on line 285
that's me
and it fails in BacklinkCache->partition, very nice

This was the result of this edit: https://commons.wikimedia.org/w/index.php?diff=prev&oldid=72965783. The template has 2,783,343 transclusions according to the Toolserver's commonswiki_p right now.

Updated summary to reflect cause.
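Conceptually, BacklinkCache::partition splits a huge backlink set into page-ID ranges so that each refresh job only has to touch a bounded slice. A minimal sketch of that idea (the function name, signature, and batch size here are illustrative, not MediaWiki's actual PHP API):

```python
def partition_ids(sorted_ids, batch_size):
    """Split a sorted list of page IDs into (start, end) ranges,
    each covering at most batch_size IDs. Illustrative sketch of
    the idea behind BacklinkCache::partition, not its real code."""
    ranges = []
    for i in range(0, len(sorted_ids), batch_size):
        batch = sorted_ids[i:i + batch_size]
        # Record the first and last ID of this slice as a range boundary.
        ranges.append((batch[0], batch[-1]))
    return ranges

# Example: 10 page IDs split into batches of at most 4.
print(partition_ids([1, 3, 5, 7, 9, 11, 13, 15, 17, 19], 4))
# → [(1, 7), (9, 15), (17, 19)]
```

The timeout above happens because even computing these ranges requires scanning the multi-million-row templatelinks result set within the web request.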

afeldman wrote:

https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to BacklinkCache::getNumLinks but some related jobs are still failing in BacklinkCache::getLinks like this:

Wed Dec 26 23:40:42 UTC 2012 mw14 commonswiki BacklinkCache::getLinks 10.0.6.61 2008 MySQL client ran out of memory (10.0.6.61) SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM templatelinks,page WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from

That query returns 12384915 rows and would have to be batched.
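The batching called for here is typically done with keyset pagination: instead of fetching all rows at once, the query is repeated with WHERE tl_from > last_seen ... ORDER BY tl_from LIMIT n, resuming from the last row of the previous batch. A self-contained sketch using SQLite in Python (the table and column names mirror the failing query above; the row count and batch size are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE templatelinks (tl_from INTEGER PRIMARY KEY, tl_title TEXT)"
)
conn.executemany(
    "INSERT INTO templatelinks VALUES (?, ?)",
    [(i, "Date") for i in range(1, 1001)],  # stand-in for millions of rows
)

def fetch_in_batches(conn, title, batch_size=100):
    """Keyset pagination: each query resumes after the last tl_from seen,
    so no single result set has to fit in client memory."""
    last = 0
    while True:
        rows = conn.execute(
            "SELECT tl_from FROM templatelinks "
            "WHERE tl_title = ? AND tl_from > ? "
            "ORDER BY tl_from LIMIT ?",
            (title, last, batch_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last = rows[-1][0]  # resume point for the next batch

total = sum(len(batch) for batch in fetch_in_batches(conn, "Date"))
print(total)  # → 1000
```

Because tl_from is the sort key and is indexed, each batch is a cheap range scan, which avoids the "MySQL client ran out of memory" failure from the unbatched 12-million-row result.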

(In reply to comment #6)

> https://gerrit.wikimedia.org/r/#/c/32488/ added a limit to BacklinkCache::getNumLinks but some related jobs are still failing in BacklinkCache::getLinks like this:
>
> Wed Dec 26 23:40:42 UTC 2012 mw14 commonswiki BacklinkCache::getLinks 10.0.6.61 2008 MySQL client ran out of memory (10.0.6.61) SELECT /*! STRAIGHT_JOIN */ page_namespace,page_title,page_id FROM templatelinks,page WHERE tl_namespace = '10' AND tl_title = 'Date' AND (page_id=tl_from) ORDER BY tl_from
>
> That query returns 12384915 rows and would have to be batched.

Moved to bug 43452 since this is about users getting timeouts.