There are a number of bugs in which small wikis are unfairly impacted by the performance constraints of large wikis. For example, many Special pages have been disabled across all Wikimedia wikis (cf. bug 15434). A small wiki such as ch.wikipedia.org, with 151 content pages, is treated the same as a wiki with over four million content pages. This doesn't make any sense.
This situation is unacceptable. A small wiki should not see a reduced user experience because of the existence of (almost entirely unrelated) wikis that have millions of content pages. We know the approximate sizes involved, so we should be able to safely and sanely tier these wikis (and then periodically check those tiers for accuracy and appropriateness). While we all wish that every wiki could be treated equally, it isn't reasonable to punish small wikis indefinitely due to circumstances over which they have no control (e.g., an explosion in growth on a sibling project).
Some stats are available at https://wiki.toolserver.org/view/Wiki_server_assignments. There are other lists at Meta-Wiki, I believe. And I can query the *links tables for size if that's deemed necessary.
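For what it's worth, here is roughly the kind of size query I have in mind (Python, as a sketch only; the host name, the account, and the use of site_stats.ss_good_articles as the size measure rather than the *links tables are all just illustrative assumptions):

import pymysql  # assumes the replicated databases are reachable with this driver


def content_page_count(dbname):
    """Content page count for one wiki, taken from MediaWiki's running
    total in site_stats (the figure Special:Statistics reports)."""
    conn = pymysql.connect(
        host="sql.example.org",  # hypothetical replica host
        user="reader",           # hypothetical read-only account
        password="secret",
        db=dbname + "_p",        # public views on the Toolserver end in _p
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT ss_good_articles FROM site_stats")
            (count,) = cur.fetchone()
        return int(count)
    finally:
        conn.close()

Looping that over all.dblist would give the full picture.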
As far as I understand this, step one would be to make a set of groupings and then create individual wiki lists. Or perhaps just have a small.dblist and a large.dblist and add conditional statements based on those lists? (A rough sketch of the grouping step is below.)
It looks like a small.dblist may already exist? Is that a list of small wikis? (https://noc.wikimedia.org/conf/small.dblist doesn't load for me, so I can't check.)
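Roughly what I have in mind for the grouping step, again only as a sketch (the thresholds, the medium tier, and the file names are placeholders, not proposed cut-offs):

def write_tier_dblists(sizes, small_max=10000, large_min=1000000):
    """Split wikis into tiers by content page count and write one dblist
    (one database name per line) per tier.

    `sizes` maps database names (e.g. "chwiki") to content page counts,
    such as those gathered by content_page_count() above."""
    tiers = {"small.dblist": [], "medium.dblist": [], "large.dblist": []}
    for dbname, pages in sorted(sizes.items()):
        if pages <= small_max:
            tiers["small.dblist"].append(dbname)
        elif pages >= large_min:
            tiers["large.dblist"].append(dbname)
        else:
            tiers["medium.dblist"].append(dbname)
    for filename, dbnames in tiers.items():
        with open(filename, "w") as f:
            f.write("\n".join(dbnames) + "\n")

The site configuration could then branch per wiki on whether $wgDBname appears in the relevant list, and re-running the script periodically would keep the tiers current.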
Version: unspecified
Severity: enhancement