In DefaultSettings.php you should document this better:
/**
 * Robot policies for namespaces
 * e.g. $wgNamespaceRobotPolicies = array( NS_TALK => 'noindex' );
 */
$wgNamespaceRobotPolicies = array();
else one is hard pressed to figure out what values one can use
other than noindex. The array syntax can be figured out from a nearby
hint about Language.php, but not the available policy values other
than noindex, nor their effects.
Document them here or say what other .php file to see.
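For instance, the comment could enumerate the recognized values. A sketch of what the documentation might say, assuming the standard robots meta tag keywords (index/noindex, follow/nofollow, and comma-separated combinations) are what is accepted; this would need to be verified against the code:

```php
# Hypothetical documentation example; the exact set of accepted
# values should be confirmed in the MediaWiki source.
$wgNamespaceRobotPolicies = array(
    NS_TALK     => 'noindex',           # hide talk pages from search indexes
    NS_USER     => 'noindex,nofollow',  # hide user pages, don't follow links
    NS_CATEGORY => 'index,follow',      # explicitly allow (usually the default)
);
```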
Don't just hope the user will use a search engine or check around
meta.wikimedia.org, as that makes getting this simple answer dependent
on having a working external network connection, even before
production time (i.e., while merely examining and testing the
software).
Also, mention parts of http://meta.wikimedia.org/wiki/Robots.txt, such as:

    The only way to keep a URL out of Google's index is to let Google
    crawl the page and see a meta tag specifying robots="noindex".
    Although this meta tag is already present on the edit page HTML
    template, Google does not spider the edit pages (because they are
    forbidden by robots.txt) and therefore does not see the meta tag.
Wait, SpecialPage.php already has
$wgOut->setRobotPolicy( "noindex,nofollow" );
so users will wonder why tinkering with NS_SPECIAL is futile.
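A note in the DefaultSettings.php comment could forestall that confusion. A sketch, with wording that is mine rather than existing documentation:

```php
/**
 * Robot policies for namespaces
 * e.g. $wgNamespaceRobotPolicies = array( NS_TALK => 'noindex' );
 * Note: special pages always send "noindex,nofollow" via
 * $wgOut->setRobotPolicy() in SpecialPage.php, so an entry for
 * NS_SPECIAL here has no effect.
 */
$wgNamespaceRobotPolicies = array();
```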
(Should be noted in bug 8338.)
OK, enough for one day.
Version: 1.7.x
Severity: minor
OS: Linux
Platform: PC