Iterating the language links of a Commons page yields the Commons site itself for every link:

>>> import pywikibot as p
>>> s = p.Site('commons', 'commons')
>>> pg = p.Page(s, 'New York City')
>>> for i in pg.iterlanglinks():
...     print(i.site)
...
commons:commons
<snip>
Several things work together to cause this:
Page.iterlanglinks() calls Site.pagelanglinks(), which does:
yield pywikibot.Link.langlinkUnsafe(linkdata['lang'], linkdata['*'], source=self)
In Link.langlinkUnsafe(), there is:
link._site = pywikibot.Site(lang, source.family.name)
Now, unfortunately for the commons family:

>>> p.Site('en', 'commons')
Site("commons", "commons")
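To make the collapse concrete, here is a minimal self-contained model (not pywikibot itself; the Family/Site names mirror pywikibot's but the implementation is a sketch) of how a single-site family normalizes every requested language code to its one and only code:

```python
class Family:
    """Sketch of a single-site family like commons."""
    name = 'commons'
    codes = {'commons'}

    def normalize_code(self, code):
        # A single-site family has nowhere else to go: any code
        # outside its own set is coerced to the one valid code.
        return code if code in self.codes else 'commons'


class Site:
    def __init__(self, code, family):
        self.family = family
        self.code = family.normalize_code(code)

    def __repr__(self):
        return 'Site("%s", "%s")' % (self.code, self.family.name)


fam = Family()
print(Site('en', fam))  # Site("commons", "commons")
print(Site('de', fam))  # Site("commons", "commons")
```

So when langlinkUnsafe builds Site(lang, source.family.name) with source.family.name == 'commons', the language information is silently discarded.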
Another issue is that https://commons.wikimedia.org/w/api.php?action=query&titles=New%20York%20City&prop=langlinks (the actual API query we make) only returns bare language codes (e.g. "en"), not full database names (e.g. "enwiki"), so the client has no direct way to tell which family the link targets.
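One possible direction (a hedged sketch, not the project's chosen fix; all names below are illustrative, not pywikibot API) would be to fall back to a default multilingual family when the returned language code is not a valid code of the source family:

```python
# Assumption: langlinks on Commons pages point at Wikipedias.
DEFAULT_FAMILY = 'wikipedia'


def resolve_langlink_site(lang, source_family, family_codes):
    """Return a (code, family) pair for a langlink.

    If the code belongs to the source family, stay in that family;
    otherwise fall back to DEFAULT_FAMILY instead of letting a
    single-site family swallow the code.
    """
    if lang in family_codes.get(source_family, set()):
        return lang, source_family
    return lang, DEFAULT_FAMILY


family_codes = {'commons': {'commons'}, 'wikipedia': {'en', 'de', 'fr'}}
print(resolve_langlink_site('en', 'commons', family_codes))
# ('en', 'wikipedia')
```

This only resolves the family ambiguity heuristically; a real fix would likely need the interwiki map or site matrix to resolve codes authoritatively.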
Version: core-(2.0)
Severity: normal
Whiteboard: backportable