
XSS in CharInsert, forged strip markers
Closed, Resolved · Public

Description

-------- Original Message --------

Subject: XSS in <charinsert> when combined with faked strip item markers
Date: Sat, 17 Mar 2012 23:04:30 -0300
From: Bawolff Bawolff <bawolff@gmail.com>
To: security <security@wikimedia.org>

I saw r113981, and it made me wonder what sort of mischief someone
could get up to if they forged strip item markers. Anyhow, create a
page containing only the following:

<nowiki>','',''); alert("XSS",')</nowiki>

{{#tag:charinsert|{{padleft:|21|<nowiki/>}}-nowiki-00000002-QINU}}

This creates a link on the page which, once clicked, executes arbitrary
JS. Note this is slightly probabilistic, depending on how big a number
Parser::getRandomString() returns. Most of the time (I'd estimate ~75%)
it works. If it doesn't work the first time, hit preview again. Using
more complex parser functions, one could probably make code that works
99.9% of the time.

The link is ugly enough that most users would probably not click on it,
but I imagine you could use CSS to make "clicking" on the link seem to
be a legitimate thing to do.

This particular issue could be fixed by calling $data =
$parser->mStripState->unstripBoth( $data ); from the CharInsert
extension. However, I'm concerned there may be similar issues with
other extensions.
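
For illustration, a minimal sketch of what that fix might look like inside a tag hook callback; the function name and surrounding structure are assumptions, not CharInsert's actual code:

  // Hypothetical tag hook callback (names assumed for illustration).
  // The unstripBoth() call suggested above replaces any strip markers in
  // the raw input with their stored content before the extension uses it.
  function wfExampleCharInsertHook( $data, array $params, Parser $parser ) {
      // Undo both nowiki and general strip markers, so forged or
      // #tag-injected markers cannot smuggle raw text into the output.
      $data = $parser->mStripState->unstripBoth( $data );
      // Real CharInsert builds JS insert links here; a plain escaped
      // string stands in for that output in this sketch.
      return htmlspecialchars( $data );
  }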

Thanks,
Bawolff


Version: unspecified
Severity: normal


Event Timeline

bzimport raised the priority of this task to High. (Nov 22 2014, 12:14 AM)
bzimport added a project: MediaWiki-Parser.
bzimport set Reference to bz35315.
bzimport added a subscriber: Unknown Object (MLST).

I think it would be sensible to import ExtParserFunctions::killMarkers() into CoreParserFunctions and to run it on pretty much everything. It's surprising that there aren't a hundred bugs reporting "strip marker exposed in CoreParserFunction X". That might be disruptive, so maybe it can wait until 1.20 if an audit of popular tag hook extensions comes up clear.
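
For context, a minimal sketch of what a killMarkers-style helper does (assumed regex and names, not the actual ParserFunctions implementation): it simply deletes anything that looks like a \x7fUNIQ...QINU strip marker, like the one forged in the test case above.

  // Sketch only: remove anything resembling a strip marker from $text.
  // The exact marker format here is an assumption based on the report.
  function killMarkersSketch( $text ) {
      return preg_replace( '/\x7f?UNIQ[^\x7f]*?QINU\x7f?/', '', $text );
  }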

Obviously CharInsert should be patched along the lines suggested by Bawolff. We don't usually announce extension vulnerabilities, but since it's a popular extension, maybe we could include a note about it in the mail to mediawiki-announce for 1.18.2.

Created attachment 10269
Patch for CharInsert

Issue reproduced, fix tested. I'm not sure whether to commit this now with a coy commit message or to wait until closer to 1.18.2.


Created attachment 10270
Unstrip in CPF::tagObj()

I think part of the blame lies with #tag, since before #tag was introduced, the arguments to tag hook functions could not include strip markers. Calling unstrip on the #tag inputs could be said to preserve interface backwards compatibility.

Patch proposed for 1.18.2. Ran parser tests.
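
As a rough illustration of the idea (assumed structure; the actual change is the attached patch), the unstrip call would sit where #tag expands an argument, before that content is handed to the tag hook:

  // Sketch of the approach, not the attached patch itself: expand a
  // #tag argument, then unstrip it before passing it to the tag hook,
  // matching the pre-#tag behaviour described above.
  function expandTagInputSketch( Parser $parser, PPFrame $frame, $arg ) {
      $inner = $frame->expand( $arg );
      // Replace any strip markers so the tag hook receives plain text.
      return $parser->mStripState->unstripBoth( $inner );
  }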

attachment bug35315-tag-func.patch ignored as obsolete

(In reply to comment #3)

> Created attachment 10270 [details]
> Unstrip in CPF::tagObj()
>
> I think part of the blame lies with #tag, since before #tag was introduced, the
> arguments to tag hook functions could not include strip markers. Calling
> unstrip on the #tag inputs could be said to preserve interface backwards
> compatibility.
>
> Patch proposed for 1.18.2. Ran parser tests.

Hmm, that would break Wikipedia's fancy <ref> inside a <ref> template (when they do things like {{#tag:ref|Some note<ref>citation for note</ref>}}). Otherwise it seems like that would fix a lot of things.

Maybe have some way for tag extensions to specify that they do not want things unstripped, so that receiving strip markers is opt-in (kind of reminds me of the "have a method to denote that certain tag extensions should have PST run on them" bug).

Maybe removing U+007F in things like Xml::escapeTagsOnly and
MediaWiki's various other escaping functions might be a good idea. It
wouldn't fix the CharInsert issue, but it could prevent future problems
of a similar nature (I haven't actually looked to see if there's any
code that assumes strip markers can get through the escaping functions
safely, so that might not be that good an idea). Intuitively though, if I
pass something into an escaping function, usually that's the absolute
last step before output, and I would expect at that point that all
strip markers have already been unstripped and that there would be no way
for unescaped text to slip back in.
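
A minimal sketch of that idea, using a hypothetical wrapper rather than a change to any real MediaWiki function:

  // Hypothetical escaping wrapper (not existing MediaWiki code): besides
  // HTML-escaping, drop the U+007F delimiter that strip markers rely on,
  // so any marker reaching this point can no longer be unstripped back
  // into raw, unescaped output.
  function escapeAndDropMarkerDelimiter( $text ) {
      $text = str_replace( "\x7f", '', $text );
      return htmlspecialchars( $text, ENT_QUOTES );
  }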


Also of interest, the following causes the parser to go into an infinite loop:

<pre>test</pre>

{{#tag:pre|{{padleft:|21|<nowiki/>}}-pre-00000004-QINU}}

attachment bug35315-tag-func.patch ignored as obsolete

(In reply to comment #4)

> Hmm, that would break Wikipedia's fancy <ref> inside a <ref> template (when
> they do things like {{#tag:ref|Some note<ref>citation for note</ref>}}).
> Otherwise it seems like that would fix a lot of things.

Good point. And there already is a parser test that checks for that; I'm not sure how I missed it.

Committed killMarkers/markerStripCallback patch in r114231.

Suggested release announcement text:

A long-standing bug in the wikitext parser (bug 22555) was discovered to have security implications. In the presence of the popular CharInsert extension, it leads to cross-site scripting (XSS). XSS may be possible with other extensions or perhaps even the MediaWiki core alone, although this is not confirmed at this time. A denial-of-service attack (infinite loop) is also possible regardless of configuration.

Created attachment 10295
Limit the number of iterations of the unstrip loop to the total number of items to unstrip

Sounds good.

Also, might I suggest limiting the unstrip loop so that it is impossible to cause an infinite loop? (Hopefully the other change would prevent people from mucking with strip markers, but just in case, not having the potential for an infinite loop would be good too.)
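
A minimal sketch of the idea behind that patch, with assumed names and structure rather than the actual StripState code:

  // Replace markers pass by pass, but never loop more times than there
  // are stored items, so a forged marker cannot keep the loop going forever.
  // $items maps marker strings to their stored text (assumed layout).
  function unstripBoundedSketch( $text, array $items, $markerRegex ) {
      $maxPasses = count( $items );
      for ( $i = 0; $i < $maxPasses; $i++ ) {
          $replaced = preg_replace_callback( $markerRegex,
              function ( $m ) use ( $items ) {
                  // Unknown (e.g. forged) markers are dropped outright
                  // instead of being left in place.
                  return isset( $items[$m[0]] ) ? $items[$m[0]] : '';
              },
              $text
          );
          if ( $replaced === $text ) {
              break; // nothing left to unstrip
          }
          $text = $replaced;
      }
      return $text;
  }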


(In reply to comment #8)

> Created attachment 10295 [details]
> Limit the number of iterations of the unstrip loop to the total number of items
> to unstrip
>
> Sounds good.
>
> Also, might I suggest limiting the unstrip loop so that it is impossible to
> cause an infinite loop? (Hopefully the other change would prevent people from
> mucking with strip markers, but just in case, not having the potential for an
> infinite loop would be good too.)

When I asked Tim about this, he didn't think it needed to be backported, but it's fine for it to go into trunk after the other commits have been made.


For XSS, this test case seems to work just as well:

{{#tag:charinsert|<nowiki>','',''); alert("XSS",')</nowiki>}}

It doesn't need forged strip markers. But the infinite loop in comment 4 does need forged strip markers.

(In reply to comment #8)

> Created attachment 10295 [details]
> Limit the number of iterations of the unstrip loop to the total number of items
> to unstrip

This still left quite a bit of scope for algorithmic DoS. The iterative algorithm that you're hacking was an inefficient hack to start with, since it scans the entire expanded text at every recursion depth, giving a running time proportional to the depth multiplied by the size. I had a go at doing a proper job of it at https://gerrit.wikimedia.org/r/#change,5596
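
For illustration, a hedged sketch of the single-pass alternative (assumed names and a made-up recursion cap; the real rework is in the linked Gerrit change): each stored item is scanned once, at the point where its marker is resolved, instead of rescanning the whole expanded text at every depth.

  // Resolve markers by recursing into the stored items themselves, so the
  // total work is proportional to the amount of stored text rather than
  // depth multiplied by the size of the full document.
  function unstripRecursiveSketch( $text, array $items, $markerRegex, $depth = 0 ) {
      if ( $depth > 20 ) {
          return $text; // assumed safety cap on nesting depth
      }
      return preg_replace_callback( $markerRegex,
          function ( $m ) use ( $items, $markerRegex, $depth ) {
              if ( !isset( $items[$m[0]] ) ) {
                  return ''; // forged or unknown marker: drop it
              }
              // Recurse only into this item's stored text.
              return unstripRecursiveSketch( $items[$m[0]], $items, $markerRegex, $depth + 1 );
          },
          $text
      );
  }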
