
block new spam only
Closed, Resolved · Public

Description

A new entry in the blacklist should not cause a spam-protection intervention on existing links. It should only block someone who tries to add a new (forbidden) link to a page.
I guess this could be solved technically by simply counting the occurrences of forbidden URLs before and after an edit of a page: if the count increases, block.

Background: when a regexp is added to the blacklist, all articles containing matched URLs become quasi-blocked (every edit trips the filter), and it would cost the blacklister too much time to delete/escape all the existing links manually.
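
A minimal sketch of the counting idea above (Python for illustration only; SpamBlacklist itself is a PHP extension, and the function names, pattern, and domain here are made up):

    import re

    # Sketch of the proposal: count blacklisted-URL matches before and
    # after the edit, and block only when the edit adds new matches.
    # Pattern and domain below are hypothetical, for illustration.

    def count_matches(patterns, text):
        """Total occurrences of blacklisted URLs in the wikitext."""
        return sum(len(re.findall(p, text)) for p in patterns)

    def should_block(patterns, old_text, new_text):
        """Block only if the edit introduces additional forbidden links;
        edits that keep the count unchanged (or remove links) pass."""
        return count_matches(patterns, new_text) > count_matches(patterns, old_text)

    patterns = [r"https?://[^\s\]]*spam-example\.com"]
    old = "See http://spam-example.com/page for details."
    assert not should_block(patterns, old, old + " Unrelated edit.")        # existing link tolerated
    assert should_block(patterns, old, old + " http://spam-example.com/x")  # new link blocked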


Version: unspecified
Severity: enhancement

Details

Reference
bz14092

Event Timeline

bzimport raised the priority of this task to Medium. Nov 21 2014, 10:14 PM
bzimport added a project: SpamBlacklist.
bzimport set Reference to bz14092.
bzimport added a subscriber: Unknown Object (MLST).

Agreed. This would also solve most of the cases from another bug, which suggested that bots should be exempted because they don't always know how to handle spam URLs correctly in this situation.

Blocking only new spam links would be beneficial. (Though we should probably find some way of warning the editor: "Hey, there are spam links on this page... Could you remove them for us so that editors don't run into issues during a revert?")

Also, it's not about the number of URLs; rather, we can check that the existing ones are all still there (and that nothing new appeared). I suggest taking a look at the ProtectSection extension. It makes good use of the EditFilter hook: specifically, it uses a regex to ensure that all the protected section tags remain in the page after an edit, so it is a perfect model for how to do this with SpamBlacklist.
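
To make that concrete, a hedged sketch of the "all still there, nothing new" check (again Python for illustration; the names are invented): instead of comparing raw counts, collect the concrete blacklisted matches and allow the edit only if every match in the new text was already present in the old text:

    import re
    from collections import Counter

    def blacklisted_urls(patterns, text):
        """Multiset of concrete blacklisted-URL matches in the wikitext."""
        return Counter(m for p in patterns for m in re.findall(p, text))

    def edit_allowed(patterns, old_text, new_text):
        """Allow the edit iff it adds no blacklisted URL that was not
        already on the page; removals and unrelated edits pass."""
        added = blacklisted_urls(patterns, new_text) - blacklisted_urls(patterns, old_text)
        return not added

Compared with plain counting, this also catches an edit that removes one forbidden link while adding a different one, which would leave the total count unchanged.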

  • This bug has been marked as a duplicate of bug 1505.