A new entry in the blacklist should not trigger a spam-protection intervention on existing links. It should only block someone who tries to add a new (forbidden) link to a page.
I guess this could be solved technically by simply counting the occurrences of forbidden URLs before and after an edit: if the new revision contains more matches than the old one, block the edit.
Background: when a regexp is added to the blacklist, all articles containing matching URLs become quasi-blocked, and it would cost the blacklist maintainer far too much time to delete or escape all those links manually.
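The counting idea above could be sketched roughly as follows. This is a minimal illustration, not the actual SpamBlacklist extension code; the function name and the representation of the blacklist as a list of regexp strings are assumptions for the example:

```python
import re

def should_block(old_text: str, new_text: str, blacklist: list[str]) -> bool:
    """Sketch of the proposed check (hypothetical helper, not the
    real extension API): block the edit only if it increases the
    number of matches for some blacklisted URL pattern."""
    for pattern in blacklist:
        regex = re.compile(pattern)
        before = len(regex.findall(old_text))
        after = len(regex.findall(new_text))
        if after > before:
            return True  # the edit introduces a new forbidden link
    return False  # pre-existing links alone do not block the edit

# A forbidden link that already exists survives an unrelated edit:
old = "See http://spam.example/page for details."
new = old + "\nSome unrelated addition."
print(should_block(old, new, [r"spam\.example"]))  # False
```

Comparing counts rather than mere presence means editors can still save pages that already contain blacklisted URLs, while anyone adding a fresh forbidden link is still stopped.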
Version: unspecified
Severity: enhancement