MediaWiki talk:Spam-blacklist: Difference between revisions
Revision as of 01:35, 26 April 2009
Any user can suggest URLs that they see in repeated spam here. SysOps should comment on entries that have been added to the list or declined, to avoid duplication of effort.
== regular expressions ==
ok, so i'm trying to come up w/ a regex to block any instance of a word regardless of the character that comes before it... using <code>sample</code> as an example, the closest thing i've found is <code>.+sample</code>... this blocks any character before <code>sample</code> except for <code>[ ] " < ></code> and a single or double slash (<code>/</code> or <code>//</code>), but it would block a triple slash... of course just adding <code>sample</code> to the blacklist would block http://sample.com whereas <code>.+sample</code> would not, but http://example-sample.com would not be blocked by <code>sample</code>, but would be blocked by <code>.+sample</code>.

if anyone knows how to write a regex that would block both, please advise, as the second example (http://example-sample.com) is what i'm seeing more frequently in the spam pages... of course one option is to add both instances of each word, one that is plain (<code>sample</code>), and one that is nearly wild (<code>.+sample</code>).
– Dani Banani (talk • contribs) 15:32, 11.5.2007 (MST)
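One possible fix for the question above is to make the leading prefix optional by using the <code>*</code> quantifier (zero or more characters) instead of <code>+</code> (one or more), so a single pattern covers both the plain and the prefixed URL. This is a minimal sketch of that regex behavior in Python; the actual spam-blacklist may anchor or wrap each line differently before matching, so treat it purely as an illustration, and <code>sample</code> stands in for a real blacklisted word:

```python
import re

# ".*sample" allows zero or more characters before the word,
# so it matches both "sample.com" and "example-sample.com".
pattern = re.compile(r".*sample")

urls = [
    "http://sample.com",          # plain form
    "http://example-sample.com",  # prefixed form
    "http://example.com",         # contains no "sample", should be allowed
]

for url in urls:
    blocked = bool(pattern.search(url))
    print(url, "->", "blocked" if blocked else "allowed")
```

Note that because <code>re.search</code> already scans the whole string, the bare word <code>sample</code> would match both URLs here too; the <code>.*</code> prefix only matters if the blacklist anchors each pattern to the start of the matched region.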
== del.icio.us ==
i really feel that <code>del.icio.us</code> is too broad... some users may want to include a link on their userpage, and i don't think it should be necessary for users to request their specific URL to be whitelisted... i'm not even sure why it was blacklisted in the first place... i'll give this a few days for others to respond, but then i think i will remove it...
– Dani Banani (talk • contribs) 07:46, 11.19.2007 (UTC)