MediaWiki talk:Spam-blacklist
Revision as of 18:32, 4 November 2007
Any user can suggest URLs that they see in repeated spam here. Sysops should comment on entries that have been added to the list or declined, to avoid duplication of effort.
Suggestions for blacklisting
- hajl.cheefmsn.com
- opredo.info
- azresults.com
- Both added. 08:50, 17 October 2007 (MDT)
- adsmc.com
- cdq369.com
- WWW.yhht-valve.com
- byzxw.com (from that spammer you just blocked) -gorc
Specific entries that should be whitelisted
- This can include personal webpages of actual Scummers on troublesome domains, or that happen to contain a word that is often used in spam.
alphabetical
should we alphabetize the list?.. it just seems like it would be easier not to get duplicates if it were organized somehow
-- LyingBrian 13:22, 18 October 2007 (MDT)
- ok, i went ahead & did this... if there's a reason why it shouldn't be alphabetized, feel free to revert it... also adding 'casino' to the list...
-- LyingBrian 13:29, 18 October 2007 (MDT)
Funny thing
Hi! A few minutes ago I was updating my userpage, and when I hit "Save" a warning page appeared saying that the Spam filter had blocked my page. Reason? Apparently the link to my deviantart account. screenshot The same happened with my blog link (which is vacuidaddeluz on blogspot). Maybe the filter's looking for URLs with "de". It's not a big deal to me, as I'm not whoring for traffic to my blog, but this means the filter's possibly too good. Andycyca||me||What? 10:11, 30 October 2007 (MDT)
- yeah, i'm assuming it was the same thing... i removed the '.de' protocol from the list, so try it again, now, and it should let you post it... Flay, do you think the first few protocols, i.e. .ru - .to, are a little too broad?..
-- Dani Banani 12:52, 30 October 2007 (MDT)
- Yep, that's exactly what the whitelist is for. We can try whitelisting deviantart.com for now and see if that helps... -- Mr. Flay 00:07, 31 October 2007 (MDT)
- i guess i don't understand how the blacklist works... i understand why '.deviantart.com' was blocked, but i don't understand why 'http://vacuidaddeluz.blogspot.com' was blocked... it's not just blocking addresses that exactly match '.de', it blocks anything with 'de' in the URL unless the 'de' is the first thing there, with no character before it... for example, it doesn't block 'http://del.icio.us.com', but it would block 'http://bdelicious.com' even though the 'de' isn't immediately preceded by a period... so with this additional information, i can't help but repeat my concerns that the first few items on the blacklist are too broad... it may not be that big of a problem right now, as we can add exceptions to the whitelist, but i foresee it becoming a bigger problem, so instead of continuing to increase the number of exceptions to the rule, i think we should make the items on the blacklist more specific, especially the first few...
-- Dani Banani 01:58, 2 November 2007 (MDT)
- No, you're right, there was a problem with using the . in the list; it just took me a while to remember why. http://www.regular-expressions.info/quickstart.html is a good introduction to regular expressions. We need to escape all of the special characters in the list with a \ to make them read the way we want; I forgot that . itself is a special character.
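The escaping point above can be sketched in a few lines of Python. This is an illustration only: it assumes the blacklist works by dropping each entry into a larger regex prefixed with something like https?://[a-z0-9\-.]* (MediaWiki's actual wrapper may differ), but the behavior of an unescaped dot is the same either way.

```python
import re

def blocked(url, entry):
    """Hypothetical check: does a blacklist entry match this URL?

    Assumption: each entry is spliced into one combined pattern that
    first matches the protocol and any leading host characters.
    """
    pattern = r"https?://[a-z0-9\-.]*(" + entry + ")"
    return re.search(pattern, url, re.IGNORECASE) is not None

# Unescaped, the leading "." is a regex wildcard matching ANY character,
# which reproduces exactly what Dani observed:
blocked("http://bdelicious.com", ".de")              # True  ("b" + "de")
blocked("http://vacuidaddeluz.blogspot.com", ".de")  # True  ("d" + "de")
blocked("http://del.icio.us.com", ".de")             # False ("de" starts the host)

# Escaped with a backslash, it matches a literal period, as intended:
blocked("http://www.deviantart.com", r"\.de")        # True
blocked("http://bdelicious.com", r"\.de")            # False
```

This is why escaping the dot fixes the false positives without whitelisting anything: '\.de' can only match 'de' that actually follows a period.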