MediaWiki talk:Spam-blacklist
- This page has been protected against IP/brand new user edits. Please log in if you wish to edit this page, or use the talk page to discuss removing protection.
Any user can suggest URLs that they see in repeated spam here. SysOps should comment on entries that have been added to the list or declined, to avoid duplication of effort.
Blacklist Suggestions
From the latest spam attack: (removed because they're now on the spamfilter) there are more, but listing them all seems counterproductive... Shanba 16:22, 24 May 2008 (MDT)

Not sure if/how subdomains work, but _____.yi.org needs blacklisting. somestrangeflea 17:11, 10 July 2008 (MDT)
We could add:

- http://coastline.com
- http://anchorchemicalsinc.com
--Andycyca||me||What? 15:10, 25 October 2008 (MDT)
Whitelist Suggestions
- This can include personal webpages of actual Scummers on troublesome domains, or pages whose addresses happen to contain a word that is often used in spam.
Question: I played mafia on an external site, and I made a list of games with external links to those games to make it easier for scummers to check my off-site experience. It seems smash boards dot com is blacklisted after I save it. Is there a reason it is blacklisted? Is it possible to be taken off that list?
~ Ranmaru
Would it be possible to whitelist bay12forums . com ? They have a mafia subforum (their forum software is Simple Machines (SMF), so their links are constructed differently). Their Mafia subforum is found at bay12forums . com /smf/index.php?board=20.0 without the white space. Tn5421 04:09, 10 May 2014 (UTC)
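(A note on how such a whitelist entry might look: assuming the whitelist takes one regular-expression fragment per line, as MediaWiki's spam-whitelist does, the `.` and `?` in an SMF link need escaping to be read literally. The fragment below is only an illustration, not an actual entry; here it is checked with Python's re module:)

```python
import re

# Hypothetical whitelist fragment for the SMF board link above.
# "\." and "\?" make the period and question mark literal characters
# instead of regex operators (any-character and optional-quantifier).
fragment = r"bay12forums\.com/smf/index\.php\?board="

print(bool(re.search(fragment, "http://www.bay12forums.com/smf/index.php?board=20.0")))  # True
print(bool(re.search(fragment, "http://bay12forums-fake.example.com/smf/")))             # False
```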
hajl.cheefmsn.com
A subdomain of cheefmsn? It keeps coming back no matter what.
– Andycyca (talk • contribs) 23:36, 10.8.2007
- The problem with this one is that none of the links on the spammed pages actually go to cheefmsn.com, which I assume is why the filter isn't catching them.
- somestrangeflea 00:36, 9 October 2007 (MDT)
alphabetical
should we alphabetize the list?.. just seems like it would be easier to not get duplicates if it was organized somehow
– LyingBrian (talk • contribs) 13:22, 10.18.2007 (MDT)
- ok, i went ahead & did this... if there's a reason why it shouldn't be alphabetized, feel free to revert it, adding 'casino' to the list...
– LyingBrian (talk • contribs) 13:29, 18 October 2007 (MDT)
funny thing
Hi! Some minutes ago I was updating my userpage, and when I hit "Save" a warning page came up saying that the Spam filter blocked my page. Reason? Apparently the link to my deviantart account (screenshot). The same happened with my blog link (which is vacuidaddeluz on blogspot). Maybe the filter's looking for URLs with "de". It's not a big deal to me, as I'm not whoring for traffic to my blog, but this means the filter's possibly too good.
Andycyca||me||What? 10:11, 30 October 2007 (MDT)
- yeah, i'm assuming it was the same thing... i removed the '.de' protocol from the list, so try it again now, and it should let you post it... Flay, do you think the first few protocols, i.e. .ru - .to, are a little too broad?..
– Dani Banani (talk • contribs) 12:52, 10.30.2007 (MDT)
- Yep, that's exactly what the whitelist is for. We can try whitelisting deviantart.com for now and see if that helps...
- -- Mr. Flay 00:07, 31 October 2007 (MDT)
- i guess i don't understand how the blacklist works... i understand why `.deviantart.com` was blocked, but i don't understand why `http://vacuidaddeluz.blogspot.com` was blocked... it's not just blocking addresses that exactly match `.de`; it blocks anything with `de` anywhere in the URL unless the `de` comes at the very start and doesn't follow a period... for example, it doesn't block `http://del.icio.us.com`, but it would block `http://bdelicious.com` even though the `de` isn't immediately preceded by a period... so with this additional information, i can't help but repeat my concerns that the first few items on the blacklist are too broad... it may not be that big of a problem right now, as we can add exceptions to the whitelist, but i foresee it becoming a bigger problem, so instead of continuing to increase the number of exceptions to the rule, i think we should make the items on the blacklist more specific, especially the first few...
– Dani Banani (talk • contribs) 01:58, 2 November 2007 (MDT)
- No, you're right, there was a problem with using the . in the list, it just took me a while to remember why. We need to escape all of the special characters in the list with a \ to make them read like we want to; I forgot that . itself was a special character.
- -- Mr. Flay 11:32, 4 November 2007 (MST)
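(To illustrate the escaping point, a minimal sketch using Python's re module on a couple of the URLs from this thread; the extension itself assembles its entries into one larger pattern, so details may differ:)

```python
import re

# An unescaped "." is a regex wildcard (any single character), so ".de"
# matches any character followed by "de"; "\.de" matches only a literal
# period followed by "de".
unescaped = re.compile(r".de")
escaped = re.compile(r"\.de")

for url in [
    "http://vacuidaddeluz.blogspot.com",  # "dde" in the middle: wildcard hit
    "http://www.deviantart.com",          # literal ".de" after "www"
    "http://example.de/",                 # the .de domains the entry was aimed at
]:
    print(url, bool(unescaped.search(url)), bool(escaped.search(url)))
```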
regular expressions
ok, so i'm trying to come up w/ a regex to block any instance of the word regardless of the character that comes before it... using `sample` as an example, the closest thing i've found is `.+sample`... this blocks any character before `sample` except for `[ ] " < >` and a single or double slash (`/` or `//`), but it would block a triple slash... of course just adding `sample` to the blacklist would block `http://sample.com` whereas `.+sample` would not, but `http://example-sample.com` would not be blocked by `sample`, but would be blocked by `.+sample`. if anyone knows how to write a regex that would block both, please advise, as the second example (`http://example-sample.com`) is what i'm seeing more frequently in the spam pages... of course one option is to add both instances of each word, one that is plain (`sample`), and one that is nearly wild (`.+sample`).
– Dani Banani (talk • contribs) 15:32, 11.5.2007 (MST)
- Sorry, I'm not actually very good with regex. I think the 'two entries' idea will probably be best for now, unless you've come up with something better. -- Mr. Flay 01:21, 5 March 2010 (EST)
- Wow. I've never seen a Wiki Talk page necro'd. --yawetag 10:51, 5 March 2010 (EST)
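(For reference, one regex that would block both of Dani's examples: swapping `+` for `*`. `.+` requires at least one character before `sample`, while `.*` allows zero, making the preceding part optional. The sketch below uses Python's re module and anchors each pattern at the start of the hostname; that anchoring is an assumption, but it reproduces the behavior described above, and the wiki's actual filter assembles entries differently:)

```python
import re

# Hostnames standing in for Dani's two example URLs.
hosts = ["sample.com", "example-sample.com"]

for pattern in [r"sample", r".+sample", r".*sample"]:
    print(pattern, [bool(re.match(pattern, h)) for h in hosts])

# sample   -> [True, False]  blocks only the bare domain
# .+sample -> [False, True]  demands a preceding character
# .*sample -> [True, True]   "*" makes the preceding part optional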
del.icio.us
i really feel that `del.icio.us` is too broad... some users may want to include a link on their userpage, and i don't think it should be necessary for users to request their specific URL to be whitelisted... i'm not even sure why it was blacklisted in the first place... i'll give this a few days for others to respond, but then i think i will remove it...
– Dani Banani (talk • contribs) 07:46, 11.19.2007 (UTC)
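(A side note on this entry's breadth: because the dots are unescaped, `del.icio.us` is a pattern in which each `.` matches any character, so it catches more than the literal domain. A quick sketch with Python's re module:)

```python
import re

# Each unescaped "." matches ANY character, not just a period.
loose = re.compile(r"del.icio.us")
strict = re.compile(r"del\.icio\.us")  # escaped: literal dots only

print(bool(loose.search("http://del.icio.us/user")))      # True (intended)
print(bool(loose.search("http://del-icio-us.example")))   # True (unintended)
print(bool(strict.search("http://del-icio-us.example")))  # False
```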