Talk:Main Page/Archive 20080229

This page is for discussion of the DIY Wiki main page, and the Wiki as a whole.

It also seems to be a favoured place for spammers advertising dodgy pharmaceuticals, fake replica Rolexes and so on <sigh>

Previous discussions on this page have been Archived.

Spam avoidance: CAPTCHA

I think I've found what I was looking for.

http://www.mediawiki.org/wiki/Extension:ConfirmEdit

I'm off to ask Grunff if it would be possible for him to install this, if we want. Can we have a quick show of hands that we do want it?

I suggest we have it configured for the following (a combined sketch of the settings is below):

  • $wgCaptchaTriggers['addurl'] = true; -- Check on edits that add URLs

with

  • $ceAllowConfirmedEmail (allow users who have confirmed their e-mail addresses to post URL links without being harassed by the captcha)


Also:

  • $wgCaptchaTriggers['createaccount'] = true; -- Check on account creation.

And what about these also?

  • $wgCaptchaTriggers['create'] = true; -- Check on page creation.
  • $wgCaptchaTriggers['badlogin'] = true; -- Check after a failed log-in attempt.
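
Putting that together, I'd guess the LocalSettings.php additions would look something like this - a sketch going by the Extension:ConfirmEdit docs only, so the exact syntax may differ with the MW/extension versions we actually run:

  # Sketch only - based on the Extension:ConfirmEdit docs; details may
  # vary with the MediaWiki/extension version.
  require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );

  $wgCaptchaTriggers['addurl']        = true;  # check edits that add URLs
  $wgCaptchaTriggers['createaccount'] = true;  # check account creation
  $wgCaptchaTriggers['create']        = true;  # check page creation
  $wgCaptchaTriggers['badlogin']      = true;  # check after failed log-ins

  # Let users with confirmed e-mail addresses add URLs without the captcha
  $ceAllowConfirmedEmail = true;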

--John Stumbles 13:50, 8 September 2007 (BST)


Excellent idea! NT 21:59, 8 September 2007 (BST)


Yup, sounds good to me as well. Options look fine (although I have not read the descriptions of all those available yet!) --John Rumm 00:26, 9 September 2007 (BST)

Over to you then Grunff please, if you can? --John Stumbles 12:32, 9 September 2007 (BST)

Ah, ok.... took me a moment to work out what was going on (for obvious reasons!). Will do. --John Rumm 21:07, 11 September 2007 (BST)

Protect

I'm getting fed up with daily despamming, and think maybe it's time to protect all pages that get spammed until we get the captcha up and running. What do you think?

The danger there is that when a bot-originated update fails, it will presumably draw the attention of a human to locate another suitable page. The problem then just shifts, and before long the whole site is protected. We could restrict posts to registered users, but even there I notice the number of accounts has been growing recently - possibly in anticipation of that move. (It also blocks the small but useful number of edits we get from IPs.) --John Rumm 02:34, 19 October 2007 (BST)
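
For what it's worth, restricting edits to registered users is a standard LocalSettings.php switch rather than anything exotic - a sketch, assuming Grunff were willing to flip it:

  # Sketch: turn off editing and page creation for anonymous (IP) users;
  # registered accounts keep their normal rights.
  $wgGroupPermissions['*']['edit']       = false;
  $wgGroupPermissions['*']['createpage'] = false;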

Or is there a way we can get it going? Or block all edits with http in them? etc? NT 23:46, 18 October 2007 (BST)
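
Blocking edits with http in them is actually something core MediaWiki can do on its own, via $wgSpamRegex, which is matched against the text of every saved edit - a sketch, though it's a blunt instrument since it would stop legitimate links too:

  # Sketch: reject any saved edit whose text matches this pattern.
  # Blunt - this blocks *all* external links, wanted or not.
  $wgSpamRegex = '/https?:\/\//i';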

Did we get any response from Grunff on that captcha mod? Not aware of much else we can do without assistance from the server maintainer.

He said it's not compatible with the version of MW we're running, even though the docs say it should be. Upgrading to a later version of MW may solve the problem, but he can't commit the time to it at present. I came across another MW which seemed to avoid spam quite well by blacklisting certain domains (e.g. *.cn!). I don't know if that'd work for us, and whether we could set that up ourselves or if Grunff would have to do it. 'Fraid I haven't time ATM... -- John Stumbles 12:33, 19 October 2007 (BST)
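
The domain blacklisting could probably be done the same way, with a narrower $wgSpamRegex (the SpamBlacklist extension is the more structured route, but it would need installing). A sketch - the TLDs here are purely illustrative:

  # Sketch: block edits adding links under blacklisted TLDs/domains.
  # The list is an example only - extend it as offenders show up.
  $wgSpamRegex = '/https?:\/\/\S*\.(cn|ru)\b/i';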


I'm aware we'd soon have a site with all protected pages, and that we'd lose out on the occasional input - and it's not where I want the wiki to go; I'm just getting weary of despamming it day after day, knowing full well that blocking IPs has no real effect on the spammers. How do you feel about the daily despam? NT 14:02, 19 October 2007 (BST)

We can try it - but it is obviously hard to go back from. Having said that, we can relax the restrictions if we do get effective spam control in the future. I would suggest setting the block to "unregistered" users to start with and seeing if that has any effect. It may just result in spammers registering, but it might increase their hassle factor a tad. Shall we try it for one article to start with? Say the fitting TRVs to microbore one? --John Rumm 00:11, 20 October 2007 (BST)
Uncanny, that - I post the above edit and guess which article the latest dweeb had fiddled with as I was typing! OK, I've set the protection on it to registered users only now. Let's see what occurs. --John Rumm 00:17, 20 October 2007 (BST)

Stats

I did some analysis of the spam rate...

You can see the results in a spreadsheet here: [1]

So August resulted in 14 hits, 75 in September, and 118 in the first couple of weeks of October! Clearly not an acceptable trend.

--John Rumm 22:44, 20 October 2007 (BST)


Not good :( I hope something will get done so we can make contributions easier again, but for now it just has to stay alive. NT 22:53, 20 October 2007 (BST)

Results

Interesting results over the last few days... talk:main page seems to be taking the vast bulk of the spam and black-holing it quite nicely, with the rest mopped up here. My general feeling is that we ought to leave these two pages unprotected so that they can carry on doing this without attracting attention. The anticipated diversion to other target articles does not seem to have happened (yet!)

I did a bit of hunting through the logs and concluded that we have never had a genuine edit from an IP that has been used for spam (typically not even from the same country the IP is registered in). Of the IPs we have banned for short periods, a number have re-offended after the ban elapsed. I am tempted to suggest a policy of feeding an offending IP into samspade.org to see where it is based. If it looks like an unlikely country to be participating in a UK-based wiki, just ban it permanently on the first offence. If it ever turns out that someone was genuinely attempting to edit from there, they can always email to be unblocked. Any thoughts?
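
That lookup step could even be semi-automated with an ordinary whois query - a very rough sketch (the helper below is hypothetical, not part of MediaWiki, and assumes the server has a whois client installed):

  <?php
  // Hypothetical helper - a rough sketch of the "where is this IP
  // based?" check, not part of MediaWiki itself.
  function ipCountry( $ip ) {
      // Validate first so nothing unexpected reaches the shell.
      if ( !filter_var( $ip, FILTER_VALIDATE_IP ) ) {
          return null;
      }
      $out = shell_exec( 'whois ' . escapeshellarg( $ip ) );
      // RIR whois output normally includes a "country: XX" line.
      if ( $out && preg_match( '/^country:\s*([A-Za-z]{2})/mi', $out, $m ) ) {
          return strtoupper( $m[1] );
      }
      return null;
  }

  // Example: flag anything outside a shortlist of plausible countries.
  $likely  = array( 'GB', 'IE' );
  $country = ipCountry( '192.0.2.1' );  // test address (RFC 5737)
  if ( $country !== null && !in_array( $country, $likely ) ) {
      echo "Registered in $country - candidate for a permanent block\n";
  }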

--John Rumm 03:26, 29 October 2007 (GMT)

Sounds like sense to me, John. Spam is a threat to the existence of the wiki, and I'm not sure how useful a UK wiki is in Outer Mongolia, nor what contribution we could really expect from distant foreign countries.

I suspect that banned spammers will often simply come in with the same junk from another IP, and in the case of talk:main a daily trim to remove spam avoids the relatively large workload of banning an ever-growing number of spammers. And the number may just keep increasing.

Perhaps between the two approaches life will become easier: ban those that are easily banned en masse, using the methods you describe, and just trim the ones that get through that net.

Any targeted site needs a honeypot, and it looks like we've got one already working. Not sure why they pick it, but they do, and that's good enough. NT 00:02, 1 November 2007 (GMT)