
You Were Obvious Before You Even Began to Type


You are about to reply to a thread that has been inactive for 2363 days.

Please take a moment to consider if this thread is worth bumping.


"A new paper suggests that it might be possible to identify potential trolls before they do their worst. Researchers at Stanford and Cornell have pulled out patterns of behaviour exhibited by the approximately one in 40 users of three news sites—CNN, Breitbart and IGN—who were subsequently banned for abuse. These include trolls’ unwillingness to mould their conversation to the slang of an online community; their propensity to swear; and the volume of contributions they make to a debate. Making an algorithm of these patterns, the researchers believe they can be 80% confident of identifying those likely to cause trouble within five posts online."

I think this is an excellent idea. Really. Pre-Crimes -- er, Pre-Trolling R Us!

Of course, what would really save time and money is an algorithm that predicts what I am going to write here before I write it, thereby obviating the need to, you know, actually post.

The short article is here.

 

(Of course, I know someone in command of a small army of alts who already knows what I'm going to say before I say it. He even knew I'd write this. Didn't you?)



is quite interesting this

+

basically is a predictor model that the authors say can help identify FBUs with approx. 80% accuracy. FBU = Future Banned User. A person who will get banned from a forum at some time in the future due to their behaviour

 

edit: link typo
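The kind of predictor being described can be sketched, very loosely, like this. Everything below is invented for illustration (the feature names, the weights, the toy one-word "swear lexicon"); the paper's actual model is a trained classifier over a different feature set:

```python
# Illustrative sketch only: hypothetical features and hand-picked weights,
# not the paper's trained model.
from dataclasses import dataclass

@dataclass
class Post:
    thread_id: int
    deleted: bool
    text: str

def fbu_features(posts):
    """Crude stand-ins for the signals the thread mentions: concentration
    on few threads, moderator deletions, and swearing."""
    n = len(posts)
    threads = {p.thread_id for p in posts}
    concentration = 1 - len(threads) / n          # 0.0 = every post in a new thread
    deletion_rate = sum(p.deleted for p in posts) / n
    swear_rate = sum("damn" in p.text.lower() for p in posts) / n  # toy lexicon
    return concentration, deletion_rate, swear_rate

def fbu_score(posts):
    """Weighted sum of the features; weights are made up for illustration."""
    c, d, s = fbu_features(posts)
    return 0.3 * c + 0.5 * d + 0.2 * s

# A hypothetical user's first five posts
history = [Post(1, True, "damn mods"), Post(1, True, "damn all of you"),
           Post(1, False, "still here"), Post(2, False, "ok"),
           Post(1, True, "damn")]
print(round(fbu_score(history), 3))  # → 0.6
```

A real classifier would learn the weights from labelled data rather than hard-coding them, which is where the paper's 80% figure comes from.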



Ivanova Shostakovich wrote:

I look forward to reading automatically generated forum posts and comments. It could be a whole new breed of drama.

 

"But you are 42. Why is school something you look forward to?"

 

I ran your post through "Cleverbot." That's what it returned. Although uncanny in some respects, its relevance to your own comment doesn't exactly fill one with confidence, does it?



irihapeti wrote:

is quite interesting this

+

basically is a predictor model that the authors say can help identify FBUs with approx. 80% accuracy. FBU = Future Banned User. A person who will get banned from a forum at some time in the future due to their behaviour

 

edit: link typo

Interesting, yes.

 

 

Also terrifying, on a number of counts.



LaskyaClaren wrote:


Perrie Juran wrote:

[image: minority report.jpg]

 

I saw the movie....

I calculated an 87.5% probability that you'd post that.

I calculated a 91.3% probability you'd respond this way.

 

Creatures of habit, aren't we?


OK, I lied.

The study was biased, because it only looked at those who WERE banned, not distinguishing whether they SHOULD have been banned or not, nor investigating those who SHOULD have been banned but were not.

The abstract suggested three activity indicators:

1. Concentration of efforts on a small number of threads

2. Posting irrelevantly

3. Successful at garnering responses from other users.

Well, thank goodness I only satisfy the last of these, which I would argue (particularly in a DISCUSSION forum) would NOT indicate a troll, but would be the characteristic of a participant who engenders spirited exchanges of a range of viewpoints.

The study suggests that trolls "write worse than other users over time", which suggests that the authors not only need to revisit Unclumsy English 101, but haven't identified any qualitative criteria regarding content, rather than activity.

As far as their statement: "Our analysis also reveals distinct groups of users with different levels of antisocial behavior that can change over time." goes, I entered that into a content analyser and it registered a negative meaningfulness score.

I also noted that the three principal authors of the paper had names which suggested strongly that they were ESLers.

Go figure!

PS Thank you Laskya for bringing this pile of steaming manure to our attention. I am delighted that it is the taxpayers of North America who are paying for putative academics to waste their time.

PPS Our friend Derek might be particularly unsurprised to note that the study observed those whose posts were deleted were more likely to have subsequent posts deleted. No attempt, however, is made to assess whether the deletions were justified.



Laurin Sorbet wrote:

Given the vagaries of the various incarnations of moderation in this forum, maybe the Lab already has a proprietary algorithm in place.  

If they do, it's one that has introduced a deliberate random element.



LaskyaClaren wrote:


Laurin Sorbet wrote:

Given the vagaries of the various incarnations of moderation in this forum, maybe the Lab already has a proprietary algorithm in place.  

If they do, it's one that has introduced a deliberate random element.

I am afraid that you have fallen into the trap of using the word "random" as do most teenagers today, which is to describe behaviour which is entirely rational, but for which their limited cognitive capabilities are unable to fathom a rationale.



Laurin Sorbet wrote:

Yes, I was thinking the developers may have named it, 'Erratic.'

I doubt if the developers were sufficiently literate to know what that word meant.

And certainly not to spell it correctly.

Not twice in succession anyway.

PS Hi Laurin; yes, it's who you think it is, of course.



ZoeTick wrote:

OK, I lied.

The study was biased, because it only looked at those who WERE banned, not distinguishing whether they SHOULD have been banned or not, nor investigating those who SHOULD have been banned but were not.

The abstract suggested three activity indicators:

1. Concentration of efforts on a small number of threads

2. Posting irrelevantly

3. Successful at garnering responses from other users.

Well, thank goodness I only satisfy the last of these, which I would argue (particularly in a DISCUSSION forum) would NOT indicate a troll, but would be the characteristic of a participant who engenders spirited exchanges of a range of viewpoints.

The study suggests that trolls "write worse than other users over time", which suggests that the authors not only need to revisit Unclumsy English 101, but haven't identified any qualitative criteria regarding content, rather than activity.

As far as their statement: "Our analysis also reveals distinct groups of users with different levels of antisocial behavior that can change over time." goes, I entered that into a content analyser and it registered a negative meaningfulness score.

I also noted that the three principal authors of the paper had names which suggested strongly that they were ESLers.

Go figure!

PS Thank you Laskya for bringing this pile of steaming manure to our attention. I am delighted that it is the taxpayers of North America who are paying for putative academics to waste their time.

PPS Our friend Derek might be particularly unsurprised to note that the study observed those whose posts were deleted were more likely to have subsequent posts deleted. No attempt, however, is made to assess whether the deletions were justified.

I have all sorts of problems with this study, some of which you identify, and a few that are different from yours. I think it's an interesting, if highly problematic and reductive, portrait of how "trolling" and "moderation" function, and relate to each other.

I find its conclusions -- that an algorithm might be created that would permit software to "predict" future trolls, by which it really means people likely to be banned -- highly suspect and dangerous.

 

Another interesting aspect, btw, is the conclusion that severe or heavy-handed moderation may actually increase "bad" behaviour.



LaskyaClaren wrote:

Another interesting aspect, btw, is the conclusion that severe or heavy-handed moderation may actually increase "bad" behaviour.

Their conclusion is flawed. It equates further post deletions with "worse" behaviour, which is particularly indicative of moderator bias against individuals rather than their posts. Hence my PPS.


I forget how to quote, but, "Another interesting aspect, btw, is the conclusion that severe or heavy-handed moderation may actually increase "bad" behaviour"  has been another theme in the forums.  When moderation was light, or you got the odd CALM DOWN! email or BEHAVE! post in a thread, behavior was much better.  The forum was busy, lively and filled with insightful, amusing, and sometimes naughty posters.  

Threads weren't deleted, bannings were rare, and no one's head fell off.

 



ZoeTick wrote:


LaskyaClaren wrote:

Another interesting aspect, btw, is the conclusion that severe or heavy-handed moderation may actually increase "bad" behaviour.

Their conclusion is flawed. It equates further post deletions with "worse" behaviour, which is particularly indicative of moderator bias against individuals rather than their posts. Hence my PPS.

It may well be flawed, but it also factors in moderator bias by noting that part of the mechanism at work is the response of people to "unfair" moderation:

 

 

While we present effective mechanisms for identifying and potentially weeding antisocial users out of a community, taking extreme action against small infractions can exacerbate antisocial behavior (e.g., unfairness can cause users to write worse). Though average classifier precision is relatively high (0.80), one in five users identified as antisocial are nonetheless misclassified.
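A quick back-of-envelope on what that 0.80 precision figure means at scale (the user counts below are illustrative, not from the paper):

```python
# Illustrative arithmetic only: at precision 0.80, one in five users the
# classifier flags as "antisocial" is flagged wrongly.
def misclassified(flagged_users, precision):
    """Number of wrongly flagged users among those the classifier flags."""
    return round(flagged_users * (1 - precision))

print(misclassified(1000, 0.80))  # 200 wrongly flagged out of 1000
```

So a site that auto-banned on the classifier's say-so would, by the paper's own numbers, be banning a substantial minority of innocent users.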



Laurin Sorbet wrote:

I forget how to quote, but, "Another interesting aspect, btw, is the conclusion that severe or heavy-handed moderation may actually increase "bad" behaviour" has been another theme in the forums. When moderation was light, or you got the odd CALM DOWN! email or BEHAVE! post in a thread, behavior was much better. The forum was busy, lively and filled with insightful, amusing, and sometimes naughty posters.

Threads weren't deleted, bannings were rare, and no one's head fell off.

 

Mine did. And rolled onto another avatar.

Hi Laurin, btw. :-)



LaskyaClaren wrote:
While we present effective mechanisms for identifying and potentially weeding antisocial users out of a community, taking extreme action against small infractions can exacerbate antisocial behavior (e.g., unfairness can cause users to write worse). Though average classifier precision is relatively high (0.80), one in five users identified as antisocial are nonetheless misclassified.


There seems to be a "perceived" or two missing from that.

But I am not surprised; the standard of argument in the paper rarely rises above Meaningless Statistics for Social Scientists 101.


