Hello everyone, 

Just a reminder that this event will be happening in about half an hour! Here's the YouTube link again: https://www.youtube.com/watch?v=WiUfpmeJG7E

On Tue, Jun 25, 2019 at 9:14 AM Janna Layton <jlayton@wikimedia.org> wrote:
Time correction: 

The next Research Showcase will be live-streamed next Wednesday, June 26, at 11:30 AM PDT/18:30 UTC.

On Mon, Jun 24, 2019 at 4:11 PM Janna Layton <jlayton@wikimedia.org> wrote:

Hi all,


The next Research Showcase will be live-streamed this Wednesday, June 26, at 11:30 AM PST/19:30 UTC. This showcase will feature three presentations, all relating to Wikipedia blocks.


YouTube stream: https://www.youtube.com/watch?v=WiUfpmeJG7E


As usual, you can join the conversation on IRC at #wikimedia-research. You can also watch our past research showcases here: https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase


This month's presentations:


Trajectories of Blocked Community Members: Redemption, Recidivism and Departure


By Jonathan Chang, Cornell University


Community norm violations can impair constructive communication and collaboration online. As a defense mechanism, community moderators often address such transgressions by temporarily blocking the perpetrator. Such actions, however, come with the cost of potentially alienating community members. Given this tradeoff, it is essential to understand to what extent, and in which situations, this common moderation practice is effective in reinforcing community rules. In this work, we introduce a computational framework for studying the future behavior of blocked users on Wikipedia. After their block expires, they can take several distinct paths: they can reform and adhere to the rules, but they can also recidivate, or abandon the community outright. We reveal that these trajectories are tied to factors rooted both in the characteristics of the blocked individual and in whether they perceived the block to be fair and justified. Based on these insights, we formulate a series of prediction tasks aiming to determine which of these paths a user is likely to take after being blocked for their first offense, and demonstrate the feasibility of these new tasks. Overall, this work builds towards a more nuanced approach to moderation by highlighting the tradeoffs that are in play.
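For readers who want a concrete picture of the prediction tasks described above, here is a minimal sketch (in Python) that frames the post-block trajectory as a three-class classification problem. The file name, features, and classifier are illustrative assumptions, not the authors' actual pipeline:

    # Sketch: predict a first-time-blocked user's post-block path.
    # LABELS mirrors the three trajectories named in the abstract.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    LABELS = ["reform", "recidivate", "depart"]

    # Hypothetical table: one row per first-time-blocked user, with
    # pre-block activity features and the observed outcome.
    df = pd.read_csv("first_blocks.csv")
    X = df[["edits_before_block", "tenure_days",
            "block_length_days", "appealed_block"]]  # illustrative features
    y = df["outcome"]
    assert set(y) <= set(LABELS)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y)
    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    print(clf.score(X_test, y_test))  # accuracy on held-out users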



Automatic Detection of Online Abuse in Wikipedia 


By Lane Rasberry, University of Virginia


Researchers analyzed all English Wikipedia blocks prior to 2018 using machine learning. With the insights gained, they compared all unblocked English Wikipedia users against the identified characteristics of blocked users. The result was a ranked set of predictions identifying unblocked users whose conduct history resembles that of blocked users. This work models a process for using computing to aid human moderators in identifying conduct on English Wikipedia that merits a block.

Project page: https://meta.wikimedia.org/wiki/University_of_Virginia/Automatic_Detection_of_Online_Abuse

Video: https://www.youtube.com/watch?v=AIhdb4-hKBo
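For illustration only, a minimal sketch of the ranking idea: train a model on historical blocked vs. never-blocked accounts, then score unblocked accounts by predicted block probability. The file names and features are hypothetical, not the project's actual data or pipeline:

    # Sketch: rank unblocked users by similarity to blocked users.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    FEATURES = ["reverted_edit_rate", "warnings_received",
                "talk_page_incivility_score"]  # hypothetical features

    history = pd.read_csv("conduct_pre2018.csv")  # hypothetical training data
    model = LogisticRegression(max_iter=1000)
    model.fit(history[FEATURES], history["was_blocked"])

    current = pd.read_csv("unblocked_accounts.csv")  # hypothetical
    current["score"] = model.predict_proba(current[FEATURES])[:, 1]
    # The highest-scoring accounts would go to human moderators for
    # review, not to any automated blocking.
    print(current.nlargest(20, "score")[["username", "score"]])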



First Insights from Partial Blocks in Wikimedia Wikis


By Morten Warncke-Wang, Wikimedia Foundation


The Anti-Harassment Tools team at the Wikimedia Foundation released the partial block feature in early 2019. Where previously blocks on Wikimedia wikis were sitewide (users were blocked from editing an entire wiki), partial blocks makes it possible to block users from editing specific pages and/or namespaces. The Italian Wikipedia was the first wiki to start using this feature, and it has since been rolled out to other wikis as well. In this presentation, we will look at how this feature has been used in the first few months since release.
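For anyone who wants to poke at the data themselves, here is a minimal sketch that samples recent blocks on the Italian Wikipedia through the public MediaWiki API. It assumes the list=blocks module exposes a "restrictions" property listing the pages and/or namespaces a partial block covers, while sitewide blocks carry no such restrictions:

    # Sketch: count how many recent itwiki blocks are partial.
    import requests

    resp = requests.get("https://it.wikipedia.org/w/api.php", params={
        "action": "query", "list": "blocks",
        "bkprop": "user|timestamp|expiry|restrictions",
        "bklimit": "100", "format": "json", "formatversion": "2",
    })
    blocks = resp.json()["query"]["blocks"]

    # A block is partial iff it has page/namespace restrictions.
    partial = [b for b in blocks if b.get("restrictions")]
    print(f"{len(partial)} of the last {len(blocks)} blocks are partial")
    for b in partial[:5]:
        print(b.get("user"), b["expiry"], b["restrictions"])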



--
Janna Layton (she, her)
Administrative Assistant - Audiences & Technology 
Wikimedia Foundation