Content Integrity and Disinformation Risks Across Wikipedia Language Editions

any questions you may have before agreeing to be in the study.


The study is being conducted by PhD student [https://www.hcde.washington.edu/profiles/students/profile.php?id=445&search=zarine%20&BS=1&MS=1&PhD=1&UCD=1 Zarine Kharazian] in the Department of Human Centered Design & Engineering, and [https://mako.cc/ Dr. Benjamin Mako Hill], Associate Professor in the Department of Communication, at the University of Washington.


==Purpose of the Study==
The purpose of this study is to better understand disinformation and content integrity risks across Wikipedia language editions. Specifically, we are interested in whether some language editions are more vulnerable to systematic disinformation and ideologically motivated editing than others, and why. We are also interested in the cross-wiki monitoring mechanisms currently in place to defend against such systematic risks.


In recent years, as large, for-profit platforms such as Facebook, YouTube, and Twitter have faced increasing scrutiny for their content moderation practices, Wikipedia has been hailed as an alternative model for how platforms can combat problematic information and maintain information integrity and reliability. Existing studies of Wikipedia and problematic information, however, focus largely on English-language Wikipedia, which is the largest and best-resourced edition. Wikipedia currently consists of over 300 language editions. Because each edition functions somewhat independently, diverges on points of policy, and differs with respect to other key characteristics such as size and community capacity, language editions vary considerably in how successfully they combat misinformation.
[[File:Wiki photo.png|450px|thumb|right]]
 
The recent case of Croatian Wikipedia, which has drawn attention from both the media and the Wikimedia Foundation, demonstrates some of the difficulties that can befall small language communities on the platform. According to a [https://meta.wikimedia.org/wiki/Croatian_Wikipedia_Disinformation_Assessment-2021 2021 report] commissioned by the WMF, over the course of a decade a small group of ideologically motivated editors exploited the 2003 split of the larger Serbo-Croatian Wikipedia community into two additional national Wikipedias, one Serbian and the other Croatian, to seize control of the newly created Croatian Wikipedia. The report found that this group systematically introduced far-right bias into articles and stamped out dissenting editorial voices that attempted to introduce a neutral point of view.
 
While Croatian Wikipedia may be an extreme case, small language communities across Wikipedia may be uniquely vulnerable to particular disinformation risks, such as the kind of project capture by a cohort of ideologically motivated editors observed on Croatian Wikipedia. Although there has been extensive research on “one-off” risks to knowledge integrity – vandalism, sockpuppet editing, and “edit wars” at the article level – there has been little empirical examination of systematic knowledge integrity risks to entire Wikipedia language projects.


==Eligibility to Participate==


==Questions or Problems==
If you have questions, concerns, or complaints, contact the lead researcher, Zarine Kharazian, at zkharaz@uw.edu. The work is being supervised by [[Benjamin Mako Hill]], Associate Professor at the University of Washington (makohill@uw.edu).


If you have questions about your rights as a research subject, you may contact the UW Human Subjects Division at 206-543-0098 or hsdinfo@uw.edu.