CommunityData:Message Walls

* Notes on Wikia Dumps [[CommunityData:Wikia Dumps]]
* Notes on the code -- Now with a diagram! [[CommunityData:Message Walls Code]]


= Robustness Checks =
* Pre-period matching placebo test
* Normal placebo test


= Task Management =


==Next Steps (Aug 24)==
*(Sneha) Add list of variables to build to the wiki
*(Salt) Verify whether the wiki dumps are solid
*(Nate) Generate 'user experience' variables for every edit in the dataset


==Next Steps (Aug 15)==
*(Salt) Edit build wikilist code to map filenames with message wall transition dates
*(Sneha) Continue preliminary analysis with 25 wikis
*(Nate) Continue investigating what dumps we can get from wikiteam


==Next Steps (Aug 1)==
*'''ASAP''' Need to use wikilist3.csv to determine which wikis we don't have - Salt (with Mako's help)
*(Salt) Make file with mapping between urls and the newly scraped dumps.  
*'''ASAP''' Download the rest and put them through wikiq and build edit weeks - Salt (with Mako's help)
*(Nate with Mako's help) Figure out what's going on in the wiki mapping code
*(Sneha) Plan for visit in September
*(Sneha) Continue preliminary analysis with 25 wikis


== Retreat Tasks ==
* Another meeting with full team to go over the results and try to make sense of them (after Sneha takes a first stab)
* Document and organize the git repository.
* Determine any other models we want to run
* Data exploration / preliminary analysis.


== Next Steps (July 18) ==
* Switch from Haythornthwaite to Reader to Leader framing (Sneha)
* (Salt with Nate's help) Add new dumps to wikilist
* knitr integration (Sneha + Nate)
* (Nate) Update wikiteam mapping.
* plots (Salt)
* (Sneha) (Using wikiq data) Check that dumps, even if valid xml, have message wall data.
* Better pictures of message walls (Sneha)
* (Sneha) create list of subsetting characteristics (inclusion criteria for Wikis) for study.
* Better explanations of why talk pages suck (Sneha)
* (Sneha) create exploratory plots for a larger set of wikis of different sizes.
* Zotero streamlining
* (Sneha) Request new dumps for missing wikis.


== Next Steps (July 11) ==
* (Sneha) (Using wikiq data) Check that dumps, even if valid xml, have message wall data.
* (Sneha) create list of subsetting characteristics (inclusion criteria for Wikis) for study.
* (Sneha) create exploratory plots for a larger set of wikis of different sizes.
* (Sneha) Request new dumps for missing wikis.
* (Salt) Download wikis available on Special:statistics. '''Done'''
* (Nate) Scrape admin and bot edits using a script from Mako. '''Done'''


== Next Steps (June 27th) ==
* (Sneha) (Using wikiq data) Check that dumps, even if valid xml, have message wall data.
* (Sneha) Take a look at namespaces 1200-1202 to understand what they mean. '''Done'''
* (Sneha) create list of subsetting characteristics (inclusion criteria for Wikis) for study.
* (Sneha) create exploratory plots for a larger set of wikis of different sizes.
* (Salt) Download wikis available on Special:statistics.
* (Salt) Request new dumps for missing wikis.
* (Nate) Scrape admin and bot edits using a script from Mako.
* (Nate) Finish identifying wikiteam mapping '''Done'''.
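The two recurring dump-validation tasks above (checking that a dump, even if valid XML, actually contains message wall data, and understanding namespaces 1200-1202) can be sketched as one streaming pass over a dump. This is an illustrative sketch, not project code: the function name is made up here, and the namespace labels in the comment follow Wikia's usual convention and should be confirmed against the <siteinfo> block of a real dump.

```python
import xml.etree.ElementTree as ET

# Per Wikia's usual convention (verify against <siteinfo> in a real dump):
# 1200 = Message Wall, 1201 = Thread, 1202 = Message Wall Greeting.
MESSAGE_WALL_NAMESPACES = {"1200", "1201", "1202"}

def count_message_wall_pages(dump_path):
    """Stream a MediaWiki XML dump and count pages per message wall namespace."""
    counts = {ns: 0 for ns in MESSAGE_WALL_NAMESPACES}
    for _, elem in ET.iterparse(dump_path):
        tag = elem.tag.rsplit("}", 1)[-1]  # drop the export-format xmlns prefix
        if tag == "ns" and elem.text in MESSAGE_WALL_NAMESPACES:
            counts[elem.text] += 1
        if tag == "page":
            elem.clear()  # free parsed pages as we go, so large dumps fit in memory
    return counts
```

Because iterparse streams the file element by element, this works on multi-gigabyte dumps without loading them whole; a dump where every count comes back zero has no message wall data regardless of whether the XML is valid.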
 
==Next Steps (June 20th)==
* (Nate) Improve wiki list by identifying wikis that turn off the feature without turning on first (Done)
* (Nate) Get <strike>muppet wiki</strike> Dr. Horrible Wiki edit weeks for Sneha (Done)
* (Nate) Do brute force mapping using revision ids and hashing texts (Done)
* (Sneha) Will play with Dr. Horrible data (Done)
* (Sneha) create list of subsetting characteristics for study
 
==Next Steps (June 13th)==
 
* Build a new dataset of dumps of the ~4800 wikis (Salt/Nate) (May take more than a week to generate all the new dumps)
* Build a msgwall version of the build_edit_weeks file from the anon_edits paper (Nate)
 
* Do analysis of alt history wiki and update (Sneha)
 
* Create list of criteria to identify wikis we want to use in this study (Sneha)
 
== Next Steps (June 6th)==
 
* Identify list of Wikis we will analyze from the tsv file. 
 
* Attempt to obtain a good dump for each of these wikis. See [[CommunityData:Wikia Dumps]] for information.
 
** This may depend on mapping between the urls in the tsv file and the dumps. Consider using HTTP redirects from the url under <siteinfo>. 
* Modify Wikiq to give an error message if the closing </mediawiki> tag is missing.
 
* Sneha to take a look at althistory data from Nate. 
 
* Nate will write a version of build_edit_weeks for the message wall project
 
* Check back next meeting Tuesday (June 13th)
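The truncation problem behind the Wikiq task above (an interrupted download yields XML that parses line by line but has no closing </mediawiki> tag) can be screened for cheaply before running Wikiq at all. A minimal sketch, assuming only that a complete MediaWiki export ends with that tag; the function name and tail size are arbitrary choices here:

```python
import os

def dump_is_complete(dump_path, tail_bytes=1024):
    """Heuristic completeness check for a MediaWiki XML dump.

    A finished export ends with a closing </mediawiki> tag, so a dump
    truncated mid-download will not have one in its final bytes.
    """
    size = os.path.getsize(dump_path)
    with open(dump_path, "rb") as f:
        f.seek(max(0, size - tail_bytes))  # only read the tail of the file
        tail = f.read()
    return b"</mediawiki>" in tail
```

Since it reads only the last kilobyte, this check costs almost nothing even on very large dumps, making it a reasonable pre-filter before a full wikiq run.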