Community Data Science Course (Spring 2023)/Week 6 coding challenges

From CommunityData
== #1 MediaWiki API ==

Identify a movie, television, video game, or other media property that has both (a) 5 or more related articles on Wikipedia '''and''' (b) 5 or more other articles on the same topic on a [https://fandom.com Fandom.com] website. Any large entertainment franchise will definitely work, but feel free to get creative! For example, you might choose 5 Wikipedia articles about the anime Naruto and 5 articles (pages) from the naruto.fandom.com site. You may notice that Fandom.com has a top layer with staff-produced video content, but once you dig down into a particular fandom's wiki, you'll start to see a more familiar wiki-style page. For example, compare [https://spongebob.fandom.com/wiki/Help_Wanted the Fandom.com page about the SpongeBob pilot episode 'Help Wanted'] and [https://en.wikipedia.org/wiki/Help_Wanted_(SpongeBob_SquarePants) the Wikipedia page about the same pilot episode].

# First, modify the code from the first set of notebooks I used in the [[../Week 6 lecture]] to download data (and metadata) about revisions to the 5 articles you chose from Wikipedia. Be ready to share:
## (i) what proportion of those edits were made by users without accounts ("anon"),
## (ii) what proportion of those edits were marked as "minor", and
## (iii) a visualization of the total number of edits across those 5 articles over time (I didn't do this in class, but the TSV file I made would allow it).
# Now take the 5 articles you chose from the Fandom.com wiki you identified and grab revision/edit data from there. ('''''Hint:''' your Wikipedia work will give you lots of clues here. For example, the Fandom API endpoint for The Wire is https://thewire.fandom.com/api.php and, as I said in class, the Fandom API is the same as the Wikipedia API.'') Produce answers to the same three questions (i, ii, and iii) above, but using this dataset.
# Finally, choose either your Wikipedia or your Fandom dataset as the data source for a visualization that shows how each of those articles has grown in length (as measured in characters or "bytes") over time. ('''''Hint:''' you'll need to return "size" as one of the revision properties (<code>rvprop</code>) if you are not doing so already.'')
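To get you started on the revision download and the anon/minor proportions, here is a minimal standard-library sketch (the lecture notebooks may use the <code>requests</code> library instead; the function and variable names below are mine, not from the notebooks). It pages through the MediaWiki API's continuation tokens, and the same function works for Fandom by swapping the endpoint URL:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def get_revisions(title, api_url="https://en.wikipedia.org/w/api.php"):
    """Fetch every revision of one article from a MediaWiki API endpoint.

    The same call works for Fandom wikis -- pass, e.g.,
    api_url="https://thewire.fandom.com/api.php".
    """
    revisions = []
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|flags|size",  # "size" is needed for the final challenge
        "rvlimit": "max",
        "rvdir": "newer",             # oldest first, convenient for plotting over time
        "format": "json",
        "formatversion": "2",
    }
    while True:
        req = Request(f"{api_url}?{urlencode(params)}",
                      headers={"User-Agent": "CDSC-week6-example"})
        with urlopen(req, timeout=30) as resp:
            data = json.load(resp)
        revisions.extend(data["query"]["pages"][0].get("revisions", []))
        if "continue" not in data:
            return revisions
        params.update(data["continue"])  # follow the API's continuation token

def proportion(revisions, flag):
    """Share of revisions carrying a boolean flag such as "anon" or "minor"."""
    return sum(1 for r in revisions if r.get(flag)) / len(revisions)

# Example usage (hits the live API, so it is left commented out here):
# revs = get_revisions("Help Wanted (SpongeBob SquarePants)")
# print(f"anon: {proportion(revs, 'anon'):.1%}  minor: {proportion(revs, 'minor'):.1%}")
```

With <code>formatversion=2</code>, anonymous and minor revisions carry boolean <code>anon</code> and <code>minor</code> keys, which is what <code>proportion()</code> counts.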
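For the final challenge, once you have a list of revision dicts per article (each with "timestamp" and "size", which you get when "size" is in <code>rvprop</code>, ordered oldest first), one way to reshape the data for plotting is to keep each article's size at the end of each month and dump it to a long-format TSV. This is only a sketch; the function names and the TSV layout are my own choices, not from the lecture:

```python
import csv
from collections import defaultdict

def monthly_sizes(revisions_by_title):
    """Map each article title to its size in bytes at the end of each month.

    `revisions_by_title` maps a title to a list of revision dicts that
    contain "timestamp" (ISO 8601) and "size", ordered oldest first.
    """
    sizes = defaultdict(dict)
    for title, revs in revisions_by_title.items():
        for rev in revs:
            month = rev["timestamp"][:7]       # "2023-04-12T..." -> "2023-04"
            sizes[title][month] = rev["size"]  # later revisions overwrite earlier ones
    return dict(sizes)

def write_size_tsv(revisions_by_title, path="article_sizes.tsv"):
    """Write one (title, month, size) row per article-month, ready to plot."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["title", "month", "size"])
        for title, by_month in monthly_sizes(revisions_by_title).items():
            for month in sorted(by_month):
                writer.writerow([title, month, by_month[month]])
```

The long format (one row per article-month) makes it easy to draw one line per article in whatever plotting library you are using.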