DS4UX (Spring 2016)/Day 4 lecture



<!-- FROM http://wiki.communitydata.cc/Community_Data_Science_Workshops_(Spring_2015)/Day_3_Lecture
== Material for the lecture ==
For the lecture, you will need two files. Download both of these to your computer by using right or control click on the link and then using ''Save as'' or ''Save link as''. Keep track of where you put the files.
* http://mako.cc/teaching/2015/cdsw-spring/harrypotter-wikipedia-cdsw.zip
* http://communitydata.cc/~mako/hp_wiki.tsv
== Overview of the day ==
* Lecture
** Our philosophy around data visualization
** Introduce some new programming tools!
** We're going to walk through some analysis of edits to Harry Potter in Wikipedia, start to finish
** We'll focus on manipulating data in Python
** Visualizing things in Google Docs
* Project based work
** More [[Harry Potter on Wikipedia]] project (or your own topic) on doing analysis using Google Docs
** [[Matplotlib]]
** Civic Data - More interactive working on projects
== Lecture outline ==
'''Step 1: Pre-Requisites'''
* My philosophy about data analysis: ''use the tools you have''
* Four things in Python I have to teach you:
** while loops
*** infinite loops
*** loops with a greater than or less than
** break / continue
** "\t".join()
** defining your own functions with <code>def foo(argument):</code>
'''Step 2: Walking through a Program'''
* Walk-through of <code>get_hpwp_dataset.py</code>
* Look at dataset with <code>more</code> and/or in spreadsheet
'''Step 3: Loading Data Back In'''
* Load data into Python
** review of opening files
*** we can also open them for reading with <code>open('file', 'r', encoding="utf-8")</code>
** csv.DictReader()
* Basic counting: <code>hpwp-minor.py</code>
** Answer question: ''What proportion of edits to Wikipedia Harry Potter articles are minor?''
*** Count the number of minor edits and calculate proportion
* Looking at time series data <code>hpwp-trend.py</code>
** "Bin" data by day to generate the trend line
* Exporting and visualizing data
** Export dataset on edits over time
** Export dataset on articles over users
** Load data into Google Docs
-->
[[Category:DS4UX (Spring 2016)]]


In which you learn how to use Python and web APIs to meet the likes of her!

Lecture Outline

Introduction and context
  • You can write some tools in Python now. Congratulations!
  • Today we'll learn how to find/create data sets
  • Next week we'll get into data science (asking and answering questions)


Outline
  • What is an API?
  • How do we use one to fetch interesting datasets?
  • How do we write programs that use the internet?
  • How can we use the placekitten API to fetch kitten pictures?
  • Introduction to structured data (JSON)
  • How do we use APIs in general?


What is a (web) API?
  • API: a structured way for programs to talk to each other (aka an interface for programs)
  • Web APIs: like a website your programs can visit (a web API is to your program what a website is to you)


How do we use an API to fetch datasets?

Basic idea: your program sends a request, the API sends data back

  • Where do you direct your request? The site's API endpoint.
  • How do I write my request? Put together a URL; it will be different for different web APIs.
    • Check the documentation, look for code samples
  • How do you send a request?
    • Python has modules you can use, like requests (they make HTTP requests)
  • What do you get back?
    • Structured data (usually in the JSON format)
  • How do you understand (i.e. parse) the data?
    • There's a module for that! (see the sketch below)
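
A minimal sketch of this whole round trip, using the requests module; the URL and field names here are invented for illustration, not a real API:

    import requests

    # 1. build the request URL for a (hypothetical) API endpoint
    url = "https://api.example.com/v1/articles?search=kittens"

    # 2. send the request; the API answers with structured data
    response = requests.get(url)

    # 3. parse the JSON that came back into Python lists and dictionaries
    data = response.json()
    print(data)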


How do we write Python programs that make web requests?

To use APIs to build a dataset we will need:

  • all of our tools from the last session: variables, etc.
  • the ability to open URLs on the web
  • the ability to create custom URLs
  • the ability to save to files
  • the ability to understand (i.e., parse) the JSON data that APIs usually give us


New programming concepts (a short sketch follows this list)
  • interpolate variables into a string using % and %(name)s
  • requests
  • open files and write to them
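
A small sketch that puts the three new concepts together; api.example.com and its query parameter are placeholders, not a real endpoint:

    import requests

    # interpolate a variable into a string using % and %(name)s
    page_title = "Python"
    url = "https://api.example.com/pages?title=%(title)s" % {"title": page_title}

    # make the request over the web
    response = requests.get(url)

    # open a file for writing and save what came back
    output_file = open("output.json", "w", encoding="utf-8")
    output_file.write(response.text)
    output_file.close()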


How do we use an API to fetch kitten pictures?

placekitten.com

  • API that takes specially crafted URLs and gives back appropriately sized pictures of kittens
  • Exploring placekitten in a browser:
    • visit the API documentation
    • kittens of different sizes
    • kittens in greyscale or color
  • Now we write a small program to grab an arbitrary square from placekitten by asking for the size on standard input: placekitten_raw_input.py (sketched below)
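
Here is a minimal sketch of what such a program might look like; it is not the course's placekitten_raw_input.py itself, and it assumes placekitten's width/height URL scheme:

    import requests

    # ask for the square size on standard input
    size = input("Size of the kitten picture in pixels: ")

    # placekitten takes the width and height right in the URL, e.g. /200/200
    url = "http://placekitten.com/%(size)s/%(size)s" % {"size": size}

    response = requests.get(url)

    # pictures are binary data, so open the output file in "wb" mode
    kitten_file = open("kitten.jpg", "wb")
    kitten_file.write(response.content)
    kitten_file.close()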


Introduction to structured data (JSON, JavaScript Object Notation)
  • what is JSON: a widely used text format for structured data
  • import json; json.loads()
  • looks a lot like Python (except only double quotes are allowed, no single quotes)
  • simple lists, dictionaries
  • can reflect more complicated data structures
  • Example file at http://mako.cc/cdsw.json
  • You can parse data directly with .json() on a requests response (see the sketch below)
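
A short sketch of both ways to turn JSON into Python data; the JSON string below is invented for illustration:

    import json
    import requests

    # json.loads() turns a JSON string into Python lists and dictionaries
    text = '{"title": "Harry Potter", "minor": false, "tags": ["wiki", "edit"]}'
    data = json.loads(text)
    print(data["title"])

    # requests can do the parsing for you with .json()
    response = requests.get("http://mako.cc/cdsw.json")
    print(response.json())
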
Using other APIs
  • every API is different, so read the documentation!
  • If the documentation isn't helpful, search online
  • for popular APIs, there are Python modules that help you make requests and parse the JSON

Possible issues:

  • rate limiting (see the sketch below)
  • authentication
  • text encoding issues
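
For example, one common way to cope with rate limiting, assuming the API signals it with HTTP status 429 (check your API's documentation, since many signal it differently):

    import time
    import requests

    response = requests.get("https://api.example.com/items")

    # many APIs answer with status code 429 when you are making requests too fast
    if response.status_code == 429:
        time.sleep(60)  # wait a minute, then try the same request again
        response = requests.get("https://api.example.com/items")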

Other Potentially Useful Resources

My friend Frances gave a version of this lecture last year and created slides. They are written for Python 2, so the code might not all work (remember to use print() with parentheses), but the basic ideas might be helpful:

  • Slides (ODP LibreOffice format, for editing and modification): http://mako.cc/teaching/2014/cdsw-autumn/lecture2-web_apis.odp