DS4UX (Spring 2016)/Day 5 lecture
From CommunityData
[[File:Highfivekitten.jpeg|200px|thumb|In which you learn how to use Python and web APIs to meet the likes of her!]]

=== Introduction and context ===
You can manipulate data in Python now. Congratulations! Today we'll put your new skills to good use. You'll learn how to gather data from the internet using APIs (Application Programming Interfaces) and JSON ("JavaScript Object Notation"). You can manipulate this data in Python to answer questions (just as you did with the BabyNames, Wordplay, and Seattle Traffic datasets), and then export your findings to a file for later use or additional analysis.

=== Outline ===
;Lecture 1 (Ray)
* What is an API?
* How do we use one to fetch interesting datasets?
* How do we write programs that use the internet?
;API exercise 1 (Jonathan)
* How can we use the placekitten API to fetch kitten pictures?
;Lecture 2 (Ray)
* Introduction to structured data (JSON)
* JSON and Python
;API exercise 2 (Jonathan)
* How do we build API queries to gather data about Wikipedia articles and editors?

== Lecture 1: What is an API? ==
* API: a structured way for programs to talk to each other (that is, an interface for programs)
* Web APIs: like a website your programs can visit (you : a website :: your program : a web API)

=== Download the Week 5 lecture scripts and data ===
<big>'''[http://jtmorgan.net/ds4ux/week5/lecture.zip Click here to download the week 5 lecture scripts and data examples]'''</big>
# The ".zip" extension on the file above indicates that it is a compressed Zip archive. We need to "extract" its contents.
# Start up your terminal and navigate to the newly unpacked directory called <code>lecture</code>.

=== How do we use an API to fetch datasets? ===
Basic idea: your program sends a request, and the API sends data back.
* Where do you direct your request? To the site's API endpoint.
** For example, Wikipedia's web API endpoint is http://en.wikipedia.org/w/api.php
* How do I write my request?
** Put together a URL; the format differs from one web API to another.
** Check the documentation and look for code samples.
* How do you send a request?
** Python has modules you can use, like <code>requests</code>, which make HTTP requests for you.
* What do you get back?
** Structured data (usually in the JSON format)
* How do you understand (i.e., parse) the data?
** There's a module for that, too! (<code>requests</code> can parse JSON responses for you)

== Exercise 1: How do we use APIs to get kitten pictures? ==
[http://placekitten.com/ placekitten.com]
* An API that takes specially crafted URLs and returns an appropriately sized picture of a kitten
* Exploring placekitten in a browser:
** visit the API documentation
** kittens of different sizes
** kittens in greyscale or color
* Now we are ready to walk through a small program (<code>placekitten_input.py</code>) that grabs a random picture of a user-specified height and width from placekitten.com and saves that image as a .jpg file on your computer.

== Lecture 2: Using data from APIs in Python ==

=== How do we write Python programs that make web requests? ===
To use APIs to build a dataset, we will need:
* all our tools from the last session: variables, etc.
* the ability to open URLs on the web
* the ability to create custom URLs
* the ability to save to files
* the ability to understand (i.e., parse) the JSON data that APIs usually give us

=== Introduction to structured data (JSON, "JavaScript Object Notation") ===
* What is JSON? A useful format for more structured data
* <code>import json</code>; <code>json.load()</code>, <code>json.loads()</code>
* looks like Python (except: no single quotes)
* simple lists and dictionaries
* can represent more complicated, nested data structures
* You can parse a response directly with <code>.json()</code> on a <code>requests</code> call
* You can parse JSON ''files'' with <code>json.load()</code>
* You can parse JSON ''strings'' with <code>json.loads()</code> (the 's' is for 'string')
* Example JSON file: <code>family.json</code>
* Script that parses this file into a CSV: <code>family_json_to_csv.py</code>
* Script that parses and prints a ''JSON string'' version of this example file: <code>parse_json.py</code>

=== Using other APIs ===
* Every API is different, so read the documentation!
* If the documentation isn't helpful, search online.
* For popular APIs, there are Python modules that help you make requests and parse the JSON.
Possible issues:
* rate limiting
* authentication
* text encoding issues

== Week 5 project: How do we use APIs to get data from Wikipedia? ==
* Brief intro to Wikipedia
* Overview of what you can get from [[DS4UX_(Spring_2016)/Wikipedia_API|the Wikipedia API]]
* Example queries in the [https://en.wikipedia.org/wiki/Special:ApiSandbox Wikipedia API sandbox]
* Introduce [[DS4UX_(Spring_2016)/Day_5_coding_challenge|week 5 coding challenges]]

[[Category:DS4UX (Spring 2016)]]
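The build-a-URL, fetch, and parse steps described above can be sketched in a few lines of Python. The endpoint is Wikipedia's, as given in the lecture notes, but the query parameters and the response string below are made-up illustrations (a real response would come back from the endpoint itself, e.g. via <code>requests.get(url).json()</code>):

```python
import json
from urllib.parse import urlencode

# Build a query URL for the Wikipedia web API (endpoint from the lecture notes).
# These particular parameters are just one example; see the API documentation.
endpoint = "http://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "titles": "Seattle",
    "prop": "info",
    "format": "json",
}
url = endpoint + "?" + urlencode(params)
print(url)

# In a real program you would now fetch the URL, for example:
#   import requests
#   data = requests.get(url).json()
# Here we parse a hypothetical response string instead, to show the
# JSON -> Python step without needing a network connection.
response_text = '{"query": {"pages": {"5391": {"title": "Seattle", "length": 250000}}}}'
data = json.loads(response_text)  # JSON string -> nested Python dicts

for page_id, page in data["query"]["pages"].items():
    print(page["title"], page["length"])
```

Note that <code>json.loads()</code> turns the JSON text into ordinary Python dictionaries and lists, so everything you already know about indexing and looping applies to API data too.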