Community Data Science Course (Spring 2023)/Week 4 lecture notes
Using APIs to download data from the internet
An API (Application Programming Interface) is a structured way for two programs to communicate. Think of it like a contract or a secret handshake. APIs exist both on the Internet and within Python; in a sense, we've already been using some APIs in Python.
An interface typically has two parts:
- A description of how to request something
- A description of what one will get in return
Once you understand those two things, you know the API. An API within Python typically includes a set of functions.
A web API is a lot like a set of functions in Python, but it describes how a program running on your computer can talk to another computer running a website. Basically, it's like a website your programs can visit (you : a website :: your program : a web API).
Examples:
- The API for Twitter describes how to read tweets, write tweets, and follow people. See details here: https://dev.twitter.com/
- Yelp has an API described here: https://www.yelp.com/developers
- Zillow's API: https://www.zillow.com/howto/api/APIOverview.htm
Questions to consider when choosing an API
- Where is the documentation? Are there examples or code samples?
- Are there any rate limits or restrictions on use? For instance, Twitter doesn't want you downloading tweets in bulk, and Zillow forbids storing bulk results. (Why?)
- Is there a Python package that will help me? For instance, Twitter has a great Python package called tweepy that simplifies access.
- All the things on the checklist below!
Checklist: How do we use an API to fetch datasets?
Basic idea: your program sends a request, the API sends data back:
- Where do you direct your request? (i.e., what are the site's API endpoints)
- For example: Wikipedia's web API endpoint is http://en.wikipedia.org/w/api.php
- How do you write your request? Put together a URL; it will be different for different web APIs.
- Check the documentation, look for code samples
- How do you send a request?
- Often the simplest way is to try it in your browser
- Python has modules you can use to make HTTP requests; we'll use the requests library, which has excellent documentation.
- What do you get back?
- Structured data (usually in the JSON format).
- JSON is JavaScript Object Notation. JSON data looks like Python lists and dictionaries, and we'll see that it's easy to turn it into a Python variable that is a list or dictionary.
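For example, here's a small made-up JSON snippet (not from a real API call) and how Python's built-in json module turns it into a dictionary, just like response.json() will do for us later:

```python
import json

# A small, made-up JSON snippet, as it might arrive from a web API.
raw = '{"activity": "Go to a concert", "type": "music", "participants": 2}'

data = json.loads(raw)   # parse the JSON string into a Python value
print(type(data))        # <class 'dict'>
print(data["activity"])  # Go to a concert
```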
- How do you understand (i.e. parse) the data?
- Firefox can display and format it automatically
- We can draw it out with https://jsonformatter.curiousconcept.com/
- When it's time to do it in Python, we can use the .json() method on the response object we get back from the requests library!
How do we write Python programs that make web requests?
To use APIs to build a dataset we will need:
- all our tools from last session: variables, etc [DONE!]
- the ability to open URLs on the web
- the ability to create custom URLs
- the ability to understand (i.e., parse) JSON data that APIs usually give us
- the ability to save to files [DONE!]
Our first API: Bored API
- First of all, let's check out this page: http://www.boredapi.com/
- Let's click through the about page and the documentation and try stuff on their web interface
- Try the random endpoint
- The output is JSON
- JSON
- HTML versus JSON
- The good news is that JSON is (almost!) the same as Python! Just lists, strings, dictionaries, integers, floats, etc. (e.g., what Python type is each key?)
- Most APIs will return JSON directly
- It's often helpful to format JSON to understand it
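You can also format JSON from within Python using the built-in json module; json.dumps with an indent argument pretty-prints any value (the data here is a hypothetical example):

```python
import json

# Hypothetical API result; json.dumps re-prints it with indentation.
data = {"activity": "Learn a new language", "type": "education", "participants": 1}
print(json.dumps(data, indent=2))
```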
- Passing parameters to an API
- let's go back and look at the documentation
- let's request things based on a specific number of participants
- let's try to request things based on a price range (I'll give you all 3-4 minutes to try)
- Making requests in Python
import requests
response = requests.get(URL, params={})
print(response.status_code)
- The response object also contains .url, which is pretty useful!
data = response.json()
- Now we can check its type and poke around in it (I typically use tab completion!)
- e.g., let's work through a quick example
- Let's put it into a Python program to print out one activity for 1 through 5 people!
- Let's add the type of activity to what we print out
- Let's add another parameter (maybe a price range?)
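A sketch of the "one activity for 1 through 5 people" exercise might look like the following. The endpoint path and the participants parameter come from the Bored API documentation; the activity_url helper builds the query URL by hand so you can see exactly what gets sent, and the live fetch is wrapped in a try/except in case the network (or the API) is unavailable:

```python
from urllib.parse import urlencode

BASE = "https://www.boredapi.com/api/activity"  # endpoint from the Bored API docs

def activity_url(**params):
    """Build the query URL by hand, so we can see exactly what gets requested."""
    return BASE + "?" + urlencode(params)

print(activity_url(participants=2))

# Actually fetching requires network access and the requests library:
try:
    import requests
    for n in range(1, 6):
        data = requests.get(BASE, params={"participants": n}, timeout=5).json()
        print(n, data["activity"], data["type"])
except Exception:
    print("skipping live requests (no network?)")
```

Adding another parameter (say, a price range) is just another key in the dictionary we pass as params.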
Introducing the OSM Nominatim API
We're going to spend today looking at OpenStreetMap's API, called Nominatim.
simple request:
import requests
response = requests.get('https://nominatim.openstreetmap.org/search', params={'q': '[bakery] near seattle wa', 'format': 'json'})
print(response.status_code) # 200 means it worked.
data = response.json()
print(type(data))
Do this:
Go to https://nominatim.openstreetmap.org/search?q=[bakery]+near+seattle+wa&format=json to see the same query in JSON format.
Let's break down each line:
import requests
imports the library so we can use it.
response = requests.get('https://nominatim.openstreetmap.org/search', params={'q': '[bakery] near seattle wa', 'format': 'json'})
This is the most important line! Here, we "get" information from the web server. Note that we pass the URL up to the "?" character as the first argument. Compare the dictionary second argument to the query we did above in our browser. How do they differ? How are they the same?
print(response.status_code)
The response is a Python object that contains the actual contents of the web page as well as some status information. Here, we're getting the status_code, which tells us whether the call succeeded. 200 is "good", and you will sometimes see 404 for "not found" or 500 for "server error".
print(response.content)
This wasn't above, but try it anyway. response.content contains the raw contents of the webpage as a string.
data = response.json()
response.json() tries to convert the string content to a Python list or dictionary if the content was stored in JSON format. (What happens if the content wasn't JSON?)
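To answer that last question: if the content isn't valid JSON, parsing fails with an error. You can see the same behavior with Python's built-in json module (response.json() raises a similar ValueError):

```python
import json

# HTML (or any other non-JSON text) makes the parser raise an error.
bad = "<html>This is not JSON</html>"
try:
    data = json.loads(bad)
except json.JSONDecodeError as err:  # a subclass of ValueError
    data = None
    print("parse failed:", err)
```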
Now let's break down the result:
[
{
"place_id":"21583441",
"licence":"Data © OpenStreetMap contributors, ODbL 1.0. http:\/\/www.openstreetmap.org\/copyright",
"osm_type":"node",
"osm_id":"2131716956",
"boundingbox":[
"47.6248735",
"47.6249735",
"-122.3207478",
"-122.3206478"
],
"lat":"47.6249235",
"lon":"-122.3206978",
"display_name":"The Confectional, 618, Broadway East, Eastlake, Capitol Hill, Seattle, King County, Washington, 98102, United States of America",
"class":"shop",
"type":"bakery",
"importance":0.201,
"icon":"http:\/\/nominatim.openstreetmap.org\/images\/mapicons\/shopping_bakery.p.20.png"
}
]
Things to realize:
- We get a list containing dictionaries, and some of the values in those dictionaries are themselves lists!
- We're given latitude and longitude. It's important to be able to find these!
- Right-clicking on the map at https://openstreetmap.org seems to work
- Google Maps: (1) On your computer, open Google Maps. (2) Right-click the place or area on the map. This will open a pop-up window. You can find your latitude and longitude in decimal format at the top. (3) To copy the coordinates automatically, left click on the latitude and longitude.
- This website: https://www.latlong.net/
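For instance, starting from an abbreviated version of the parsed result shown above (as a plain Python value), pulling out the coordinates looks like this. Note that Nominatim returns lat and lon as strings, so we convert them with float():

```python
# Abbreviated version of the parsed JSON result shown above.
results = [
    {
        "display_name": "The Confectional, 618, Broadway East, Seattle, ...",
        "lat": "47.6249235",
        "lon": "-122.3206978",
        "class": "shop",
        "type": "bakery",
    }
]

place = results[0]         # the first (and here, only) match
lat = float(place["lat"])  # the strings need converting to numbers
lon = float(place["lon"])
print(place["display_name"])
print(lat, lon)
```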
Additional examples:
Let's look at the result limitation section of the documentation and try two things:
- Let's read the documentation
- Let's write a program to ask for more bakeries than the ones we've been given.
- Let's ask for a list of bakeries that are within the university district
- Let's plug the whole thing into Python
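One way to ask for more results is Nominatim's documented limit parameter. The sketch below builds the request without sending it, so you can inspect the final URL; sending it works the same way with requests.get and the same params dictionary:

```python
import requests

# Prepare (but don't send) a request, to inspect the URL requests would use.
req = requests.Request(
    "GET",
    "https://nominatim.openstreetmap.org/search",
    params={"q": "[bakery] near seattle wa", "format": "json", "limit": 20},
)
url = req.prepare().url
print(url)
```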
Introduce the problem set
FAQ
- What if there's no API?
- Sometimes, the only way to get data is to extract it from potentially messy HTML. This is called scraping, and Python has a library called BeautifulSoup to help with that.