Community Data Science Course (Spring 2023)/Week 4 lecture notes

Using APIs to download data from the internet
API (Application Programming Interface) is a structured way for two programs to communicate. Think of it like a contract or a secret handshake. APIs exist on the Internet, and, in a sense, we've already been using APIs in Python.

An interface typically has two parts:


 * A description of how to request something
 * A description of what one will get in return

Once you understand those two things, you know the API. An API within Python typically includes a set of functions.
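The "describe the request, describe the return" pattern is visible even in Python's built-in functions. For example, sorted() documents what you hand it (an iterable, plus an optional key function) and what you get back (a new sorted list):

```python
# Python's built-in sorted() is a tiny API: the "request" is an iterable
# (plus an optional key function), and the "return" is a new sorted list.
words = ["pear", "fig", "banana"]
alphabetical = sorted(words)
by_length = sorted(words, key=len)
print(alphabetical)  # ['banana', 'fig', 'pear']
print(by_length)     # ['fig', 'pear', 'banana']
```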

A web API is quite a lot like a set of functions in Python, but it describes how a program running on your computer can talk to another computer running a website. Basically, it's like a website your programs can visit (you : a website :: your program : a web API).

Examples:
 * The API for Twitter describes how to read tweets, write tweets, and follow people. See details here: https://dev.twitter.com/
 * Yelp has an API described here: https://www.yelp.com/developers
 * Zillow's API: https://www.zillow.com/howto/api/APIOverview.htm

Questions to consider when choosing an API

 * 1) Where is the documentation? Are there examples or code samples?
 * 2) Are there any rate limits or restrictions on use? For instance, Twitter limits how many tweets you can download and how quickly. Zillow forbids storing bulk results. (Why?)
 * 3) Is there a Python package that will help me? For instance, Twitter has a great Python package called tweepy that simplifies access.
 * 4) All the things on the checklist below!

Checklist: How do we use an API to fetch datasets?
Basic idea: your program sends a request, the API sends data back:


 * Where do you direct your request? (i.e., what are the site's API endpoints?)
    * For example: Wikipedia's web API endpoint is http://en.wikipedia.org/w/api.php
 * How do you write your request?
    * Put together a URL; it will be different for different web APIs.
    * Check the documentation and look for code samples.
 * How do you send a request?
    * Often the simplest way is to try it in your browser.
    * Python has modules you can use to make HTTP requests; we'll use the requests library, which has excellent documentation.
 * What do you get back?
    * Structured data (usually in the JSON format).
    * JSON is JavaScript Object Notation. JSON data looks like Python lists and dictionaries, and we'll see that it's easy to turn it into a Python variable that is a list or dictionary.
 * How do you understand (i.e., parse) the data?
    * We can display it in Firefox, which formats it automatically.
    * We can draw it out with https://jsonformatter.curiousconcept.com/
    * When it's time to do it in Python, we can use the .json() method on the response from the requests module!
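Here is a small JSON sample and how it maps onto Python types. The activity text is made up for illustration; json.loads() is Python's standard way to turn JSON text into a Python value:

```python
import json

# A tiny piece of JSON text: it looks just like a Python dictionary
text = '{"activity": "Learn a new language", "type": "education", "participants": 1}'

data = json.loads(text)   # turn the JSON text into a Python dictionary
print(type(data))         # <class 'dict'>
print(data["activity"])   # look up values by key, like any dictionary
```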

Our first API: Bored API

 * First of all, let's check out this page: http://www.boredapi.com/
 * Let's click through the about page and the documentation and try stuff in their web interface
    * Try the random endpoint
    * The output is JSON
 * JSON
    * HTML versus JSON
    * The good news is that JSON is (almost!) the same as Python! Just lists, strings, dictionaries, integers, floats, etc. (e.g., what type is the key?)
    * Most APIs will return JSON directly
    * It's often helpful to format JSON to understand it
 * Passing parameters to an API
    * Let's go back and look at the documentation
    * Let's request things based on a specific number of participants
    * Let's try to request things based on a price range (I'll give you all 3-4 minutes to try)
 * Making requests in Python
    * The response object also contains .url, which is pretty useful!
    * Save the response in a variable; now we can check its type and poke around in it (I typically use tab completion!)
    * e.g., let's work through a quick example
    * Let's put it into a Python program to print out one activity for 1 through 5 people!
    * Let's add the type of activity to what we print out
    * Let's add another parameter (maybe a price range?)
    * Let's show how to add parameters via dictionaries
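A sketch of the program described above. The /api/activity endpoint and the participants parameter are taken from the Bored API documentation; if the site has changed, the names may differ:

```python
import requests

BASE_URL = "http://www.boredapi.com/api/activity"  # endpoint from the Bored API docs

def fetch_activity(participants):
    """Ask the API for one activity for the given number of participants."""
    response = requests.get(BASE_URL, params={"participants": participants})
    return response.json()  # a dict with keys like "activity" and "type"

if __name__ == "__main__":
    for n in range(1, 6):
        result = fetch_activity(n)
        print(n, result["activity"], "(" + result["type"] + ")")
```

Passing the parameters as a dictionary lets requests build the query string (?participants=2) for us.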

Introducing the OSM Nominatim API
We're going to spend today looking at OpenStreetMap's API, called Nominatim.


 * Visiting the website to play around with it first: let's search for "bakery"
 * Let's pull up the documentation!
 * These query strings have a particular form, and there are often multiple keywords: in, near, etc.
    * bakery in seattle; bakery in snohomish; bakery in bellevue
    * Passing in [] brackets for amenities
 * If we want to do it with Python, we can just reproduce the URL the same way
    * Let's do it with Python!
    * What if we want to have spaces? Uh oh. URLs can't have spaces...
    * Instead, we can use parameters to query the API
    * If we go back to boredapi, it turns out we can do that too
    * Let's turn the URL into a variable too!
 * Understanding the output and extracting information
    * Go to the formatter
 * Using bounded and viewbox to limit where we search
    * Looking up the latitude/longitude
    * Passing in viewbox data from the website
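The spaces problem can be seen (and solved) without even sending a request. requests.Request lets us prepare a request and inspect the URL it would use; the q and format parameters come from the Nominatim documentation:

```python
import requests

# Building the query by hand fails because URLs can't contain spaces;
# passing a params dictionary lets requests encode them for us.
endpoint = "https://nominatim.openstreetmap.org/search"
query = {"q": "bakery in seattle", "format": "json"}

# Prepare the request without sending it, just to see the URL requests builds
prepared = requests.Request("GET", endpoint, params=query).prepare()
print(prepared.url)  # the space becomes "+": ...?q=bakery+in+seattle&format=json
```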

Details on the Nominatim API
Simple request:
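The code being broken down isn't reproduced in these notes; a sketch consistent with the description might look like this (the query itself is just an example, and the User-Agent string is something we made up, since Nominatim's usage policy asks clients to identify themselves):

```python
import requests  # imports the library so we can use it

def simple_request():
    # "get" information from the web server: the URL up to the "?" is the
    # first argument; the query goes in the params dictionary
    response = requests.get(
        "https://nominatim.openstreetmap.org/search",
        params={"q": "bakery in seattle", "format": "json"},
        headers={"User-Agent": "cdsc-week4-notes"},  # identify ourselves to Nominatim
    )
    print(response.status_code)  # 200 = succeeded; 404 = not found; 500 = server error
    print(response.content)      # the raw contents of the page
    return response.json()       # convert the JSON text into Python lists/dictionaries

if __name__ == "__main__":
    data = simple_request()
```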

Do this: Go to  to see the same query in JSON format.

Let's break down each line:

This is the most important line! Here, we "get" information from the web server with requests.get(). Note that we pass the URL up to the "?" character as the first argument. Compare the dictionary second argument to the query we did above in our browser. How do they differ? How are they the same?
 * import requests imports the library so we can use it.
 * The response is a Python object that contains the actual contents of the web page as well as some status information. Here, we're getting the status_code, which tells us whether the call succeeded. 200 is "good", and you will sometimes see 404 for "not found" or 500 for "server error".
 * This wasn't above, but try it anyway: response.content contains the raw contents of the webpage as a string.
 * response.json() tries to convert the string content to a Python list or dictionary if the content was stored in JSON format. (What happens if the content wasn't JSON?)

Now let's break down the result:

Things to realize:


 * We get back a list containing multiple dictionaries, and some of the values in those dictionaries are themselves lists!
 * We're given latitude and longitude. It's important to be able to find these!
 * Right clicking on https://openstreetmap.org seems to work
 * Google Maps: (1) On your computer, open Google Maps. (2) Right-click the place or area on the map. This will open a pop-up window. You can find your latitude and longitude in decimal format at the top. (3) To copy the coordinates automatically, left click on the latitude and longitude.
 * This website: https://www.latlong.net/
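Pulling the coordinates out in Python looks something like this. The list below is a hand-made stand-in shaped like a trimmed-down Nominatim result, not real API output:

```python
# Hand-made stand-in shaped like a (trimmed) Nominatim result: a list of
# dictionaries, with latitude and longitude stored as *strings*
results = [
    {"display_name": "Example Bakery, Seattle", "lat": "47.6062", "lon": "-122.3321"},
    {"display_name": "Another Bakery, Seattle", "lat": "47.6097", "lon": "-122.3331"},
]

coordinates = []
for place in results:
    # convert the strings to floats before doing any math with them
    coordinates.append((float(place["lat"]), float(place["lon"])))

print(coordinates)
```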

Additional examples:

Let's look at the result limitation section of the documentation and try a few things:


 * 1) Let's read the documentation
 * 2) Let's write a program to ask for more bakeries than the ones we've been given.
 * 3) Let's ask for a list of bakeries that are within the University District
 * 4) Let's plug the whole thing into Python
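A sketch putting the pieces together. limit, viewbox, and bounded are documented Nominatim parameters (viewbox is left,top,right,bottom and bounded=1 keeps results inside it); the coordinates below are rough stand-ins, not a carefully checked University District bounding box:

```python
import requests

def bounded_search(query, viewbox, limit=50):
    """Search Nominatim inside a bounding box, asking for up to `limit` results."""
    response = requests.get(
        "https://nominatim.openstreetmap.org/search",
        params={
            "q": query,
            "format": "json",
            "limit": limit,      # the default is only 10 results
            "viewbox": viewbox,  # "left,top,right,bottom" longitudes/latitudes
            "bounded": 1,        # only return results inside the viewbox
        },
        headers={"User-Agent": "cdsc-week4-notes"},  # identify ourselves to Nominatim
    )
    return response.json()

if __name__ == "__main__":
    # rough stand-in coordinates around Seattle's University District
    for bakery in bounded_search("bakery", "-122.32,47.67,-122.29,47.65"):
        print(bakery["display_name"])
```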

FAQ

 * What if there's no API? Sometimes, the only way to get data is to extract it from potentially messy HTML. This is called scraping, and Python has a library called BeautifulSoup to help with that.