Community Data Science Course (Spring 2023)/Week 4 coding challenges

From CommunityData

There's actually nothing to download this time, so you can simply start with a fresh Jupyter notebook! Be sure to give it a nice descriptive name, as always.

Although there's nothing to download, you will likely want to refer to the API documentation linked in each section below as you work through these exercises.

#1 Bored API

First, let's work through a few examples from the Bored API. In order to answer these, you'll need access to the API activity endpoint (https://www.boredapi.com/api/activity) and the documentation at http://www.boredapi.com/. My strong advice is to start by building on the code I wrote in class, which is in the week 4 lecture part 1 Jupyter notebook.
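The request-and-format pattern from the lecture can be sketched like this. The `describe` helper and the sample response values are my own illustration; the field names (`activity`, `accessibility`, `price`) follow the Bored API documentation:

```python
def describe(activity: dict) -> str:
    # Build a sentence from the fields the API documents:
    # "activity" (the suggestion), "accessibility" (0.0 = most
    # accessible), and "price" (0.0 = free).
    return (f"You could: {activity['activity']} "
            f"(accessibility: {activity['accessibility']}, "
            f"price: {activity['price']})")

# Illustrative response; real values come from the API.
sample = {"activity": "Go for a walk", "type": "recreational",
          "participants": 2, "price": 0, "accessibility": 0.1}

print(describe(sample))

# Live version (requires network access and the requests library):
# import requests
# response = requests.get("https://www.boredapi.com/api/activity",
#                         params={"participants": 2})
# print(describe(response.json()))
```

Separating the "turn JSON into a sentence" step into a function like this makes it easy to reuse once you start requesting 5 activities at a time.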

  1. Write some Python code to use the API to get an activity that I could do with my partner. Turn the data from the API into a sentence that tells me the activity, its accessibility, and its price. (Note: this used to say "free activity"; we removed that because it's more boring that way.)
    1. Extend that code so that it gets me 5 activities and writes nicely formatted output.
  2. First, get a totally random activity from the API. Print it out, along with its activity type. Now get me another activity that's of the same type as the first random activity. Print it out too, plus its activity type (to check that they're the same).
  3. Write a Python program that prints out one random activity of each type that the Bored API supports. (See if you can use a loop for this!)
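For the last exercise, a loop over the activity types can drive repeated requests. This sketch just builds the URLs; the type list is taken from the Bored API documentation, and the live request is left commented out since it needs network access:

```python
# Activity types listed in the Bored API documentation.
TYPES = ["education", "recreational", "social", "diy", "charity",
         "cooking", "relaxation", "music", "busywork"]

def activity_url(activity_type: str) -> str:
    # requests.get(..., params={"type": t}) builds the same query
    # string; spelling it out here makes the URL explicit.
    return f"https://www.boredapi.com/api/activity?type={activity_type}"

for t in TYPES:
    print(activity_url(t))
    # Live version (requires network and the requests library):
    # data = requests.get("https://www.boredapi.com/api/activity",
    #                     params={"type": t}).json()
    # print(data["type"], "->", data["activity"])
```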

#2 Learning a new API

In this exercise, I want you to practice learning to use a new API and reading its documentation. We're going to start with the Dog API, which is online at https://dogapi.dog

  1. Visit the Dog API website and read the API documentation
  2. Write a URL that will return a single dog fact (you don't need to turn in any Python code for this! just the link is fine!)
  3. Write a URL that will return 5 dog facts at once (just the link is fine)
  4. Take your URL that requests a single dog fact and put it into a Python program that uses requests.get() and passes in parameters with the params= argument. Your program should print out just the fact itself, not the full JSON object.
  5. Finally, write a for loop that gets sets of 5 facts 5 times (you can just try something like for number in [1, 2, 3, 4, 5]), downloads the lists of dog facts, and then writes out a new tab-separated values (TSV) file with two columns: (1) the ID of the dog fact, and (2) the fact itself!
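The write-to-TSV step in the last exercise might look like the sketch below. The `facts_to_rows` helper is mine; the response shape (a `data` list of objects, each with an `id` and an `attributes.body` field) matches what the Dog API documentation describes at the time of writing, but double-check the docs:

```python
import csv

def facts_to_rows(payload: dict) -> list:
    # Each fact sits in payload["data"] with an "id" and its
    # text under attributes["body"].
    return [(fact["id"], fact["attributes"]["body"])
            for fact in payload["data"]]

# Illustrative response (shape per the docs; values made up).
sample = {"data": [{"id": "abc-123", "type": "fact",
                    "attributes": {"body": "Dogs have three eyelids."}}]}

with open("dog_facts.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["id", "fact"])
    for number in [1, 2, 3, 4, 5]:
        # Live version (requires network and the requests library):
        # payload = requests.get("https://dogapi.dog/api/v2/facts",
        #                        params={"limit": 5}).json()
        payload = sample  # stand-in so the sketch runs offline
        writer.writerows(facts_to_rows(payload))
```

Note that `csv.writer` with `delimiter="\t"` handles the TSV quoting details for you, so the loop only has to worry about getting the rows in the right order.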

#3 Mapping!

This set of questions will all require the Nominatim API. As always, the API documentation is online. My strong advice is to look at the code in the week 4 lecture part 2 Jupyter notebook and to closely watch the video I recorded after class.
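Building the query parameters works the same way for all of these questions. The parameter names below come from the Nominatim documentation; the helper function, the example query, and the User-Agent string are my own, so adapt them to your searches:

```python
def nominatim_params(query: str, limit: int = 50) -> dict:
    # Parameters for Nominatim's /search endpoint: format="json"
    # asks for JSON output, and limit raises the default cap on
    # how many results come back.
    return {"q": query, "format": "json", "limit": limit}

print(nominatim_params("gas station Duvall WA"))

# Live comparison (requires network; Nominatim's usage policy asks
# for a descriptive User-Agent header):
# import requests
# for town in ["Duvall", "Carnation"]:
#     results = requests.get("https://nominatim.openstreetmap.org/search",
#                            params=nominatim_params(f"gas station {town} WA"),
#                            headers={"User-Agent": "cdsc-week4-exercise"}).json()
#     print(town, len(results))
```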

  1. Are there more gas stations in Duvall or Carnation?
  2. Are there more dentists near the University or near Downtown? (you will need to look at the limit on the number of returned items)
  3. Use the geocoding API endpoint in Nominatim to look up a specific latitude and longitude of your choice (try this building or your hometown).
  4. Write a program to find all the ziplines in King County, Washington (or at least all the ones that OSM knows about!).
  5. It's important to be alert and well-caffeinated when ziplining! Once you've found the ziplines, have your program use those results to find the nearest cafe to each of the ziplines you identified!
  6. Craft a query using the search API to find colleges in Seattle. (Hint: you'll want to set bounded=1 and use the viewbox). Print the name and location of every college you find.
    1. Modify your query so that it includes the address details separated out (this is an API option you can find in the documentation).
    2. How can you tell that a place returned by the API is in fact a college?
    3. Print the list of colleges into a new TSV file with the following columns: osm_id (a unique ID that OSM uses), the name of the college, latitude, and longitude.
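The final TSV step can reuse the same pattern as the dog facts file. The `college_rows` helper and the sample result are mine; the field names (`osm_id`, `display_name`, `lat`, `lon`) are what Nominatim's JSON results use, and the commented-out live query shows one plausible `bounded`/`viewbox` combination (the viewbox coordinates are a rough guess at Seattle, not an authoritative bounding box):

```python
import csv

def college_rows(results: list) -> list:
    # One row per result, in the column order the exercise asks for.
    return [[r["osm_id"], r["display_name"], r["lat"], r["lon"]]
            for r in results]

# Illustrative result (fields per Nominatim's JSON output; values made up).
sample = [{"osm_id": 12345, "display_name": "Example College, Seattle",
           "lat": "47.65", "lon": "-122.30", "type": "college"}]

with open("colleges.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["osm_id", "name", "latitude", "longitude"])
    writer.writerows(college_rows(sample))

# Live query (requires network): bounded=1 restricts results to the
# viewbox, given as left,top,right,bottom around Seattle.
# import requests
# results = requests.get("https://nominatim.openstreetmap.org/search",
#                        params={"q": "college", "format": "json",
#                                "bounded": 1,
#                                "viewbox": "-122.46,47.73,-122.22,47.49",
#                                "limit": 50},
#                        headers={"User-Agent": "cdsc-week4-exercise"}).json()
```

The `type` field in each result is also one way to answer the "how can you tell it's a college?" question above.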