Community Data Science Course (Spring 2023)/Week 4 lecture notes
== Using APIs to download data from the internet ==

An '''API (Application Programming Interface)''' is a structured way for two programs to communicate. Think of it like a contract or a secret handshake. APIs exist on the Internet and, in a sense, we've already been using some APIs in Python.

An interface typically has two parts:
* A description of ''how to request something''
* A description of ''what one will get in return''

Once you understand those two things, you know the API.

An API ''within Python'' typically includes a set of functions. A web API is a lot like functions in Python, but it describes how a program running on your computer can talk to another computer running a website. Basically, it's like a website your programs can visit (you:a website::your program:a web API).

Examples:
* The Twitter API describes how to read tweets, write tweets, and follow people. See details here: https://dev.twitter.com/
* Yelp has an API described here: https://www.yelp.com/developers
* Zillow's API: https://www.zillow.com/howto/api/APIOverview.htm

=== Questions to consider when choosing an API ===

# Where is the documentation? Are there examples or code samples?
# Are there any rate limits or restrictions on use? For instance, Twitter doesn't want you bulk-downloading tweets, and Zillow forbids storing bulk results. (Why?)
# Is there a Python package that will help me? For instance, Twitter has a great Python package called tweepy that simplifies access.
# All the things on the checklist below!

=== Checklist: How do we use an API to fetch datasets? ===

Basic idea: your program sends a request, and the API sends data back.
* Where do you direct your request? (i.e., what are the site's API ''endpoints''?)
** For example: Wikipedia's web API endpoint is http://en.wikipedia.org/w/api.php
* How do you write your request? Put together a URL; it will be different for different web APIs.
** Check the documentation and look for code samples.
* How do you send a request?
** Often the simplest way is to try it in your browser.
** Python has modules you can use to make HTTP requests, like <code>requests</code>, which has excellent documentation [http://docs.python-requests.org/en/latest/api/ here].
* What do you get back?
** Structured data, usually in JSON format.
*** JSON is ''JavaScript Object Notation''. JSON data looks like Python lists and dictionaries, and we'll see that it's easy to turn it into a Python variable that is a list or dictionary. (We'll see a sample below.)
* How do you understand (i.e., parse) the data?
** Firefox can display it automatically.
** We can draw it out with https://jsonformatter.curiousconcept.com/
** When it's time to do it in Python, we can use the <code>.json()</code> method on the response object from the requests module!

== Our first API: Bored API ==

* First of all, let's check out this page: http://www.boredapi.com/
** Let's click through the about page and the documentation and try stuff on their web interface.
*** Try the random endpoint.
*** The output is JSON.
* JSON
** HTML versus JSON
** The good news is that JSON is (almost!) the same as Python! Just lists, strings, dictionaries, integers, floats, etc. (e.g., what type is each key?)
** Most APIs will return JSON directly.
** It's often helpful to format JSON to understand it.
* Passing parameters to an API
** Let's go back and look at the documentation.
** Let's request things based on a specific number of participants.
** Let's try to request things based on a price range (I'll give you all 3-4 minutes to try).
* Making requests in Python
** <code>import requests</code>
** <code>response = requests.get(URL, params={})</code>
** <code>print(response.status_code)</code>; the response also contains <code>.url</code>, which is pretty useful!
** <code>data = response.json()</code>; now we can check its type and poke around in it (I typically use tab completion!)
* e.g., let's work through a quick example
** Let's put it into a Python program to print out one activity for 1 through 5 people!
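That program might look like the sketch below. Since we can't always count on reaching the live server during class, this version builds the request URLs with <code>requests</code> but parses a canned JSON response; the canned activity text is made up for illustration.

```python
import json
import requests

# A canned response shaped like what the Bored API returns for one activity.
# (The activity text here is invented for illustration.)
sample_json = '{"activity": "Learn how to use an API", "type": "education", "participants": 1, "price": 0.1}'

for n in range(1, 6):
    # Build the request we *would* send; .prepare() fills in the query
    # string from params without actually contacting the server.
    request = requests.Request('GET', 'http://www.boredapi.com/api/activity',
                               params={'participants': n}).prepare()
    print(request.url)  # e.g. http://www.boredapi.com/api/activity?participants=1

    # In a live session you would do: data = requests.get(url, params=...).json()
    # Here we parse the canned response the same way .json() would.
    data = json.loads(sample_json)
    print(data['activity'], '-', data['type'])
```

In a live session, replace the <code>json.loads(...)</code> line with a real <code>requests.get(...)</code> call and check <code>response.status_code</code> before using the data.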
** Let's add the type of activity to what we print out.
** Let's add another parameter (maybe a price range?).
** Let's show how to add parameters via dictionaries.

== Introducing the OSM Nominatim API ==

We're going to spend today looking at OpenStreetMap's API, called [http://nominatim.openstreetmap.org/ Nominatim].

* Visit the website to play around with it first: let's search for "bakery".
** Let's pull up the documentation!
* These query strings have a particular form and often use keywords like ''in'' and ''near''.
** bakery in seattle; bakery in snohomish; bakery in bellevue
** Passing in [] brackets for amenities
* If we want to do it with Python, we just reproduce the URL the same way.
* Let's do it with Python!
* What if we want to have spaces? Uh oh. URLs can't have spaces...
** Instead, we can use parameters to query the API.
** If we go back to the Bored API, it turns out we can do that too.
** Let's turn the URL into a variable too!
* Understanding the output and extracting information
** Go to the formatter.
* Using the <code>bounded</code> and <code>viewbox</code> parameters to limit where we search
** Looking up a latitude/longitude pair
** Passing in viewbox data from the website

=== Details on the Nominatim API ===

Simple request:

<syntaxhighlight lang="python">
import requests

response = requests.get('http://nominatim.openstreetmap.org/',
                        params={'q': '[bakery] near seattle wa', 'format': 'json'})
print(response.status_code)  # 200 means it worked.
data = response.json()
print(type(data))
</syntaxhighlight>

'''Do this:''' Go to <code>http://nominatim.openstreetmap.org/?q=[bakery]+near+seattle&format=json</code> to see the same query in JSON format.

Let's break down each line:

* <code>import requests</code> imports the library so we can use it.
* <code>response = requests.get('http://nominatim.openstreetmap.org/', params={'q': '[bakery] near seattle wa', 'format': 'json'})</code> This is the most important line! Here, we "get" information from the web server. Note that we pass the URL up to the "?" character as the first argument. Compare the dictionary second argument to the query we did above in our browser. How do they differ? How are they the same?
* <code>print(response.status_code)</code> The response is a Python object that contains the actual contents of the web page as well as some status information. Here, we're getting the status code, which tells us whether the call succeeded. 200 is "good"; you will sometimes see 404 for "not found" or 500 for "server error".
* <code>print(response.content)</code> This wasn't above, but try it anyway. <code>response.content</code> contains the raw contents of the webpage as a string.
* <code>data = response.json()</code> <code>response.json()</code> tries to convert the string content to a Python list or dictionary, if the content was stored in JSON format. (What happens if the content wasn't JSON?)

Now let's break down the result:

<syntaxhighlight lang="json">
[
  {
    "place_id": "21583441",
    "licence": "Data © OpenStreetMap contributors, ODbL 1.0. http://www.openstreetmap.org/copyright",
    "osm_type": "node",
    "osm_id": "2131716956",
    "boundingbox": [
      "47.6248735",
      "47.6249735",
      "-122.3207478",
      "-122.3206478"
    ],
    "lat": "47.6249235",
    "lon": "-122.3206978",
    "display_name": "The Confectional, 618, Broadway East, Eastlake, Capitol Hill, Seattle, King County, Washington, 98102, United States of America",
    "class": "shop",
    "type": "bakery",
    "importance": 0.201,
    "icon": "http://nominatim.openstreetmap.org/images/mapicons/shopping_bakery.p.20.png"
  }
]
</syntaxhighlight>

Things to realize:
* We get a list containing dictionaries, and some of the values in those dictionaries are themselves lists!
* We're given latitude and longitude. It's important to be able to find these!
** Right-clicking on https://openstreetmap.org seems to work.
** Google Maps: (1) On your computer, open Google Maps. (2) Right-click the place or area on the map. This will open a pop-up window. You can find your latitude and longitude in decimal format at the top. (3) To copy the coordinates automatically, left-click on the latitude and longitude.
** This website: https://www.latlong.net/

Additional examples: Let's look at the [https://nominatim.org/release-docs/develop/api/Search/#result-limitation result limitation] section of the documentation and try a few things:

# Let's read the documentation.
# Let's write a program to ask for more bakeries than the ones we've been given.
# Let's ask for a list of bakeries that are within the University District.
# Let's plug the whole thing into Python.

== Introduce the problem set ==

== FAQ ==

; What if there's no API?: Sometimes, the only way to get data is to extract it from potentially messy HTML. This is called scraping, and Python has a library called BeautifulSoup to help with that.
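As a small taste of scraping, here is a minimal BeautifulSoup sketch. The HTML string is made up for illustration (on a real page you would first fetch the HTML, e.g. with <code>requests.get(url).text</code>); BeautifulSoup is a third-party package, installed with <code>pip install beautifulsoup4</code>.

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# A made-up HTML snippet standing in for a real (messier) fetched page.
html = """
<html><body>
  <h1>Bakeries</h1>
  <ul>
    <li class="shop">The Confectional</li>
    <li class="shop">Macrina Bakery</li>
  </ul>
</body></html>
"""

# Parse the HTML, then pull out the pieces we care about.
soup = BeautifulSoup(html, 'html.parser')
print(soup.h1.text)  # prints: Bakeries
for li in soup.find_all('li', class_='shop'):
    print(li.text)   # prints each bakery name
```

Unlike an API, nothing about this structure is promised to stay stable: if the site redesigns its HTML, your scraper breaks, which is one reason to prefer an API when one exists.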