If you have come here directly without reading the first part, I suggest you skim Part 1 so that you understand what is going on in this tutorial. You can find it here: Making a Bottle app with python tutorial part 1. If you have come here just to learn about the python-requests module, you can go ahead and read this tutorial directly.
I am writing a separate part on this module because it is easy to learn and easy to use, and because the other tutorials I have seen use requests directly without ever introducing it.
If you already know the python-requests module, you can skip this section and go straight to Tutorial part 3. If not, spare a few minutes to complete this one.
We will be using the python-requests module in this tutorial to get JSON data (weather data) from websites. You may wonder why we are using python-requests instead of the urllib2 and json standard libraries. For this tutorial you could indeed use the standard libraries, but if you want to make higher-level HTTP requests, the requests library is recommended: it is much more pleasant to work with than the standard library, and even the official Python documentation recommends using it.
If you want to know more about why you should use requests instead of the alternatives, you can find a short and clear answer written by joren and Gary here:
Python should I use urllib, urllib2 or requests.
Okay, let's get started!
Even though we will not be using the freegeoip.net JSON data in our web app, we will practice python-requests on this API. That way you will understand the requests module more easily, and you will not face any problems if you want to use this API in a future project.
Prerequisite
If you have not installed python-requests yet, head over to Installation - Requests 2.9.1 documentation. You should also have Python on your local computer ☆. That's it, you are ready to take off ☺.
First start your IDLE and then, as usual, import requests so that we can use it:
>>> import requests
>>>
Next we will send an HTTP request to get the data.
>>> r = requests.get('https://anivarth.pythonanywhere.com/myip')
>>>
Note: When you send the request to the website, it might take some time for Python to download the data onto your computer, depending on your internet speed, so please wait until the next ">>>" prompt appears.
Now let's see what happened. When you asked Python to get the data from the website, it went to the website, fetched the HTML data, and stored it in the variable r.
But you cannot read the data directly by typing:
>>> print r
Because the data is stored as a requests Response object. If you want to read the data, you will have to type the following:
>>> r.text
u'Your IP is: 4.4.4.4\n'
>>>
Now head over to the website https://anivarth.pythonanywhere.com/myip in your browser. You will find something like this, based on your IP address:
Your IP is: 4.4.4.4
If you compare, you will find the same text in the Python IDLE. The 'u' in front of the text tells us that the string is unicode. Don't worry if you are not familiar with string encodings; you can safely skip this detail.
Actually the data is delivered as HTML, but in this case the HTML is the same as the plain text we see in the browser, so we get that output.
Note: The number 4.4.4.4 given in the example is my IP and will be different for you. You will get a different value, so don't panic when you see this.
Now let's see how to use JSON data. For this example we will be using the freegeoip.net API from here: freegeoip.net/json. If you open this link, you will find your IP address, country code, country name and some other details. Now we will use the requests module to fetch this data. First we will send a request to the website (don't forget to import the requests module if you haven't!):
>>> r = requests.get('https://freegeoip.net/json')
>>>
Here also the HTML is the same as the one displayed in the browser. As the data we have got this time is in the JSON format, we will type the following, and the output should be similar to this:
>>> r.json()
{u'ip': u'4.4.4.4', u'country_code': u'US', u'country_name': u'United States', u'region_code': u'CA', u'region_name': u'California', u'city': u'Los Angeles', u'zip_code': u'90067', u'time_zone': u'America/Los_Angeles', u'latitude': 34.0565, u'longitude': -118.4136, u'metro_code': 803}
>>>
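Under the hood, r.json() simply parses the response body with a JSON decoder. Here is a minimal sketch of what it does, using the standard json module and a made-up sample payload (the values are illustrative, not a real lookup):

```python
import json

# A made-up sample of the kind of body freegeoip.net returns
raw = '{"ip": "4.4.4.4", "country_code": "US", "country_name": "United States"}'

# r.json() essentially does this for you: decode the text into a Python dict
data = json.loads(raw)
print(data['country_name'])  # United States
```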
Depending on your IP address, you will get the respective data.
As the data is not in a readable format, we will have to view it in a neater form before using it. First we will store the data in a variable so that we can work with it easily. Then we will use pprint (the data pretty printer) to print the data neatly. Finally we will find out what type the data is stored as, so that we know which operations we can apply. Now see the code:
>>> t = r.json()
>>> from pprint import pprint
>>> pprint(t)
{u'city': u'Las Vegas',
u'country_code': u'US',
u'country_name': u'United States',
u'ip': u'4.4.4.4',
u'latitude': 36.175,
u'longitude': -115.1372,
u'metro_code': 839,
u'region_code': u'NV',
u'region_name': u'Nevada',
u'time_zone': u'America/Los_Angeles',
u'zip_code': u'89126'}
>>> type(t)
<type 'dict'>
>>>
As the variable is a Python dictionary, we can reference the data as key-value pairs.
Let's try using the data. Once you have typed the above code, type the following:
>>> t['city']
u'Las Vegas'
>>> t['latitude']
36.175
>>> int(t['latitude'])+55
91
>>>
The output you get will change depending on your IP address.
Now, as you can see, we can perform any operation on this data: we can use it in an if statement or a while loop, or even slice the strings, depending on how we want to use the data.
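For instance, using a made-up copy of the dictionary above (the values are illustrative, not a real lookup), those operations could look like this:

```python
# Made-up sample of the dict returned by the API
t = {'city': 'Las Vegas', 'country_code': 'US', 'latitude': 36.175}

# Use the data in an if statement
if t['country_code'] == 'US':
    label = 'domestic'
else:
    label = 'international'

# Slice a string value
short_city = t['city'][:3]

print(label, short_city)  # domestic Las
```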
Next, I want to explain how to send a request for URLs that look something like this:
http://example.com/data?q=nevada&ans=usa
A quick answer would be something like this:
>>> r = requests.get('http://example.com/data?q=nevada&ans=usa')
>>>
But what if I give you a bunch of URLs (say some 500) which are similar to the one above, but where the values after q and ans change for each URL?
To accomplish this, requests gives us a simple solution. See the following code:
>>> payload = {'q':'nevada','ans':'usa'}
>>> r = requests.get('http://example.com/data',params=payload)
>>>
By using this method we can create URLs like the one given above. If you don't believe me, you can see the URL that requests has built as follows:
>>> r.url
u'http://example.com/data?q=nevada&ans=usa'
>>>
If you still don't believe me, try changing the values in the payload variable and you will find that the URL changes too. You can also add any number of items to the dictionary; it is not limited to two. A final example to demonstrate this:
>>> country = 'india'
>>> pi = 3.14
>>> items = {'key1':'value1','c':country,'pi':pi}
>>> r = requests.get('http://httpbin.org/data',params=items)
>>> r.url
u'http://httpbin.org/data?key1=value1&pi=3.14&c=india'
>>>
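If you are curious how requests turns the params dictionary into a query string, the standard library does essentially the same job with urlencode (in Python 3 it lives in urllib.parse; in Python 2 it was in urllib). A rough sketch of the idea, not the exact code requests runs:

```python
from urllib.parse import urlencode

items = {'key1': 'value1', 'c': 'india', 'pi': 3.14}
# Encode the dict into a query string, e.g. 'key1=value1&c=india&pi=3.14'
# (the order of the keys may vary)
query = urlencode(items)
url = 'http://httpbin.org/data?' + query
print(url)
```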
It should be noted that we will be using this (the above code) in our upcoming tutorials for building the weather app (Weave).
We are all familiar with "404 page not found". Here 404 is the status code. Similar to 404, we have 200, which tells us that the operation was successful, 500, which means an internal server error, and so on. This data is really important in many cases. To get the status code we can do this:
>>> r = requests.get('http://httpbin.org')
>>> r.status_code
200
>>> r = requests.get('http://httpbin.org/404')
>>> r.status_code
404
>>>
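Status codes fall into families by their first digit (2xx success, 3xx redirect, 4xx client error, 5xx server error), so you can branch on them without memorizing every code. A small illustrative helper (the category names are my own wording, not part of requests):

```python
def describe_status(code):
    """Map an HTTP status code to a rough category (illustrative only)."""
    if 200 <= code < 300:
        return 'success'
    if 300 <= code < 400:
        return 'redirect'
    if 400 <= code < 500:
        return 'client error'   # includes 404 Not Found
    if 500 <= code < 600:
        return 'server error'   # includes 500 Internal Server Error
    return 'other'

print(describe_status(200))  # success
print(describe_status(404))  # client error
```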
If you want to know more about status codes, you can read about them here: Status codes in HTTP
We will end this post by covering some more important things we can do with python-requests. If you are here only to develop the web app, you can skip the rest of this post, but I seriously recommend reading it, as it might be helpful to you in the future.
If you want to check the encoding of the data you have received, then you can do the following:
>>> r = requests.get('http://example.com')
>>> r.encoding
'ISO-8859-1'
>>> r = requests.get('http://httpbin.org')
>>> r.encoding
'utf-8'
>>>
As we have seen, different websites deliver different types of data: some deliver JSON, some deliver HTML, and so on. Sometimes this information is also required. To get it we can do the following:
>>> r = requests.get('http://example.com')
>>> r.headers['Content-Type']
'text/html'
>>> r.headers.get('content-type')
'text/html'
>>>
You can use either form, whichever is more convenient; the header lookup is case-insensitive. You may be wondering why we are using .headers. When we access 'r.headers', we get the headers that the server sent us along with the response, such as the content encoding, the type of server and so on. We can see all of them by typing the following:
>>> r.headers
{'Content-Length': '606', 'x-ec-custom-error': '1', 'X-Cache': 'HIT', 'Content-Encoding': 'gzip', 'Expires': 'Sun, 28 Feb 2016 18:24:23 GMT', 'Vary': 'Accept-Encoding', 'Server': 'ECS (cpm/F9D5)', 'Last-Modified': 'Fri, 09 Aug 2013 23:54:35 GMT', 'Etag': '"359670651+gzip"', 'Cache-Control': 'max-age=604800', 'Date': 'Sun, 21 Feb 2016 18:24:23 GMT', 'Content-Type': 'text/html'}
>>>
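The reason both r.headers['Content-Type'] and r.headers.get('content-type') work is that requests stores headers in a case-insensitive dictionary. Here is a toy sketch of that idea (not the real requests implementation):

```python
class CaseInsensitiveHeaders:
    """Toy sketch of a case-insensitive header store (illustration only)."""

    def __init__(self, headers):
        # Normalize all keys to lower case on the way in
        self._store = {k.lower(): v for k, v in headers.items()}

    def __getitem__(self, key):
        return self._store[key.lower()]

    def get(self, key, default=None):
        return self._store.get(key.lower(), default)

h = CaseInsensitiveHeaders({'Content-Type': 'text/html', 'Server': 'ECS'})
print(h['content-type'])   # text/html
print(h.get('SERVER'))     # ECS
```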
If you visit some websites, you may have observed that they redirect you to some other website or URL. If you use python-requests, you will still get the status code 200, and because of this you may sometimes end up with the wrong data. To check for this, requests has a solution. For example:
>>> r = requests.get('http://github.com')
>>> r.url
u'https://github.com/'
>>> r.status_code
200
>>> r.history
[<Response [301]>]
>>>
From the above demonstration we can see that we have been redirected from 'http://github.com' to 'https://github.com/'. Even though the status code for a redirect is 301, it didn't appear when you typed 'r.status_code', because 'r.status_code' only returns the status code of the final response. For this reason we have to look at 'r.history' to get the status codes of the requests made along the way, starting from the original URL ('http://github.com').
Congrats, you have completed this introduction to the python-requests module. I have covered all the important things a beginner needs, but if you want to know more about python-requests, you can have a look at the documentation or the quickstart.
I have tried to explain everything in this post in a way that is easy for everyone to understand. If you have any doubts or didn't understand something, please contact me or leave a comment in the comment box below; I will answer your question as soon as possible. You can contact me here: Contact me.
Finally, please do comment on how I can improve this post so that everyone feels comfortable reading it, and tell me if I have missed a topic, made a typo, or anything else. Thank you, and see you in the next post ☺!
In this post the code has been highlighted using: tohtml.com