APPLICATION - Extracting the Product Names, Links and Prices. Saving to Excel

Python 3: Automating Your Job Tasks Superhero Level: Automate Web Scraping with Python 3

Transcript

Hi, and welcome to this video. In this lecture we are going to build, analyze and test a web scraping application that will browse for and extract information about the products listed on a test page. The application will grab the product name, the URL link of that product, and also its price from the webpage, then it will use this data to build a pandas DataFrame. Finally, it will write that DataFrame to an Excel spreadsheet, thus generating a nicely formatted report or database, which is actually the goal of every web scraping script. As a side note, if you're not familiar with pandas DataFrames, please go through the data analysis section of this course first, and then you will be able to move on with building this application.

Assuming you already went through that section and you're comfortable using pandas DataFrames, let's move further and see the webpage that we are going to scrape. This is the webpage right here. By the way, I have attached the URL of this test page to this video, so you can either type in the link yourself or use the attached link. Here we have a web page listing 21 tablets. For each device we have its name, a short description, a price, a link if you hover your mouse over the name, and then we have the review score.

And finally, the number of reviews. This is how most e-commerce websites look, and we are going to use this exact page to test our application against. Next, let me show you the code for this application. This is it right here in Notepad++: we have exactly 50 lines of code, including comments and blank lines. Now let's start analyzing the code. You can either choose to type the code into your own preferred text editor, or you can copy and paste the code from the upcoming notebook.

Moreover, you're going to find the Python script itself also attached to the notebook following this video. Okay, let's get to the fun part now. First of all, as always, you have to import the necessary modules. In our case, we need to import the pandas and requests modules, as well as the BeautifulSoup class from within the bs4 module. The next four lines of code, these ones right here, are pretty easy to understand, since we've already discussed them in the previous videos of this section.
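
As a quick reference, the import section described above would look something like this (a minimal sketch; the pd alias is just a common convention, not necessarily what the attached script uses):

    import requests
    import pandas as pd
    from bs4 import BeautifulSoup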

Basically, we are passing the link of the web page to the get method, then we are loading and parsing the page content, and finally we are building a ResultSet object by searching for all the div tags that match the class passed as a dictionary value in the second argument of the find_all method right here. Now let's check the web page again, to verify that this is indeed the class that corresponds to each of the 21 products on the page. I'm going to right-click on one of these products and hit Inspect, and indeed, if we look at the div itself, it has the class col-sm-4 col-lg-4 col-md-4, which is exactly the same class I have in my code. Okay, this is correct.
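
For illustration, those four lines might look roughly like the sketch below. The URL placeholder, the variable names and the parser choice are assumptions for this example, not necessarily the exact ones in the attached script; the CSS class is the one identified in the inspector:

    url = "https://..."  # the test page URL attached to this video (not reproduced here)
    response = requests.get(url)                            # download the page
    soup = BeautifulSoup(response.content, "html.parser")   # parse the HTML
    products = soup.find_all("div", {"class": "col-sm-4 col-lg-4 col-md-4"})  # one div per product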

Next, notice that I have created three empty lists, called names, links and prices. Each of these lists will get populated with the names, the URL links and the prices of all the products on the page, respectively. To populate these lists, we have to iterate over the products variable, which is actually our ResultSet object containing the data for all 21 tablets on the page. Now let's see what each of these products looks like in HTML code, and I'll switch to IDLE and execute the code that we've discussed thus far in this video. Okay, at this point, let's extract and print out the first product from our products ResultSet. So let's use print on products.

Let's use index zero, because I said we want the first product, then .prettify() with open and close parentheses, and Enter. So as you can see, it is the Lenovo IdeaTab, at a price of $69.99. Is this the first product listed on the webpage? Let's check this. And yes, indeed it is. Great.
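
In other words, the interpreter command dictated above is simply the following (a sketch, assuming the variable name used so far):

    print(products[0].prettify())  # pretty-print the HTML of the first product div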

Now, at the beginning of this video, I said that for each product we want to extract its name, the URL link to the product, and also its price. Let's do that for the first product in the list, based on the HTML code that we're seeing on the screen right now and the things you've learned throughout this section. First, we see that the name of the device is included in this a tag right here. So to get to that tag, and also extract only the string enclosed by that tag, we can use the following code: products[0].a, the name of the tag, then .string. Okay, great. Now what about the link to this product? That would be the underlying URL of the same a tag right here.

So in order to extract the value of that attribute, all we have to do is treat the a tag as a dictionary, as we did earlier in this section, and then extract the attribute value using the attribute name as a dictionary key. Simple enough: we will use products[0].a, and now the key, 'href', which as I said is the name of the attribute up here. Let's hit Enter. Now, since this is not a valid link, we will have to prepend it with the domain name and extension of this website by simply performing string concatenation. Let me show you how to do this.

So we have "https://", then the name of the website, and then we are concatenating this string with the string above. Let's hit Enter. Okay, great. Finally, we need to extract the price of the product as well. As you can clearly see, the price is enclosed in an h4 tag, so let's go ahead and extract it using the string attribute once again, this time applied to the h4 tag. That would be products[0].h4.string, Enter.
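
Putting those interpreter steps together, the session might look roughly like this. The domain string is a placeholder for the website mentioned in the video, not the real address:

    name = products[0].a.string               # the product name enclosed by the a tag
    relative_link = products[0].a["href"]     # the relative URL stored in the href attribute
    link = "https://the-website-domain" + relative_link  # prepend the (placeholder) domain to get a full URL
    price = products[0].h4.string             # the price text inside the h4 tag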

And great job, we got the price as well. Now let's get back to our application code, and notice that the operations we just did in the interpreter are the ones performed for each product in the products ResultSet, using a basic for loop. So this is the for loop right here. Additionally, for each product in the list, we are appending the specific name, link and price information to the corresponding list. As soon as all the product information is extracted and saved to the correct lists, we have to combine the data in the three lists above, so that the name, link and price of each product get grouped together into one unique container.
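
That extraction loop, as described, might look something like the following sketch (variable names as introduced earlier; the domain is again a placeholder):

    names, links, prices = [], [], []
    for product in products:
        names.append(product.a.string)                                   # product name
        links.append("https://the-website-domain" + product.a["href"])   # full URL (placeholder domain)
        prices.append(product.h4.string)                                 # product price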

This is easily done using the zip function, as you can see I did right here. This function takes the three lists as arguments and combines the first element of list one with the first element of list two and the first element of list three into a single tuple; then it does the same for all the remaining elements and puts all the resulting tuples into a list. Let me show you what I mean by using a basic example in the interpreter. So let's assume we have list1 = [1, 2, 3], list2 = [10, 20, 30] and list3 = [100, 200, 300].

Now we want to have 1, 10 and 100 inside the same container; then we want the same thing for 2, 20 and 200, for 3, 30 and 300, and so on. For this, all we have to do is use the zip function and pass the three lists as arguments to the function. So let's try this: zip(list1, list2, list3). Now notice that we got a zip object, not a plain list. To convert this to a list, you just have to use the list function, like this.
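
Reconstructed as a quick interpreter sketch, that little experiment would be (values exactly as dictated in the lecture):

    list1 = [1, 2, 3]
    list2 = [10, 20, 30]
    list3 = [100, 200, 300]
    zip(list1, list2, list3)        # a zip object, not a plain list
    list(zip(list1, list2, list3))  # [(1, 10, 100), (2, 20, 200), (3, 30, 300)]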

So, list(zip(list1, list2, list3)), Enter, and there's the result we were looking for. Similarly, we want the same operation to be performed on the lists in our application, so that we have the name, link and price of the first product in the first tuple, then the same information for the second product in the second tuple, and so on until the products are exhausted. Now let me create the same three lists in the interpreter and then populate them and zip them together, just as we are doing inside the application code. So we have names, links and prices, and now let me just copy and paste this for loop into the interpreter.

Finally, let me copy this line as well: data = list(zip(names, links, prices)). Let's see data. Okay, so this is the list of tuples in our application, where each tuple represents a product. Notice the first tuple here, with a name, a link and a price, then the second tuple, and so on. Since we have 21 products on the page, the length of this list should be 21.

Right? So let's use len(data) to confirm this. And indeed it is. Great. Now, back to the application code once again. The next thing I did is create a pandas DataFrame, on this row right here, using the data stored as a list of tuples, also adding the column names for each of the three columns; these will get written to the Excel file as well, as column headers.
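
For reference, the DataFrame construction described above could be sketched like this; the exact column labels are an assumption based on the lecture's wording, not necessarily those used in the attached script:

    data = list(zip(names, links, prices))
    df = pd.DataFrame(data, columns=["Name", "Link", "Price"])  # the column labels become the Excel headers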

Last but not least, I'm using the to_excel method from within the pandas module to write the contents of this DataFrame to a brand new Excel file. All you have to do is enter the path and name of the file, and pandas will write the data to that file. If the file does not exist, then pandas will go ahead and create it for you, and then it will populate it with the desired information. The final thing to notice here in my code is the use of this try/except/else/finally block. Basically, Python will try to write the DataFrame to the Excel file.

If for some reason an exception gets raised, then the program will print out this string to the screen, the one under the except clause. On the other hand, if the code under the try statement gets executed successfully, then the string under the else clause gets executed as well and printed out to the screen. Finally, regardless of the execution status of the code under the try statement, the string specified under the finally clause right here gets printed. So, like it or not, this print function will be called no matter whether the try/except code above raises an exception or not. That's it. That's our application, and it's time to test it.
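
A minimal sketch of that write-and-report step might look like the following; the output file name and the except message are my own placeholders, while the else and finally messages match what the script prints later in this video:

    try:
        df.to_excel("web_data.xlsx")   # hypothetical file name; the real script writes to a folder on the D drive
    except Exception:
        print("Something went wrong while writing the data to Excel.")  # runs only if the write fails
    else:
        print("Web data successfully written to Excel.")                # runs only if the write succeeds
    finally:
        print("Quitting the program.")                                  # runs in every case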

First, let's go to the folder I have created on my D drive, to check if there are any Excel spreadsheets saved there. And there aren't. Great. Now let's open up the Windows command line and run our script. By the way, don't worry about the second script right here; we are going to discuss it in the next lecture. For now we are going to run the web scraper.py script.

So let's open up the Windows cmd, and I'm going to type in python, followed by the name of the web scraper.py script, and Enter. Okay, so we have "Web data successfully written to Excel. Quitting the program." So you should first notice the strings under the else and finally clauses being printed out to the screen.

Now let's check the folder once again. And here's the brand new Excel file. Let's open it. Okay. And indeed, we have all 21 products right here, starting at index zero and going up to index 20, along with their names, URL links and prices saved inside this file. Also notice the column headers right here, which we configured in our code.

So congratulations on building your first web scraper with Python and BeautifulSoup. I'll see you in the next lecture, where we are going to upgrade this application.
