Create Web Scraper Python




Recently I came across a tool that takes care of many of the issues you usually face while scraping websites. The tool is called Scraper API, and it provides an easy-to-use REST API to scrape different kinds of websites (simple, JS-enabled, CAPTCHA-protected, etc.) with ease. Before I proceed further, allow me to introduce Scraper API.


What is Scraper API?

If you visit their website, you'll find their mission statement:

Scraper API handles proxies, browsers, and CAPTCHAs, so you can get the HTML from any web page with a simple API call!

As the statement suggests, it offers everything you need to deal with the issues you usually come across while writing your own scrapers.

Development

Scraper API provides a REST API that can be consumed in any language. Since this post is about Python, I'll mainly focus on using the requests library with this tool.

You must first sign up with them, and in return they will provide you with an API key to use their platform. They give you 1,000 free API calls, which is enough to test the platform. Beyond that, they offer different plans, from starter to enterprise, which you can view here.

Let's try a simple example, which is also given in the documentation.


import requests

API_KEY = 'YOUR_API_KEY'                    # the key you receive after signing up
URL_TO_SCRAPE = 'https://httpbin.org/ip'    # placeholder target; use any URL you like
payload = {'api_key': API_KEY, 'url': URL_TO_SCRAPE, 'session_number': '123'}
r = requests.get('http://api.scraperapi.com', params=payload, timeout=60)
print(r.text)

And it’d produce the following result:

Can you notice the same proxy IP here? That is the session_number parameter at work: it keeps subsequent requests on the same proxy.
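To check this yourself, here is a minimal sketch, assuming https://httpbin.org/ip as the target (it simply echoes the IP the request came from); two calls that reuse the same session_number should print the same proxy address:

import requests

API_KEY = 'YOUR_API_KEY'
payload = {
    'api_key': API_KEY,
    'url': 'https://httpbin.org/ip',   # echoes the IP the request arrives from
    'session_number': '123',           # reusing the session keeps the same proxy
}

# Two calls with the same session_number should report the same proxy IP.
for _ in range(2):
    r = requests.get('http://api.scraperapi.com', params=payload, timeout=60)
    print(r.text)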

Creating an OLX Scraper

Like previous scraping-related posts, I am going to pick OLX again for this post. I will iterate over the listing first and then scrape the individual items.
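A minimal sketch of that listing-then-item flow is below; the OLX listing URL, the CSS selectors, and the small fetch helper are all assumptions for illustration rather than the selectors from the original post, so adjust them to the real markup before running:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

API_KEY = 'YOUR_API_KEY'
LISTING_URL = 'https://www.olx.com.pk/mobile-phones/'   # hypothetical listing page

def fetch(url):
    # Route every request through Scraper API so proxies and CAPTCHAs are handled for us.
    payload = {'api_key': API_KEY, 'url': url}
    r = requests.get('http://api.scraperapi.com', params=payload, timeout=60)
    r.raise_for_status()
    return r.text

def parse_listing(html):
    # Placeholder selector: collect links to individual ads from the listing page.
    soup = BeautifulSoup(html, 'html.parser')
    return [urljoin(LISTING_URL, a['href'])
            for a in soup.select('li article a') if a.get('href')]

def parse_item(html):
    # Placeholder selectors again: pull a title and a price from an item page.
    soup = BeautifulSoup(html, 'html.parser')
    title = soup.select_one('h1')
    price = soup.select_one('span[data-testid="ad-price"]')
    return {
        'title': title.get_text(strip=True) if title else None,
        'price': price.get_text(strip=True) if price else None,
    }

if __name__ == '__main__':
    links = parse_listing(fetch(LISTING_URL))
    for link in links[:5]:   # keep it small while testing
        print(parse_item(fetch(link)))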