
Scraping-bike-specifications-using-Scrapy

The goal is to scrape content from a website using Scrapy and save the scraped data to a CSV file. This project contains a Python script that scrapes bike specifications from the carandbike website.

  • You can find the Python script at scrape_bikes/spiders/bike.py (a rough sketch of such a spider is shown below).
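
For orientation, a Scrapy spider like the one in bike.py generally follows the shape below. This is a minimal sketch only: the start URL and CSS selectors are illustrative assumptions, not the actual contents of bike.py (only the spider name "bike" is taken from the commands later in this README).

```python
import scrapy


class BikeSpider(scrapy.Spider):
    # Hypothetical sketch: the start URL and selectors below are assumptions,
    # not copied from scrape_bikes/spiders/bike.py.
    name = "bike"  # the spider name used by "scrapy crawl bike"
    start_urls = ["https://www.carandbike.com/new-bikes"]  # assumed listing page

    def parse(self, response):
        # Yield one dict per bike on the page; Scrapy's feed exporters
        # turn these dicts into CSV or JSON rows.
        for bike in response.css("div.bike-listing"):  # assumed selector
            yield {
                "name": bike.css("h3::text").get(),
                "price": bike.css("span.price::text").get(),
            }
```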

PREREQUISITES

1. Anaconda Navigator

Installation: Anaconda

2. Scrapy

Scrapy documentation: Scrapy

  • Installing Scrapy using the command prompt/terminal: pip install scrapy
  • Installing Scrapy using Anaconda's cmd.exe prompt: conda install -c conda-forge scrapy pylint autopep8 -y
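
To confirm the installation succeeded, you can run scrapy version from the same prompt; it prints the installed Scrapy version.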

3. Visual Studio Code (to make any changes to the script)

Installation: VSCode.

4. Python

Documentation: link

HOW TO USE

  1. Download the whole project and save it in a folder.
  2. Open the cmd.exe prompt from Anaconda and navigate to the project folder (the directory containing scrapy.cfg).
  3. To run the spider, use this command: scrapy crawl bike (see the sketch after this list for launching the same crawl from a Python script).
  4. To save the data in CSV format: scrapy crawl bike -o bike_dataset.csv
  5. To save the data in JSON format: scrapy crawl bike -o bike_dataset.json
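
The same crawl can also be launched from a Python script rather than the command line. The snippet below is a minimal sketch, assuming Scrapy 2.1 or later (for the FEEDS setting) and that it is run from the project root; it is not part of this repository.

```python
# Run the "bike" spider programmatically and write a CSV feed,
# the equivalent of: scrapy crawl bike -o bike_dataset.csv
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()                    # reads the project's settings.py
settings.set("FEEDS", {"bike_dataset.csv": {"format": "csv"}})  # CSV feed export

process = CrawlerProcess(settings)
process.crawl("bike")   # spider name, as declared in scrape_bikes/spiders/bike.py
process.start()         # blocks until the crawl finishes
```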
