How to Install Scrapy in Python
A high-level Web Crawling and Web Scraping framework
pip install Scrapy
What is Scrapy?
Scrapy is a high-level web crawling and web scraping framework for extracting structured data from websites. It is cross-platform and requires Python 3.10+. It is maintained by Zyte (formerly Scrapinghub) and many other contributors.
See the documentation to learn how to use it.
If you wish to contribute, see Contributing.
Quick Start
Minimal example to confirm Scrapy is importable:
import scrapy
print(scrapy.__version__)
Installation
pip (standard)
pip install Scrapy
Virtual environment (recommended)
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
pip install Scrapy
pip3
pip3 install Scrapy
conda
conda install -c conda-forge Scrapy
Poetry
poetry add Scrapy
Dependencies
Installing Scrapy also pulls in its required dependencies, including Twisted, lxml, parsel, and w3lib. Run pip show Scrapy after installation to see the exact list for your version.
Verify the Installation
After installing, confirm the package is available:
python -c "import scrapy; print(scrapy.__version__)"
If this prints a version number, installation succeeded. If you see a ModuleNotFoundError, see the errors section below.
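You can also check the installed version without importing the package, using only the standard library:

```python
from importlib import metadata

try:
    # Queries installed package metadata; does not import scrapy itself
    print(f"Scrapy {metadata.version('Scrapy')} is installed")
except metadata.PackageNotFoundError:
    print("Scrapy is not installed in this environment")
```

This is useful in scripts that should report a missing dependency instead of crashing with a traceback.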
Installation Errors
Common errors when installing Scrapy with pip.
ModuleNotFoundError: No module named 'scrapy'
Cause: The package is not installed in the current Python environment.
Fix: Run pip install Scrapy. If using a virtual environment, ensure it is activated first.
ModuleNotFoundError: No module named 'scrapy' (installed but still failing)
Cause: pip installed the package into a different Python than the one running your script.
Fix: Use python -m pip install Scrapy to install into the interpreter you are running.
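To see which interpreter is actually running your script (and therefore which environment `python -m pip` installs into), print `sys.executable`:

```python
import sys

# The interpreter running this script; `python -m pip install Scrapy`
# installs into exactly this interpreter's environment.
print(sys.executable)
```

If this path differs from where `pip` installed the package (compare with the Location line from `pip show Scrapy`), you have found the mismatch.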
ImportError: cannot import name 'X' from 'scrapy'
Cause: The function or class does not exist in the installed version.
Fix: Check the version with pip show Scrapy and upgrade with pip install --upgrade Scrapy.
pip: command not found
Cause: pip is not in PATH or Python was not added to PATH during installation.
Fix: Try python -m pip install Scrapy. On macOS/Linux try pip3.
PermissionError: [Errno 13] Permission denied
Cause: No write access to the system Python package directory.
Fix: Use a virtual environment, or add --user: pip install --user Scrapy
SSL: CERTIFICATE_VERIFY_FAILED
Cause: pip cannot verify PyPI's SSL certificate — common behind corporate proxies.
Fix: Try: pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org Scrapy (this bypasses certificate checks for those hosts, so use it only on a network you trust).
ConnectionError: Failed to establish a new connection
Cause: Server unreachable, URL invalid, or firewall/proxy blocking the connection.
Fix: Verify the URL and network access. Set HTTP_PROXY / HTTPS_PROXY env vars if behind a proxy.
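pip honors the standard proxy environment variables. A sketch, assuming a hypothetical proxy at proxy.example.com:8080 (replace with your organization's actual host and port):

```shell
# Hypothetical proxy address; replace with your real proxy host and port
export HTTP_PROXY="http://proxy.example.com:8080"
export HTTPS_PROXY="http://proxy.example.com:8080"

# pip will now route its requests through the proxy
pip install Scrapy
```

If the proxy requires credentials, they go in the URL (http://user:password@proxy.example.com:8080), though a credential manager is preferable to plain-text secrets in shell profiles.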
SSLError: CERTIFICATE_VERIFY_FAILED (while crawling a target site)
Cause: The remote server's SSL certificate cannot be verified.
Fix: Update CA certificates on your system. For testing only, disable SSL verification (never in production).
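To see where your Python looks for CA certificates by default (useful when deciding which certificate store to update), the standard library can report its search paths:

```python
import ssl

# Default locations Python consults for trusted CA certificates
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)
```

On many Linux systems `cafile` points into the distribution's ca-certificates bundle, which is what the system package manager updates.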
Recent Releases
| Version | Released |
|---|---|
| 2.15.0 (latest) | 2026-04-09 |
| 2.14.2 | 2026-03-12 |
| 2.14.1 | 2026-01-12 |
| 2.14.0 | 2026-01-05 |
| 2.13.4 | 2025-11-17 |
Manage Scrapy
Upgrade to latest version
pip install --upgrade Scrapy
Install a specific version
pip install Scrapy==2.15.0
Uninstall
pip uninstall Scrapy
Check what is installed
pip show Scrapy
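To check the installed version programmatically, for example in a setup script, a minimal sketch using only the standard library (the 2.13 threshold is an arbitrary example, not a Scrapy requirement):

```python
from importlib import metadata

def version_tuple(v: str) -> tuple:
    # Naive parser, adequate for plain release versions like "2.15.0"
    return tuple(int(part) for part in v.split(".") if part.isdigit())

try:
    installed = metadata.version("Scrapy")
    if version_tuple(installed) < (2, 13):  # arbitrary example threshold
        print(f"Scrapy {installed} is older than 2.13; run: pip install --upgrade Scrapy")
    else:
        print(f"Scrapy {installed} meets the example threshold")
except metadata.PackageNotFoundError:
    print("Scrapy is not installed")
```

Tuple comparison handles multi-digit components correctly (e.g. 2.15.0 > 2.9.0), which naive string comparison does not.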