Exporting Scrapy items to JSON produces a portable snapshot of crawl results that can be archived, compared between runs, or imported into analysis and reporting workflows without re-crawling the source.
Scrapy’s feed exporter serializes each item as the spider yields it and writes the result to the chosen output destination. Using a target filename ending in .json selects the JSON feed exporter automatically and writes a JSON array to the file.
The JSON array is only fully valid once the crawl completes and the closing bracket is written, so interrupted crawls can leave a truncated export that fails to parse. Large exports can also be slow to open in editors and memory-heavy to load, so keeping item fields consistent and minimal improves reliability on long crawls.
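The same export can be configured persistently instead of on the command line. As a sketch, the FEEDS setting in the project's settings.py can declare the output file and format; the filename products.json matches the example used in the steps below:

```python
# settings.py (sketch): equivalent to passing -O products.json on the
# command line. The "overwrite" key makes each crawl replace the file,
# matching -O rather than the appending -o behavior.
FEEDS = {
    "products.json": {
        "format": "json",
        "encoding": "utf-8",
        "overwrite": True,
    },
}
```

With this in place, a plain `scrapy crawl catalog` writes the feed without any export flags.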
Related: How to export Scrapy items to CSV
Related: How to enable item pipelines in Scrapy
Steps to export Scrapy items to JSON:
- Open a terminal in the Scrapy project directory.
$ cd /root/sg-work/catalog_demo
- Run the spider with the -O feed export option to write items to a .json file.
$ scrapy crawl catalog -O products.json
2026-01-01 09:39:01 [scrapy.extensions.feedexport] INFO: Stored json feed (6 items) in: products.json
The -O option overwrites products.json if it already exists.
The -o option appends to an existing file instead, which suits line-based formats such as jsonlines; appending a second crawl to a .json array leaves a file that no longer parses as a single JSON document.
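Because a jsonlines feed stores one JSON object per line, appended or truncated output stays usable record by record. A minimal sketch of reading such a feed, using inline sample data in place of a real products.jsonl file:

```python
import json

# Sample of what a jsonlines feed (e.g. scrapy crawl catalog -o products.jsonl)
# might contain: one independent JSON object per line.
sample = "\n".join([
    json.dumps({"name": "Starter Plan", "price": "$29"}),
    json.dumps({"name": "Team Plan", "price": "$79"}),
])

# Each line parses on its own, so a truncated file loses at most its
# final line, unlike a truncated JSON array which fails entirely.
items = [json.loads(line) for line in sample.splitlines() if line.strip()]
print(len(items))  # 2
```

This per-line independence is why appending with -o is safe for jsonlines but not for the array-based json format.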
- Inspect the exported file to confirm the JSON array contains items.
$ head -n 8 products.json
[
{"name": "Starter Plan", "price": "$29", "url": "http://app.internal.example:8000/products/starter-plan.html"},
{"name": "Team Plan", "price": "$79", "url": "http://app.internal.example:8000/products/team-plan.html"},
{"name": "Enterprise Plan", "price": "$199", "url": "http://app.internal.example:8000/products/enterprise-plan.html"},
{"name": "Growth Plan", "price": "$129", "url": "http://app.internal.example:8000/products/growth-plan.html"},
{"name": "Agency Plan", "price": "$249", "url": "http://app.internal.example:8000/products/agency-plan.html"},
{"name": "Platform Plan", "price": "$499", "url": "http://app.internal.example:8000/products/platform-plan.html"}
]
The closing ] is written at crawl end; the file may appear incomplete mid-run.
- Validate the exported file parses as JSON.
$ python3 -c "import json; f=open('products.json', encoding='utf-8'); json.load(f); f.close(); print('JSON OK')"
JSON OK
Mohd Shakir Zakaria is a cloud architect with deep roots in software development and open-source advocacy. Certified in AWS, Red Hat, VMware, ITIL, and Linux, he specializes in designing and managing robust cloud and on-premises infrastructures.
