All Code Working December 2025 🟢
What would you do if you woke up one day with the ability to scrape anything from Amazon?
You better have an answer because that day is today.
A comprehensive Amazon scraping toolkit powered by Scrape.do. Extract product data, search results, reviews, seller offers, and more from Amazon with ready-to-use scripts in Python, Node.js, and cURL.
Note: The Python and Node.js scripts can work without Scrape.do. If you have your own headless browser setup with rotating proxies and proper headers, simply replace the API URL with your direct Amazon request. The parsing logic will work the same way.
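The swap described in the note can be sketched as one small helper: wrap the target URL in a Scrape.do API call when a token is supplied, otherwise return the raw Amazon URL for a direct request through your own proxy setup. The helper name `request_url` and the `token`/`url` query-string format are illustrative assumptions, not code from the repository's scripts.

```python
# Minimal sketch of the swap described above (helper name and query
# format are illustrative assumptions, not taken from the repo scripts).
from urllib.parse import quote_plus

def request_url(target: str, token: str = "") -> str:
    """Return the URL to fetch: wrapped through Scrape.do when a token
    is supplied, otherwise the raw Amazon URL for a direct request."""
    if token:
        return f"https://api.scrape.do?token={token}&url={quote_plus(target)}"
    # Direct mode: pair this with your own rotating proxies and
    # browser-like headers before sending the request.
    return target
```

Feed the result to `requests.get` (adding your own headers and proxies in direct mode); the downstream parsing logic is identical either way.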
| Scraper | Description | Output |
|---|---|---|
| searchResults | Scrape product listings from search queries | CSV |
| singleProduct | Extract detailed product information by ASIN | CSV |
| productVariations | Recursively scrape all color/size variations | CSV |
| reviews | Extract customer reviews for any product | CSV |
| bestSellers | Scrape best seller rankings by category | CSV |
| sellerOffers | Get all seller listings for a product | CSV |
| sponsoredProducts | Extract sponsored ads from search results | JSON |
| relatedSearches | Get related search suggestions | Terminal |
For detailed step-by-step guides on how each scraper works:
- How to Scrape Amazon Product Pages - PDP, seller offers, and variations
- How to Scrape Amazon Search Results - Search results, related searches, and sponsored products
- How to Scrape Amazon Best Sellers - Best seller rankings by category
- How to Scrape Amazon Reviews - Product reviews (includes technical guidelines for authenticated scraping, though large-scale or commercial use behind login is prohibited)
- Best Amazon Scraper APIs Compared - Performance benchmarks of top Amazon scraping APIs
```
cd python
pip install -r ../requirements.txt
python searchResults.py
```

```
cd node.js
npm install
node searchResults.js
```

```
cd "cURL(ready-API)"
bash search.sh
```

Each script has a configuration section at the top. Update the token and parameters as needed:
```python
# Python
token = "<SDO-token>"
asin = "B07TCJS1NS"  # Change this to any product ASIN
geocode = "us"
zipcode = "10001"
```

```javascript
// Node.js
const token = "<SDO-token>";
const asin = "B07TCJS1NS"; // Change this to any product ASIN
const geocode = "us";
const zipcode = "10001";
```

The `cURL(ready-API)` folder contains scripts that use Scrape.do's structured Amazon API endpoints:
```
GET https://api.scrape.do/plugin/amazon/pdp
    ?token=<SDO-token>
    &asin=B0C7BKZ883
    &geocode=us
    &zipcode=10001

GET https://api.scrape.do/plugin/amazon/search
    ?token=<SDO-token>
    &keyword=laptop%20stands
    &geocode=us
    &zipcode=10001
    &page=1

GET https://api.scrape.do/plugin/amazon/offer-listing
    ?token=<SDO-token>
    &asin=B0DGJ7HYG1
    &geocode=us
    &zipcode=10001
```
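The three endpoints above differ only in their path and query parameters, so building the request URLs reduces to one small helper. `plugin_url` is a hypothetical name for illustration; only the base URL and parameter names are taken from the listing above.

```python
# Hypothetical helper assembling the structured-endpoint URLs listed
# above; the base URL and parameter names come from the examples.
from urllib.parse import urlencode

BASE = "https://api.scrape.do/plugin/amazon"

def plugin_url(endpoint: str, token: str, **params: str) -> str:
    """endpoint is one of 'pdp', 'search', or 'offer-listing'."""
    return f"{BASE}/{endpoint}?" + urlencode({"token": token, **params})
```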
```
pip install requests beautifulsoup4
npm install axios cheerio csv-writer
```

```
python python/searchResults.py
# Output: searchResults.csv

node node.js/reviews.js
# Output: reviews.csv

bash "cURL(ready-API)/pdp.sh"
# Output: output/SDOpdp.json
```

Sample outputs can be found in the `output` folder.
MIT - see LICENSE for details.