I'm running scrapyd and ScrapyKeeper. My spiders are set up as periodic jobs. Each time they run, I get the error below.
I've inspected my environment variables and there is no SCRAPY_ variable among them.
Any idea what is setting this SCRAPY_JOB variable?
INFO: Scrapy 2.1.0 started (bot: mybot)
CRITICAL: Unhandled error in Deferred:
CRITICAL:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/crawler.py", line 86, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/crawler.py", line 98, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/spiders/__init__.py", line 49, in from_crawler
    spider = cls(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument '_job'
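For context, here is a minimal sketch of what I believe is going on (class names are hypothetical, and scrapy itself isn't needed to reproduce it): scrapyd passes a _job keyword argument when it schedules a spider, so a spider whose __init__ overrides the default signature without accepting extra keyword arguments raises exactly this TypeError.

```python
# Hypothetical minimal reproduction, no scrapy required.
# scrapyd schedules spiders with an extra _job=<job id> keyword argument.

class BrokenSpider:
    # No **kwargs in the signature, so an unexpected _job kwarg raises TypeError.
    def __init__(self, category):
        self.category = category

class FixedSpider:
    # **kwargs absorbs _job (and any other scheduler-supplied keywords).
    def __init__(self, category, **kwargs):
        self.category = category
        self.kwargs = kwargs

try:
    BrokenSpider(category="books", _job="2024-01-01T00_00_00")
except TypeError as e:
    print(e)  # unexpected keyword argument '_job'

spider = FixedSpider(category="books", _job="2024-01-01T00_00_00")
print(spider.kwargs)
```

If that assumption is right, forwarding **kwargs through __init__ (and to super().__init__) would silence the crash, but it still doesn't explain which component exports the SCRAPY_JOB variable.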