I have 2 PCs. On PC A there are no errors and the spider deploys successfully, but on PC B this error occurs.

My Scrapyd server is running, but when I try to deploy my spider, I get the error below.

{"status":"error","message":回溯(最近通话最后一个):\ n文件\ "/usr/lib/python2.7/runpy.py",线162,在_run_module_as_main \ n \” main \ ", fname, loader, pkg_name)\n File " /usr/lib/python2.7/runpy.py \ ", line 72, in _run_code\n exec code in run_globals\n File "在/ usr / local / lib目录/ python2.7 / DIST-包/ scrapyd / runner.py \ ", line 40, in \n main()\n File " /usr/local/lib/python2.7/dist-packages/scrapyd/runner.py \ ", line 37, in main\n execute()\n File "在/ usr / local / lib目录/ python2 . 7 / dist-packages / scrapy / cmdline.py \ ", line 148, in execute\n cmd.crawler_process = CrawlerProcess(settings)\n File " /usr/local/lib/python2.7/dist-packages/scrapy/crawler.py \“,第243行,在 init \ n super(CrawlerProcess,self)中 . init (设置)\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py",线134,在 init \ n self.spider_loader = _get_spider_loader(设置)\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py",线330,在_get_spider_loader \ n返回loader_cls.from_settings(settings.frozencopy())\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py",线61,在from_settings \ n返回CLS(设置)\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py" 25行,在 init \ n self._load_all_spiders()\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py" 47行,在_load_all_spiders \ n,用于模块在walk_modules(name)中:\ n文件\ "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py",第71行,在walk_modules \ n \ submod = import_module(fullpath)\ n文件\“/ usr / lib / python2.7 / importlib / init .py \”,第37行,在import_module \ n import (名)\ n文件\ "spiderman/spiders/scraper.py" 16行,在\ n邮件=输入('Email : ')\ nEOFError:EOF读取线\ n时", " NODE_NAME ": " MY PC“}