Scrapy — pausing and resuming a spider

1. Start the spider

scrapy crawl spider -s JOBDIR=record/spider-1

record/spider-1 is a directory path; Scrapy uses it to persist the crawl state (the pending request queue, the seen-request fingerprints, and the spider's state) so the run can be resumed later.
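If you would rather not pass the flag on every run, the same setting can live in the project's settings.py, or per spider via the custom_settings dict. A minimal sketch, reusing the path from the command above:

    # settings.py -- equivalent to passing -s JOBDIR=record/spider-1
    JOBDIR = "record/spider-1"

    # or, per spider, inside the spider class:
    # custom_settings = {"JOBDIR": "record/spider-1"}

Note that a JOBDIR hard-coded in settings.py is shared by every run, so the -s form is more flexible; each concurrent job needs its own directory.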

2. Pause the spider

  • Press Ctrl+C in the terminal. The spider does not stop immediately; it needs a moment to finish in-flight requests and flush its state to the JOBDIR. Press Ctrl+C only once: a second Ctrl+C forces an unclean shutdown and the state may not be saved.
  • When the shell prompt comes back and accepts commands, the spider has been paused. What gets persisted is sketched after this list.
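On a clean shutdown, Scrapy writes the pending request queue, the duplicate-filter fingerprints, and a spider-level state dict into the JOBDIR. That last part means simple counters can survive the pause as well. A minimal sketch, assuming a spider named spider (matching the command above) and a placeholder URL:

    import scrapy

    class MySpider(scrapy.Spider):
        name = "spider"  # matches the "scrapy crawl spider" command above
        start_urls = ["https://example.com"]  # placeholder URL

        def parse(self, response):
            # self.state is serialized into JOBDIR on shutdown and restored
            # on the next run, so this counter survives a pause/resume cycle.
            self.state["pages_seen"] = self.state.get("pages_seen", 0) + 1
            yield {"url": response.url, "pages_seen": self.state["pages_seen"]}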

3. Resume the spider

scrapy crawl spider -s JOBDIR=record/spider-1
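Running the exact same command with the same JOBDIR makes Scrapy reload the saved queue and continue where it left off; to start a fresh crawl instead, point JOBDIR at a new directory. One caveat: only serializable requests are persisted, so callbacks must be methods of the spider class, not lambdas or other closures. A sketch of the distinction, with placeholder URLs:

    import scrapy

    class MySpider(scrapy.Spider):
        name = "spider"
        start_urls = ["https://example.com"]  # placeholder URL

        def parse(self, response):
            # OK: a bound-method callback can be serialized into JOBDIR.
            yield scrapy.Request(response.urljoin("/next"), callback=self.parse_next)
            # Not resumable: a lambda callback cannot be serialized, so such a
            # request would not survive the pause:
            #   scrapy.Request(url, callback=lambda r: None)

        def parse_next(self, response):
            yield {"url": response.url}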