Knowledge points

    Traceback (most recent call last):
      File "/usr/bin/scrapy", line 7, in <module>
        from scrapy.cmdline import execute
      File "/usr/lib64/python2.7/site-packages/scrapy/cmdline.py", line 9, in <module>
        from scrapy.crawler import CrawlerProcess
      File "/usr/lib64/python2.7/site-packages/scrapy/crawler.py", line 7, in <module>
        from twisted.internet import reactor, defer
      File "/usr/lib64/python2.7/site-packages/twisted/internet/reactor.py", line 38, in <module>
        from twisted.internet import default
      File "/usr/lib64/python2.7/site-packages/twisted/internet/default.py", line 56, in <module>
        install = _getInstallFunction(platform)
      File "/usr/lib64/python2.7/site-packages/twisted/internet/default.py", line 44, in _getInstallFunction
        from twisted.internet.epollreactor import install
      File "/usr/lib64/python2.7/site-packages/twisted/internet/epollreactor.py", line 24, in <module>
        from twisted.internet import posixbase
      File "/usr/lib64/python2.7/site-packages/twisted/internet/posixbase.py", line 18, in <module>
        from twisted.internet import error, udp, tcp
      File "/usr/lib64/python2.7/site-packages/twisted/internet/tcp.py", line 28, in <module>
        from twisted.internet._newtls import (
      File "/usr/lib64/python2.7/site-packages/twisted/internet/_newtls.py", line 21, in <module>
        from twisted.protocols.tls import TLSMemoryBIOFactory, TLSMemoryBIOProtocol
      File "/usr/lib64/python2.7/site-packages/twisted/protocols/tls.py", line 63, in <module>
        from twisted.internet._sslverify import _setAcceptableProtocols
      File "/usr/lib64/python2.7/site-packages/twisted/internet/_sslverify.py", line 38, in <module>
        TLSVersion.TLSv1_1: SSL.OP_NO_TLSv1_1,
    AttributeError: 'module' object has no attribute 'OP_NO_TLSv1_1'

Check whether the versions of pyOpenSSL, Twisted and the other dependencies that pip install scrapy requires match the ones you actually have installed. In my case the problem was that the installed Twisted version was newer than the one required.

Downgrading with pip install twisted==13.1.0 made it work.
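
To compare quickly, here is a minimal Python sketch (assuming setuptools/pkg_resources is available) that prints the installed versions of the packages involved; the package names listed are the usual PyPI names:

    # Print the installed versions of the packages involved in the
    # OP_NO_TLSv1_1 error so they can be compared with what Scrapy needs.
    import pkg_resources

    for name in ("Scrapy", "Twisted", "pyOpenSSL", "cryptography"):
        try:
            version = pkg_resources.get_distribution(name).version
        except pkg_resources.DistributionNotFound:
            version = "not installed"
        print("%s: %s" % (name, version))

The error itself means the installed pyOpenSSL does not define SSL.OP_NO_TLSv1_1, so either upgrading pyOpenSSL or pinning Twisted to an older release (as above) removes the mismatch.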

Recursively deleting all .exe files under a directory on Linux

Today, while deploying the school's anniversary website, I found that the website folder the developer sent me contained a lot of .exe files, all named after the original directory names; it looked like his machine had been infected by a virus. Even though .exe files cannot run on Linux, they still had to be removed.

After a bit of searching, the article "Recursively delete a folder or file on Linux" gave me a satisfying method that deletes all .exe files under a directory and its subdirectories in one go.

    find . -name '*.exe' -type f -print -exec rm -rf {} \;

(1) "." means the recursive search starts from the current directory.

(2) "-name '*.exe'" matches by name: every file or directory whose name ends in .exe.

(3) "-type f" restricts the matches to regular files.

(4) "-print" prints the path of every match.

(5) The most important option is -exec: the command that follows it is run on every file or directory that find turns up. -exec is followed by the command or script to execute, then a pair of braces {}, a space and a backslash, and finally a semicolon.

Example

Delete all .pyc files:

    find . -name '*.pyc' -exec rm -rf {} \;
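
If the same cleanup needs to be done from Python instead of the shell, a small sketch with os.walk (a rough equivalent of the find command above, not taken from the article) looks like this:

    # Recursively delete every *.exe file under the current directory.
    # Rough equivalent of: find . -name '*.exe' -type f -print -exec rm -f {} \;
    import os

    for dirpath, dirnames, filenames in os.walk("."):
        for filename in filenames:
            if filename.endswith(".exe"):
                path = os.path.join(dirpath, filename)
                print(path)      # same role as find's -print
                os.remove(path)  # delete the matched file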

Specifying a version number for Scrapyd

By default, scrapyd-deploy uses the current timestamp as the version number; a specific version can be set with --version:

    scrapyd-deploy <target> -p <project> --version <version>

The version number must be in a format that LooseVersion can parse. Acceptable examples (a short parsing sketch follows the list):

  1. 1.5.1
  2. 1.5.2b2
  3. 161
  4. 3.10a
  5. 8.02
  6. 3.4j
  7. 1996.07.12
  8. 3.2.pl0
  9. 3.1.1.6
  10. 2g6
  11. 11g
  12. 0.960923
  13. 2.2beta29
  14. 1.13++
  15. 5.5.kw
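
To see how LooseVersion parses and orders such strings, here is a short sketch using distutils.version.LooseVersion; the specific strings compared are only illustrative:

    # LooseVersion splits a version string into numeric and alphabetic
    # components and compares them piece by piece.
    from distutils.version import LooseVersion

    print(LooseVersion("1.5.1").version)      # [1, 5, 1]
    print(LooseVersion("2.2beta29").version)  # [2, 2, 'beta', 29]
    print(LooseVersion("1.5.2b2") > LooseVersion("1.5.1"))  # True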

For example, setting the version number to 0.1:

    scrapyd-deploy tutorial_deploy -p tutorial --version 0.1
    Packing version 0.1
    Deploying to project "tutorial" in http://192.168.17.129:6800/addversion.json
    Server response (200):
    {"status": "ok", "project": "tutorial", "version": "0.1", "spiders": 2, "node_name": "ubuntu"}

If the code is managed with Git, GIT can be passed as the value of version, or written into the scrapy.cfg file; the current Git revision is then used as the version number.

    [deploy:target]
    ...
    version = GIT

Edit the file scrapy.cfg:

    [deploy:tutorial_deploy]
    url = http://192.168.17.129:6800/
    project = tutorial
    username = enlong
    password = test
    version = GIT

The resulting version number here is r7-master:

    scrapyd-deploy tutorial_deploy -p tutorial
    fatal: No names found, cannot describe anything.
    Packing version r7-master
    Deploying to project "douban-movies" in http://localhost:6800/addversion.json
    Server response (200):
    {"status": "ok", "project": "douban-movies", "version": "r7-master", "spiders": 1, "node_name": "sky"}