Anti-Ban Strategies

Scrapy:

http://doc.scrapy.org/en/master/topics/practices.html#avoiding-getting-banned

How to keep your Scrapy spider from getting banned

According to the Scrapy documentation linked above, there are several main strategies for keeping a Scrapy spider from getting banned.

Since Google cache is affected by network conditions in mainland China (you know why), this article focuses on the remaining ones: rotating user agents at random, disabling cookies, setting a download delay, and using proxy IPs.

This article uses a cnblogs project as the example.

  • Create middlewares.py

Both proxy IP and user agent switching in Scrapy are controlled through DOWNLOADER_MIDDLEWARES, so we first create a middlewares.py file:

```shell
[root@bogon cnblogs]# vi cnblogs/middlewares.py
```

Add the following content:

```python
import random
import base64

from settings import PROXIES


class RandomUserAgent(object):
    """Randomly rotate user agents based on a list of predefined ones."""

    def __init__(self, agents):
        self.agents = agents

    @classmethod
    def from_crawler(cls, crawler):
        # The USER_AGENTS list is configured in settings.py
        return cls(crawler.settings.getlist('USER_AGENTS'))

    def process_request(self, request, spider):
        request.headers.setdefault('User-Agent', random.choice(self.agents))


class ProxyMiddleware(object):
    """Route each request through a proxy chosen at random from PROXIES."""

    def process_request(self, request, spider):
        proxy = random.choice(PROXIES)
        request.meta['proxy'] = "http://%s" % proxy['ip_port']
        # An empty user_pass string means an anonymous proxy
        if proxy['user_pass']:
            # Authenticated proxy: attach an HTTP Basic auth header
            encoded_user_pass = base64.b64encode(
                proxy['user_pass'].encode()).decode()
            request.headers['Proxy-Authorization'] = 'Basic ' + encoded_user_pass
            print("**************ProxyMiddleware have pass************" + proxy['ip_port'])
        else:
            print("**************ProxyMiddleware no pass************" + proxy['ip_port'])
```
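Before wiring the middleware into a crawl, the rotation logic can be sanity-checked on its own with a minimal stand-in for Scrapy's request object (the `FakeRequest` class and the sample agent strings below are purely illustrative, not part of Scrapy):

```python
import random

USER_AGENTS = ["UA-one", "UA-two", "UA-three"]  # stand-in list for the check


class FakeRequest(object):
    """Minimal stand-in for scrapy.Request; only .headers is used here."""
    def __init__(self):
        self.headers = {}


# Same logic as RandomUserAgent.process_request:
req = FakeRequest()
req.headers.setdefault('User-Agent', random.choice(USER_AGENTS))
print(req.headers['User-Agent'] in USER_AGENTS)  # → True
```

Because `setdefault` is used, a user agent already set on the request (e.g. by the spider itself) is left untouched.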

The RandomUserAgent class picks a user agent at random for each request; the USER_AGENTS list is configured in settings.py.

The ProxyMiddleware class rotates proxies; the PROXIES list is likewise configured in settings.py.

If you use SOCKS5 proxies, then unfortunately Scrapy cannot use them directly yet; you can convert them locally into an HTTP proxy with software such as Privoxy.
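As a rough sketch of the Privoxy approach: its config file (typically /etc/privoxy/config) can forward all traffic to a local SOCKS5 endpoint. The port numbers below are assumptions for illustration:

```
# Privoxy listens as an HTTP proxy here (its default port is 8118)
listen-address 127.0.0.1:8118
# Forward everything to a SOCKS5 proxy on port 1080; the trailing "."
# means "no further HTTP parent proxy"
forward-socks5 / 127.0.0.1:1080 .
```

Scrapy would then use http://127.0.0.1:8118 as its proxy address.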

  • Edit settings.py to configure USER_AGENTS and PROXIES

a) Add USER_AGENTS:
```python
USER_AGENTS = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; AcooBrowser; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; Acoo Browser; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.0.04506)",
    "Mozilla/4.0 (compatible; MSIE 7.0; AOL 9.5; AOLBuild 4337.35; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
    "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)",
    "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 2.0.50727; Media Center PC 6.0)",
    "Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 1.0.3705; .NET CLR 1.1.4322)",
    "Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 3.0.04506.30)",
    "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/523.15 (KHTML, like Gecko, Safari/419.3) Arora/0.3 (Change: 287 c9dfb30)",
    "Mozilla/5.0 (X11; U; Linux; en-US) AppleWebKit/527+ (KHTML, like Gecko, Safari/419.3) Arora/0.6",
    "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.2pre) Gecko/20070215 K-Ninja/2.1.1",
    "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9) Gecko/20080705 Firefox/3.0 Kapiko/3.0",
    "Mozilla/5.0 (X11; Linux i686; U;) Gecko/20070322 Kazehakase/0.4.5",
    "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.8) Gecko Fedora/1.9.0.8-1.fc10 Kazehakase/0.5.6",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/535.20 (KHTML, like Gecko) Chrome/19.0.1036.7 Safari/535.20",
    "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; fr) Presto/2.9.168 Version/11.52",
]
```
b) Add the proxy IP list PROXIES:
```python
PROXIES = [
    {'ip_port': '111.11.228.75:80', 'user_pass': ''},
    {'ip_port': '120.198.243.22:80', 'user_pass': ''},
    {'ip_port': '111.8.60.9:8123', 'user_pass': ''},
    {'ip_port': '101.71.27.120:80', 'user_pass': ''},
    {'ip_port': '122.96.59.104:80', 'user_pass': ''},
    {'ip_port': '122.224.249.122:8088', 'user_pass': ''},
]
```

Proxy IPs can be found by searching online; the ones above were taken from http://www.xici.net.co/.

c) Disable cookies:

```python
COOKIES_ENABLED = False
```

d) Set a download delay:

```python
DOWNLOAD_DELAY = 3
```
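A perfectly regular delay is itself easy to detect. Scrapy can randomize the delay, or adjust it automatically with the built-in AutoThrottle extension; a possible settings.py fragment (the specific delay values are just examples):

```python
DOWNLOAD_DELAY = 3
# Multiply the delay by a random factor between 0.5 and 1.5 on each request
RANDOMIZE_DOWNLOAD_DELAY = True

# Alternatively, let Scrapy adapt the delay to server response times
AUTOTHROTTLE_ENABLED = True
AUTOTHROTTLE_START_DELAY = 3
AUTOTHROTTLE_MAX_DELAY = 60
```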
e) Finally, enable the middlewares in DOWNLOADER_MIDDLEWARES:

```python
DOWNLOADER_MIDDLEWARES = {
    'cnblogs.middlewares.RandomUserAgent': 1,
    'cnblogs.middlewares.ProxyMiddleware': 100,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
}
```

The numbers are priorities: for process_request, lower values run earlier, so ProxyMiddleware sets request.meta['proxy'] before the built-in HttpProxyMiddleware applies it.

Save settings.py.

  • Run a test crawl

```shell
[root@bogon cnblogs]# scrapy crawl CnblogsSpider
```

In this article both the user agent list and the proxy list are configured in settings.py. In production, however, they may need frequent updates, and editing the config file each time is clumsy and hard to manage. If needed, they can instead be stored in a MySQL database and loaded from there.
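As a sketch of that idea, using the standard-library sqlite3 module as a stand-in for MySQL and a hypothetical `proxies` table (with a MySQL driver such as pymysql the approach is the same, though the query placeholder syntax differs):

```python
import sqlite3

# Stand-in for a MySQL connection, so the sketch is self-contained
conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE proxies (ip_port TEXT, user_pass TEXT)")
conn.executemany("INSERT INTO proxies VALUES (?, ?)",
                 [('111.11.228.75:80', ''), ('120.198.243.22:80', '')])


def load_proxies(conn):
    """Rebuild the PROXIES list from the database on each call."""
    rows = conn.execute("SELECT ip_port, user_pass FROM proxies").fetchall()
    return [{'ip_port': ip, 'user_pass': up} for ip, up in rows]


PROXIES = load_proxies(conn)
print(len(PROXIES))  # → 2
```

Updating the proxy pool then becomes a database write rather than a code change, and the middleware can reload the list periodically instead of importing it once at startup.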