Randomly Generating User-Agents with fake-useragent

Installation

pip install fake-useragent

Usage

Usage with Python 3

fake-useragent maintains a large pool of real User-Agent strings internally and exposes them through simple attributes, so you can grab one directly without curating your own list.

>>> from fake_useragent import UserAgent
>>> ua = UserAgent()
>>> ua.firefox
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0'
>>> ua.chrome
'Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1500.55 Safari/537.36'
>>> ua.opera
'Opera/9.80 (X11; Linux i686; U; it) Presto/2.7.62 Version/11.00'
>>> ua.random
'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.90 Safari/537.36'
>>> ua.random
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.62 Safari/537.36'
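
A common use is pairing ua.random with an HTTP client so each request goes out with a different User-Agent. Below is a minimal sketch using the requests library; requests and the httpbin.org test URL are assumptions for illustration, not part of the original post:

from fake_useragent import UserAgent
import requests

ua = UserAgent()

# Send a request with a freshly randomized User-Agent header
headers = {'User-Agent': ua.random}
response = requests.get('https://httpbin.org/user-agent', headers=headers)
print(response.text)  # httpbin echoes back the User-Agent the server received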

Usage with Scrapy

A common pattern is to wrap fake-useragent in a Scrapy downloader middleware that stamps a User-Agent onto every outgoing request:

from fake_useragent import UserAgent


class PackageRequestHeader(object):
    def __init__(self, crawler):
        super(PackageRequestHeader, self).__init__()
        # Instantiate the UserAgent object once per middleware instance
        self.useragent = UserAgent()
        # Read USER_AGENT_TYPE from settings.py (default: 'random');
        # it can be set to 'firefox', 'chrome', etc.
        self.useragent_type = crawler.settings.get('USER_AGENT_TYPE', 'random')

    @classmethod
    def from_crawler(cls, crawler):
        # Scrapy calls this hook to construct the middleware with
        # access to the running crawler and its settings
        return cls(crawler)

    def process_request(self, request, spider):
        # Look up the UserAgent attribute named by USER_AGENT_TYPE
        # (getattr(a, b) fetches the attribute of a whose name is b)
        # and set it as the outgoing request's User-Agent header
        request.headers['User-Agent'] = getattr(self.useragent, self.useragent_type)
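
For the middleware to take effect, it has to be enabled in the project's settings.py. A minimal sketch follows; the module path myproject.middlewares is an assumption and should be replaced with wherever PackageRequestHeader actually lives:

# settings.py (assumed project layout)
# Register the middleware and disable Scrapy's built-in
# UserAgentMiddleware so it does not overwrite our header.
DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.PackageRequestHeader': 543,
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
}

# Which UserAgent attribute to use: 'random' (default), 'firefox', 'chrome', ...
USER_AGENT_TYPE = 'random'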

Reposted from blog.csdn.net/hengdawei3087/article/details/88087440