How to Set the User-Agent and Proxy IP in a Web Crawler

I. Setting the User-Agent
1. Specify headers when creating the Request object
from urllib import request

url = 'http://ip.zdaye.com/'
head = {}
head['User-Agent'] = 'Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03D) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19'
req = request.Request(url, headers=head)
# Pass in the Request object we just created
response = request.urlopen(req)
html = response.read().decode('utf-8')
print(html)
2. Call add_header after creating the Request object
req.add_header('User-Agent', 'Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03D) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19')

The arguments are (key, value).
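For context, a minimal end-to-end sketch of this variant (reusing the same test URL and User-Agent as the first example) might look like this:

from urllib import request

url = 'http://ip.zdaye.com/'
# Create a bare Request first, then attach the User-Agent with add_header
req = request.Request(url)
req.add_header('User-Agent', 'Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03D) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19')
response = request.urlopen(req)
print(response.read().decode('utf-8'))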

II. Setting a Proxy IP
Calling install_opener replaces the default urlopen: once install_opener has run, any later call to urlopen in that module goes through the opener you built. If you do not want to replace the default and only need the opener temporarily, call opener.open(url) instead; this leaves the default urlopen untouched (see the sketch after the full example below).
from urllib import request

if __name__ == "__main__":
    url = 'http://www.16yun.cn'
    # Placeholder proxy IP and port; substitute a real proxy here
    proxy = {'http': '168.68.8.88:6666'}
    # Create the ProxyHandler
    proxy_support = request.ProxyHandler(proxy)
    # Create the opener
    opener = request.build_opener(proxy_support)
    # Add a User-Agent header
    opener.addheaders = [('User-Agent', 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36')]
    # Install the opener as the default
    request.install_opener(opener)
    # urlopen now goes through the opener we installed
    response = request.urlopen(url)
    html = response.read().decode("utf-8")
    print(html)
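As noted above, to route a single request through the proxy without replacing the process-wide default, skip install_opener and call opener.open directly. A minimal sketch, assuming the same placeholder proxy and URL as the example above:

from urllib import request

# Same placeholder proxy as above; add an 'https' entry to cover HTTPS URLs too
proxy_support = request.ProxyHandler({'http': '168.68.8.88:6666'})
opener = request.build_opener(proxy_support)
opener.addheaders = [('User-Agent', 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36')]

# opener.open sends only this request through the proxy;
# the module-level request.urlopen stays unchanged
response = opener.open('http://www.16yun.cn')
print(response.read().decode('utf-8'))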

Reposted from blog.51cto.com/14201222/2376110