Using Request in Scrapy

The Request object is what we use when writing spiders: whenever we need to send a request to crawl data, we create one. The class takes a number of parameters; the most commonly used are listed below (a usage sketch follows the list):

1. url: the URL the request is sent to.

2. callback: the callback function executed after the downloader has finished downloading the corresponding data.

3. method: the request method. Defaults to GET; other methods can be specified.

4. meta: commonly used to pass data between different requests. It provides the initial value of the Request.meta attribute; if given, the dict passed in will be shallow-copied.

5. encoding: the encoding of the request. Defaults to utf-8, and the default is usually fine.

6. dont_filter: tells the scheduler not to filter (deduplicate) this request. Useful when the same request needs to be sent more than once.

7. cookies: the cookies of the request.

8. errback: the function executed when an error occurs.
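As a minimal sketch of how these parameters fit together (the spider name, URL, cookie values, and callback names here are hypothetical, chosen only for illustration):

import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"  # hypothetical spider name

    def start_requests(self):
        yield scrapy.Request(
            url="http://quotes.toscrape.com/page/1/",  # hypothetical URL
            callback=self.parse_page,       # run after the data is downloaded
            method="GET",                   # the default, shown here for clarity
            meta={"page": 1},               # shallow-copied into request.meta
            encoding="utf-8",               # the default encoding
            dont_filter=True,               # ask the scheduler not to deduplicate this request
            cookies={"session": "abc123"},  # hypothetical cookie
            errback=self.on_error,          # called if the request errors out
        )

    def parse_page(self, response):
        # meta set on the request is available again on the response
        self.logger.info("parsed page %s", response.meta["page"])

    def on_error(self, failure):
        self.logger.error("request failed: %r", failure)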

You generally do not build the Response object yourself; Scrapy constructs it automatically, so as a developer you only need to care about how to use it, not how to create it. The Response object has many attributes that can be used to extract data; the main ones are listed below (a usage sketch follows the list):

1. meta: the meta passed over from the corresponding request. Can be used to keep data connected across multiple requests.

2. encoding: returns the encoding used to encode and decode the current string.

3. text: returns the response data as a unicode string.

4. xpath: the xpath selector.

5. css: the css selector.

6. body: returns the response data as a bytes string.

7. status: the HTTP status of the response. Defaults to 200.

8. flags: the initial value of the Response.flags attribute. If given, the list will be shallow-copied.

9. request: the initial value of the Response.request attribute. This represents the Request that generated this response.
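A sketch of a parse callback exercising these attributes; the site and the selectors are assumptions based on the quotes.toscrape.com demo site:

import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"  # hypothetical spider
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        # status, encoding and request describe the response itself
        self.logger.info("status=%s encoding=%s from=%s",
                         response.status, response.encoding, response.request.url)

        # text is the body decoded to str; body is the raw bytes
        assert isinstance(response.text, str)
        assert isinstance(response.body, bytes)

        # xpath and css selectors extract data from the page
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.xpath("span/small/text()").get(),
            }

        # meta carries data from this response into the next request
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse,
                                  meta={"previous_page": response.url})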
