Click on " Python Crawler and Data Mining " above to pay attention
Reply to " Books " to get a total of 10 e-books on Python from entry to advanced
Today's chicken soup:
Deep in the night I suddenly dream of my younger days, and weep in the dream, tears streaking my rouged face.
Hello everyone, I am Pippi.
1. Introduction
A few days ago, a fan asked a Python web crawler question in the Python Silver Group [Hou De Zai Wu], and I will share it with you here.
2. Implementation process
In fact, this problem can be solved with a for loop. It turned out that the fan's code did not include the request headers, so no data could be obtained. Later, [Teacher Yuliang] and [Little Prince] gave concrete suggestions that helped the fan solve the problem.
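As a sketch of that fix (the URL, endpoint, and header values below are assumptions for illustration, since the fan's original code is not shown), adding a browser-like User-Agent request header is usually what lets the site return data instead of rejecting the request:

```python
import requests

# Hypothetical quote endpoint; the fan's actual URL is not shown in the chat.
URL = "https://example.com/stock/quote.json"

# Many sites return no data (or an error) when the request lacks a
# browser-like User-Agent header.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
}

def fetch_quote(symbol, session=None):
    """Fetch one symbol's quote JSON, sending the request headers."""
    sess = session or requests.Session()
    resp = sess.get(URL, params={"symbol": symbol}, headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()

# A for loop over the symbols then solves the original problem:
# for symbol in ["SH600000", "SZ000001"]:
#     print(fetch_quote(symbol))
```

The loop at the bottom is left commented out so the sketch runs without hitting a real server.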
Later, when he ran the code, he hit an exception with the following error:
The problem seemed to be that no data was returned for some stocks. [Wei Ge] addressed this and gave an exception-handling solution, as follows:
res = response.json()
try:
    data = res["data"]
    symbol1 = data["quote"]["symbol"]
    name = data["quote"]["name"]
    current = data["quote"]["current"]
    chg = data["quote"]["chg"]
    percent = data["quote"]["percent"]
    print(symbol1, name, current, chg, percent)
    # Append this stock's fields as one CSV row
    with open('股票.csv', 'a+', encoding='utf-8') as f:
        f.write('{},{},{},{},{}\n'.format(symbol1, name, current, chg, percent))
except:  # a bare except: any failure is treated as "no data for this stock"
    print("该股票url无具体信息: ", symbol)
But this catch-all exception handling was not ideal, so [Teacher Yuliang] optimized the program. The code is as follows:
if res['data']['tags'] is not None:
    data = res["data"]
    symbol1 = data["quote"]["symbol"]
    name = data["quote"]["name"]
    current = data["quote"]["current"]
    chg = data["quote"]["chg"]
    percent = data["quote"]["percent"]
    print(symbol1, name, current, chg, percent, " ==> 数据下载成功!")
    with open('股票.csv', 'a+', encoding='utf-8') as f:
        f.write('{},{},{},{},{}\n'.format(symbol1, name, current, chg, percent))
else:
    print(f"{symbol}无具体信息: ", res)
time.sleep(1)  # pause between requests to be gentle on the server
Later tests found that the is not None in if res['data']['tags'] is not None: can actually be removed, although keeping it is friendlier to novices. In addition, the condition in [Teacher Yuliang]'s code can check a different key: changing it to if res['data']['quote']: makes the printed output more intuitive.
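The difference between the two checks can be seen in a small sketch (the dictionaries below are made-up stand-ins for the API response, not real data):

```python
# Made-up responses imitating the shape of the API's JSON.
res_ok = {"data": {"tags": ["ST"], "quote": {"symbol": "SH600000"}}}
res_missing = {"data": {"tags": None, "quote": {}}}

# `is not None` only rules out None...
print(res_ok["data"]["tags"] is not None)       # True
print(res_missing["data"]["tags"] is not None)  # False

# ...while plain truthiness also treats empty containers as "no data",
# which is why checking `if res['data']['quote']:` reads more directly.
print(bool(res_ok["data"]["quote"]))       # True
print(bool(res_missing["data"]["quote"]))  # False
```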
The fan's problem was solved smoothly. There are many approaches; all roads lead to Rome, as long as the problem gets solved.
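On that note, one more variation: the manual f.write format line works, but Python's built-in csv module handles commas and quotes inside field values (for example, a stock name containing a comma) automatically. A minimal sketch, with made-up values in the same shape as (symbol1, name, current, chg, percent):

```python
import csv

# Sample row in the same shape as (symbol1, name, current, chg, percent);
# the values here are made up for illustration.
row = ["SH600000", "浦发银行", 7.5, 0.05, 0.67]

# newline="" is the documented way to open CSV files for the csv module.
with open("股票.csv", "a+", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(row)  # quoting of tricky fields is handled for us
```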
Finally, [kim] also shared a knowledge point: the causes of common types of error reports. I hope it is helpful to everyone's study.
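[kim]'s material was shared as an image and cannot be reproduced here, but as a rough sketch of the idea, these are the exception types most often hit when digging into crawled JSON like the response above (the dictionary is made up for illustration):

```python
# Made-up response for demonstrating common parsing errors.
res = {"data": {"quote": None}}

errors = []

try:
    res["data"]["missing_key"]      # looking up a key that does not exist
except KeyError as e:
    errors.append(type(e).__name__)

try:
    res["data"]["quote"]["symbol"]  # subscripting a value that is None
except TypeError as e:
    errors.append(type(e).__name__)

try:
    int("N/A")                      # converting a non-numeric string
except ValueError as e:
    errors.append(type(e).__name__)

print(errors)  # ['KeyError', 'TypeError', 'ValueError']
```

Catching these specific exceptions is more informative than a bare except, because each one points at a different cause.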
3. Summary
Hello everyone, I am Pippi. This article reviewed a Python web crawler problem. For this problem, the article gives a concrete analysis and code implementation that helped the fan solve it smoothly.
Finally, I would like to thank the fan [such creatures] for asking the question, thank [Teacher Yuliang], [Wei Ge], [kim], and [巭嫑勥烎] for their ideas and code analysis, and thank [冫马讠成], [Ineverleft], and others for participating in the learning exchange.
[Supplementary questions] A friendly reminder for when you ask questions in the group: if large data files are involved, desensitize the data and send a small demo file instead; paste your code as text (the kind that can be copied); and remember to send the complete screenshot of the error report. If the code is short, send the code text directly; if it exceeds 50 lines, send a .py file.
If you run into any problems during your studies, feel free to contact me (my WeChat: pdcfighting1). At fans' request, I have created some high-quality paid Python learning exchange groups and paid order-taking groups. Everyone is welcome to join my Python learning exchange group and order-taking group!
Friends, hurry up and practice! If you encounter any problems while learning, add me as a friend and I will bring you into the Python learning exchange group to discuss and learn together.
------------------- End -------------------
Recommendations for past wonderful articles:
Inventory of a Python web crawler over verification code (method 3)
Inventory of a Python web crawler over verification code (method 2)
Inventory of a Python web crawler over verification code (method 1)
Welcome everyone to like, comment, forward, and share; thank you for your company and support.
If you want to join the Python learning group, please reply [ join the group ] in the background.
Across thousands of rivers and mountains there is always affection; how about tapping [ Looking ]?
/Today's Message Topic/
Just say a few words~~