AkShare: a super easy-to-use Python third-party library for crawling financial data

Crawl financial data with Python's AkShare library and save it in Excel format.

This is a super easy-to-use third-party module I found while crawling data!
It also has an official website, https://www.akshare.xyz/zh_CN/latest/introduction.html, with sample code, so I won't repeat every example here.
Although the docs recommend installing Python 3.7 or higher, it still works for me on Python 3.6.

Installation code (make sure you install python and pip first):

pip install akshare -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host=mirrors.aliyun.com  --upgrade

Note: a plain pip install akshare without a mirror can be very slow!
By the way, this package apparently also supports the R language (I haven't tried it myself, but the official website says it can).

Why do I recommend this so highly?
Before, when I crawled data, I had to query web pages one by one, pick out which XPath to scrape, dump the results into a txt file, and then convert that into an Excel sheet.

With this library, none of that is needed! The sample code is very simple (here is the example from the official website):

import akshare as ak
stock_zh_a_spot_df = ak.stock_zh_a_spot()
print(stock_zh_a_spot_df)

The crawled content looks like the screenshot below (and no browser window pops up while crawling!).

[Screenshot: DataFrame output of ak.stock_zh_a_spot()]

This is what the output line shows (because there is so much data, pandas abbreviates the middle rows and columns with "...").
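If the "..." abbreviation bothers you, you can widen pandas' display limits before printing. A minimal sketch; the tiny DataFrame below is just a stand-in for the real ak.stock_zh_a_spot() result:

```python
import pandas as pd

# pandas truncates long/wide DataFrames with "..." by default;
# raise the limits to see more in the console
pd.set_option("display.max_rows", 20)
pd.set_option("display.max_columns", None)  # show every column
pd.set_option("display.width", 200)

# stand-in DataFrame (the real one comes from ak.stock_zh_a_spot())
df = pd.DataFrame({"code": ["sh600000", "sz000001"], "price": [7.8, 12.3]})
print(df)
```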

Yes, you read that right! No typesetting or organizing needed on your part! Just three lines of code and the data is yours!
But if you want to save it in Excel format, you do have to extend the code a little.
Here is the modified code:

import time

import akshare as ak
import pandas as pd

# fetch the real-time quotes for all A-share stocks
stock_zh_a_spot_df = ak.stock_zh_a_spot()

# name the file after today's date, e.g. "2020-10-14.xlsx"
current_time = time.strftime("%Y-%m-%d", time.localtime())
file_name = current_time + ".xlsx"

# to_excel manages the writer itself; newer pandas versions removed
# ExcelWriter's encoding argument and writer.save()
stock_zh_a_spot_df.to_excel(file_name, sheet_name="sheet1")
print("Data saved successfully")

Run it, and you get the crawled spreadsheet!
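If you don't strictly need .xlsx, saving as CSV is even lighter: no Excel engine is required, and Excel opens CSV files directly. A minimal sketch, again using a stand-in DataFrame instead of the real crawl result:

```python
import time

import pandas as pd

# stand-in for the DataFrame returned by ak.stock_zh_a_spot()
df = pd.DataFrame({"code": ["sh600000", "sz000001"], "price": [7.8, 12.3]})

# date-based filename, e.g. "2020-10-14.csv"
file_name = time.strftime("%Y-%m-%d", time.localtime()) + ".csv"

# utf-8-sig adds a BOM so Excel displays Chinese column names correctly
df.to_csv(file_name, index=False, encoding="utf-8-sig")
print("saved", file_name)
```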

Isn't it super simple? Isn't the code super concise? Big love, right?!

If you think this article is useful to you, you can give the author a thumbs up or bookmark it. Your support is my great motivation! ! Thank you.


Origin blog.csdn.net/m0_50481455/article/details/109067336