Hello everyone. To work more efficiently, we often reach for Python tools in our daily work: as a mature programming language, Python can automate all kinds of routine tasks. To make project development easier, here are a few recommended Python efficiency tools.
1. Pandas - for data analysis
Pandas is a powerful toolset for analyzing structured data. It is built on NumPy (which provides high-performance array operations), is widely used for data mining and data analysis, and also provides data-cleaning functions.
# 1. Install the package
$ pip install pandas
# 2. Enter Python's interactive shell
$ python -i
# 3. Use Pandas
>>> import pandas as pd
>>> df = pd.DataFrame()
>>> print(df)
# 4. Output
Empty DataFrame
Columns: []
Index: []
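An empty DataFrame is not very illustrative, so here is a minimal sketch of typical Pandas usage; the column names and values are hypothetical, chosen only for illustration:

```python
import pandas as pd

# Hypothetical sales data for illustration
df = pd.DataFrame({
    "product": ["apple", "banana", "apple", "cherry"],
    "price": [1.2, 0.5, 1.4, 3.0],
})

# Mean price over all rows
print(df["price"].mean())

# Aggregate per product
print(df.groupby("product")["price"].mean())
```

`groupby` plus an aggregation like `mean()` covers a large share of everyday data-analysis needs.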
2. Selenium - automated testing
Selenium is a tool for testing web applications from the end user's perspective. Running the same tests in different browsers makes incompatibilities easier to spot, and Selenium supports many browsers.
A simple test can be done by opening a browser and visiting Google's homepage:
from selenium import webdriver
import time

# Use a raw string so the backslashes in the Windows path are not treated as escapes
browser = webdriver.Chrome(executable_path=r"C:\Program Files (x86)\Google\Chrome\chromedriver.exe")
website_URL = "https://www.google.co.in/"
browser.get(website_URL)

refreshrate = int(3)  # refresh the Google homepage every 3 seconds

# This runs until you stop the interpreter.
while True:
    time.sleep(refreshrate)
    browser.refresh()
3. Flask - Micro Web Framework
Flask is a lightweight, customizable framework written in Python. Compared with other frameworks of the same type, it is more flexible, lightweight, secure, and easy to use. Flask is currently a very popular web framework: developers can use it to quickly implement a website or web service.
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'
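To try a route like this without actually starting a server, you can use Flask's built-in test client. A minimal sketch, mirroring the snippet above:

```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'

# The test client exercises routes in-process, without running a server
with app.test_client() as client:
    response = client.get('/')
    print(response.status_code)             # 200
    print(response.get_data(as_text=True))  # Hello, World!
```

This is also the usual starting point for unit-testing a Flask application.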
4. Scrapy - page crawling
Scrapy gives you powerful support for accurately scraping information from websites, and it is very practical.
Most developers now use crawler tools to automate scraping work, so Scrapy is a good choice when writing crawler code.
Starting Scrapy Shell is also very simple:
scrapy shell
We can try to extract the text of the search button on the Baidu homepage. First, we need to find the class the button uses; inspecting the element shows that the class is "bt1".
Specifically do the following:
fetch("https://baidu.com")  # in the shell, fetch() populates the global `response` object
response.css(".bt1::text").extract_first()
==> "Search"
5. Requests - make API calls
Requests is a powerful HTTP library. With it, you can easily send requests, with no need to manually add query strings to URLs. It also provides many other features, such as authentication handling, JSON/XML parsing, and session handling.
Official example:
>>> import requests
>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
>>> r.status_code
200
>>> r.headers['content-type']
'application/json; charset=utf8'
>>> r.encoding
'utf-8'
>>> r.text
'{"type":"User"...'
>>> r.json()
{'private_gists': 419, 'total_private_repos': 77, ...}
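The "no manual query strings" point can be shown without any network traffic by preparing a request and inspecting the URL Requests builds; the endpoint and parameters here are hypothetical:

```python
import requests

# Requests encodes the query string for you; no manual URL concatenation needed
prepared = requests.Request(
    "GET", "https://httpbin.org/get", params={"q": "python", "page": 2}
).prepare()

print(prepared.url)  # https://httpbin.org/get?q=python&page=2
```

`Request(...).prepare()` is also handy for debugging exactly what would be sent before sending it.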
6. Faker - for creating fake data
Faker is a Python package that generates fake data for you. Whether you need to bootstrap a database, create good-looking XML documents, fill in your persistence layer to stress-test it, or anonymize data taken from a production service, Faker is for you.
With it, you can generate fake names, addresses, descriptions, and more very quickly. As an example, the following script creates a contact entry with a name, an address, and some description text:
Install:
pip install Faker
from faker import Faker

fake = Faker()
print(fake.name())
print(fake.address())
print(fake.text())
7. Pillow - image processing
Pillow, a Python image-processing library, offers quite powerful image-processing functions. It is worth reaching for whenever you need to manipulate images in day-to-day work.
Simple example:
from PIL import Image, ImageFilter
try:
    original = Image.open("Lenna.png")
    blurred = original.filter(ImageFilter.BLUR)
    original.show()
    blurred.show()
    blurred.save("blurred.png")
except IOError:
    print("Unable to load image")
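If you do not have a "Lenna.png" on disk, the same filtering pipeline can be sketched entirely in memory with `Image.new`, so the example runs without any input file:

```python
from PIL import Image, ImageFilter

# Create a small solid-red image in memory instead of loading one from disk
original = Image.new("RGB", (64, 64), color=(255, 0, 0))
blurred = original.filter(ImageFilter.BLUR)

print(blurred.size)  # (64, 64)
print(blurred.mode)  # RGB
```

Filters never change the image's dimensions or mode, only its pixel values.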
Effective tools help us complete tasks more quickly, so I have shared a few that I find easy to use. I hope these 7 Python efficiency tools help you too.