Python: batch-compress every subdirectory of a directory on Windows

1. Motivation

I had a directory containing more than 20 subdirectories (folders) that I wanted to compress, but compressing them all at once was too heavy on CPU and memory. So I wondered whether Python could handle it for me, found the zipfile module, and wrote a small function: pass it the directory to compress, use the os module to enumerate the subdirectories, and it zips each one in turn while I get on with other work.

Here is the code — take it and use it.

import zipfile
import os

def zip_yasuo(start_dir):
    """Compress start_dir into start_dir + '.zip'; skip if the zip already exists."""
    file_news = start_dir + '.zip'
    if os.path.isfile(file_news):
        return  # already compressed, nothing to do
    with zipfile.ZipFile(file_news, 'w', zipfile.ZIP_DEFLATED) as z:
        for dir_path, dir_names, file_names in os.walk(start_dir):
            # store paths relative to start_dir so the archive contains no absolute paths
            rel_dir = os.path.relpath(dir_path, start_dir)
            for filename in file_names:
                arc_name = filename if rel_dir == '.' else os.path.join(rel_dir, filename)
                z.write(os.path.join(dir_path, filename), arc_name)

if __name__ == "__main__"
    base_path = r"主目录"
    base_path_list = os.listdir(base_path)
    for base_path_list_one in base_path_list:
        base_path_list_one_dir = os.path.join(base_path,base_path_list_one)
        # 子目录
        print("准备压缩需要压缩的子目录", base_path_list_one_dir)
        if os.path.isdir(base_path_list_one_dir):
            zip_yasuo(base_path_list_one_dir)
            

Just run it and wait — no need to watch it. The subdirectories are compressed one at a time, one finishing before the next starts. If you only have one or two directories this is overkill, but with many it saves real effort, and the compression speed is decent.
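As a quick sanity check after a batch run, you can open each produced archive and confirm it is intact. The helper below is not part of the original post — it is a minimal sketch that builds a throwaway directory, zips it the same way as above, and verifies the result with zipfile's built-in CRC check:

```python
import os
import tempfile
import zipfile

def verify_zip(zip_path):
    """Return the archive's file list; raise if any member is corrupt."""
    with zipfile.ZipFile(zip_path) as z:
        bad = z.testzip()  # None means every member's CRC checks out
        assert bad is None, f"corrupt member: {bad}"
        return z.namelist()

# Build a tiny throwaway tree to demonstrate: one subdirectory with one file.
tmp = tempfile.mkdtemp()
sub = os.path.join(tmp, "sub")
os.makedirs(sub)
with open(os.path.join(sub, "a.txt"), "w") as f:
    f.write("hello")

# Zip it the same way the batch script does, then verify.
zip_path = sub + ".zip"
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
    z.write(os.path.join(sub, "a.txt"), "a.txt")

print(verify_zip(zip_path))  # ['a.txt']
```

Looping `verify_zip` over the `.zip` files in `base_path` after the batch run will catch any archive that was cut short.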

If this helped, give it a thumbs up.


Origin blog.csdn.net/weixin_42081389/article/details/105320752