[python] Batch downloading files with wget (a time series of files spanning many years, months and days)

The example code is as follows (I used it to download OSI SAF sea ice concentration files from the met.no THREDDS server for many years, months and days):

import os

import wget

# Fixed parts of the download URL: str1 is the server directory, str2 the file
# name prefix and str3 the time stamp plus extension that ends every file name.
str1 = "https://thredds.met.no/thredds/fileServer/osisaf/met.no/ice/conc/"
str2 = "ice_conc_nh_polstere-100_multi_"
str3 = "1200.nc"

path1 = '/mnt/d/SIC'  # root folder for the downloaded files

for year in range(2006, 2023):
    for month in range(1, 13):
        for day in range(1, 32):
            # e.g. .../2010/03/ice_conc_nh_polstere-100_multi_201003151200.nc
            url = (str1 + "{}".format(year) + "/" + "{:02d}".format(month) + "/"
                   + str2 + "{0}{1:02d}{2:02d}".format(year, month, day) + str3)
            # keep the files sorted into <path1>/<year>/<month> subfolders
            file_path = os.path.join(path1, "{}".format(year), "{:02d}".format(month))
            if not os.path.exists(file_path):
                os.makedirs(file_path)

            try:
                wget.download(url, out=file_path)
            except Exception:
                # invalid dates (e.g. 31 April) or files missing on the server end up here
                print("File:" + str2 + "{0}{1:02d}{2:02d}".format(year, month, day) + str3 + " does not exist")

To adapt it to the files you want to download, modify the following:

① str1/str2/str3/url: change these to match the specific URL of your files

② The three for loops: change them to the time range you want to download (a date-based alternative is sketched after this list)

③ path1: the root folder where the downloaded files are stored (file_path then sorts the files into subfolders by year and month)
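
If you prefer to loop over calendar dates directly instead of three nested loops (which also avoids the failed requests for dates like 31 April), here is a minimal sketch using Python's standard datetime module. It assumes the same str1/str2/str3 URL pieces and the /mnt/d/SIC folder layout as the code above.

import os
from datetime import date, timedelta

import wget

str1 = "https://thredds.met.no/thredds/fileServer/osisaf/met.no/ice/conc/"
str2 = "ice_conc_nh_polstere-100_multi_"
str3 = "1200.nc"
path1 = '/mnt/d/SIC'

day = date(2006, 1, 1)     # first date of the time series
end = date(2022, 12, 31)   # last date of the time series
while day <= end:
    # same file name pattern as above, built from one date object
    url = f"{str1}{day:%Y}/{day:%m}/{str2}{day:%Y%m%d}{str3}"
    file_path = os.path.join(path1, f"{day:%Y}", f"{day:%m}")
    os.makedirs(file_path, exist_ok=True)   # create the year/month folder if missing
    try:
        wget.download(url, out=file_path)
    except Exception:
        print(f"File: {str2}{day:%Y%m%d}{str3} does not exist")
    day += timedelta(days=1)

Only the date range and the URL pieces need to change for a different dataset.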


Origin: blog.csdn.net/Mluoo/article/details/128557284