Setting a proxy server for Hugging Face from_pretrained downloads

I needed to download a pre-trained model, but at the time the TUNA and BFSU mirrors had stopped, so I hoped there was still a working way to download pre-trained models over the Internet.
That was when I found the Hugging Face Configuration documentation.

According to the documentation, the parameter proxies (Dict, optional) is described as follows:

  • A dictionary of proxy servers to use by protocol or endpoint, e.g.:

{'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}.

The proxies are used on each request.

You can proxy HTTP(S) traffic like this:

from transformers import BertTokenizerFast

proxies = {'http': 'http://127.0.0.1:8118', 'https': 'http://127.0.0.1:8118'}

BertTokenizerFast.from_pretrained("bert-base-uncased", proxies=proxies)
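The same proxies keyword works for model weights as well, not only the tokenizer. A minimal sketch of my own (not taken from the documentation), assuming the same local proxy on 127.0.0.1:8118:

from transformers import AutoModel, AutoTokenizer

proxies = {'http': 'http://127.0.0.1:8118', 'https': 'http://127.0.0.1:8118'}

# Every HTTP request issued by from_pretrained is routed through the proxy.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased", proxies=proxies)
model = AutoModel.from_pretrained("bert-base-uncased", proxies=proxies)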
 

There is a similar solution on Stack Overflow: How to specify a proxy in transformers pipeline.
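One workaround (a sketch of mine, assuming the same proxy address as above) is to set the standard HTTP_PROXY/HTTPS_PROXY environment variables before triggering any download; the requests library that transformers uses for downloads honors them:

import os

# Example proxy address; replace it with your own.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8118"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8118"

from transformers import pipeline

# Model and tokenizer files are fetched through the proxy if they are not cached yet.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("Paris is the [MASK] of France."))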

If that doesn't work, there is a more drastic option:

--------------------------------------------------------------

The following method failed for me: the version (TF or PT) of the file that was downloaded so easily did not match, so it failed.

  cd to ~/.cache/huggingface/diffusers/models--runwayml--stable-diffusion-v1-5/snapshots/39593d5650112b4cc580433f6b0435385882d819/unet

  aria2c https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/unet/diffusion_pytorch_model.safetensors --max-connection-per-server=4 --min-split-size=1M --all-proxy='http://127.0.0.1:8118'
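As an alternative to placing the raw file by hand (a sketch of mine, not the method used in the original post), hf_hub_download from huggingface_hub accepts the same proxies dict and stores the file in the local Hugging Face cache with the layout the libraries expect; the repo ID, filename, and proxy address below are taken from the command above, and the exact cache path depends on your huggingface_hub/diffusers version:

from huggingface_hub import hf_hub_download

# Download the single UNet weight file through the proxy; the returned value
# is the local path of the cached file.
path = hf_hub_download(
    repo_id="runwayml/stable-diffusion-v1-5",
    filename="unet/diffusion_pytorch_model.safetensors",
    proxies={'http': 'http://127.0.0.1:8118', 'https': 'http://127.0.0.1:8118'},
)
print(path)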


Origin: blog.csdn.net/u010087338/article/details/128666892