Avoiding pitfalls when installing transformers

1.4 Download the Rust toolchain

You may well be confused when you see this: don’t we use Python?

I wasn’t sure why at first either. Just download it, otherwise you won’t be able to install transformers later (its tokenizers dependency is built from Rust source).

Since this is Windows, go to the official website and use the recommended download method: https://www.rust-lang.org/tools/install .

Run the downloaded rustup-init.exe and enter 1 to proceed with the default installation.

Test whether the installation was successful.
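To confirm the toolchain is actually usable from the command line, a quick check along these lines can help (a sketch; on Windows you may need to open a new terminal for the installer's PATH changes to take effect):

```shell
# Check whether the Rust toolchain is on PATH (needed later to build the
# tokenizers wheel from source).
have() { command -v "$1" >/dev/null 2>&1; }

for tool in rustc cargo; do
  if have "$tool"; then
    "$tool" --version
  else
    echo "$tool not found -- open a new terminal or re-run the installer"
  fi
done
```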

Pitfall 2: everything must be obtained via git clone; do not download archives manually

This is because git clone records the repository's Git metadata, including the commit (tree) hash that each dependency is pinned to. It works like a key. A manually downloaded archive has no such hash, so webui-user.bat raises an error when it runs.

The error is similar to:

reference is not a tree: 24268930bf1dce879235a7fddd0b2355b84d7ea6
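One way to spot this problem before it bites is to check for Git metadata: a proper clone carries a .git directory, while a manually downloaded archive does not. A small sketch (the path below is an example taken from this guide's layout):

```shell
# A directory obtained via `git clone` carries a .git folder, which is what
# allows the pinned commit hash to be checked out.  A zip download lacks it.
is_git_clone() { [ -d "$1/.git" ]; }

repo="D:/SD/repositories/k-diffusion"   # example path from this guide
if is_git_clone "$repo"; then
  echo "$repo looks like a proper git clone"
else
  echo "$repo has no .git directory -- delete it and re-clone"
fi
```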

2.1 After completing the above steps, run webui-user.bat

Wait for a while; when the command-line tool displays Installing GFPGAN, press Ctrl+C to stop the batch script. By then a \venv\Scripts path will have been generated.

2.2 The git clone commands and corresponding operations are as follows (adjust the paths to your own installation directory)

Use the Git Bash tool:

git clone https://github.com/TencentARC/GFPGAN.git "D:\SD\venv\Scripts\GFPGAN"

Then open a command line in D:\SD\venv\Scripts\GFPGAN and enter the following commands.

D:\SD\venv\Scripts\python.exe -m pip install basicsr facexlib 
D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
D:\SD\venv\Scripts\python.exe setup.py develop
D:\SD\venv\Scripts\python.exe -m pip install realesrgan

Then download open_clip, again with Git Bash:

git clone https://github.com/mlfoundations/open_clip "D:\SD\venv\Scripts\open_clip"

Open a command line in D:\SD\venv\Scripts\open_clip and enter the following command.

D:\SD\venv\Scripts\python.exe setup.py build install

Do the same for CLIP:

git clone https://github.com/openai/CLIP.git "D:\SD\venv\Scripts\CLIP"

Go to the corresponding directory and enter the following commands:

D:\SD\venv\Scripts\python.exe -m pip install ftfy regex tqdm
D:\SD\venv\Scripts\python.exe setup.py build install

2.3 git clone the remaining repositories and install dependencies

The procedure is the same as in 2.2. The corresponding commands are listed below; if the repositories folder does not exist, just create it yourself:

git clone https://github.com/Stability-AI/stablediffusion.git "D:\SD\repositories\stable-diffusion-stability-ai" 

git clone https://github.com/CompVis/taming-transformers.git "D:\SD\repositories\taming-transformers" 

git clone https://github.com/crowsonkb/k-diffusion.git "D:\SD\repositories\k-diffusion"

git clone https://github.com/sczhou/CodeFormer.git "D:\SD\repositories\CodeFormer" 

git clone https://github.com/salesforce/BLIP.git "D:\SD\repositories\BLIP"

Then check each of the directories above for a requirements.txt file. If one exists, open a command line in that directory and execute the following command:

D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
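The requirements check described above can be scripted rather than done by hand for each directory. A sketch, assuming the D:\SD layout used throughout and run from Git Bash (the venv Python path is taken from the earlier steps):

```shell
# Install requirements.txt in every cloned repository that has one.
has_requirements() { [ -f "$1/requirements.txt" ]; }

PY="D:/SD/venv/Scripts/python.exe"   # venv Python from the earlier steps
for repo in D:/SD/repositories/*/; do
  if has_requirements "$repo"; then
    echo "installing requirements for $repo"
    "$PY" -m pip install -r "$repo/requirements.txt"
  fi
done
```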

2.4 Troublesome tokenizers errors when installing transformers with pip
Solution: it is a network problem; try a few more times (it took me almost 3 to 4 hours to succeed)

Although the Rust toolchain was installed earlier, pip still reports all kinds of strange errors, such as the one below.

Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [62 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-310
      creating build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
      creating build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
      creating build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      creating build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      creating build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\visualizer.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\tools\visualizer-styles.css -> build\lib.win-amd64-cpython-310\tokenizers\tools
      running build_ext
      running build_rust
      cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --
      warning: unused manifest key: target.x86_64-apple-darwin.rustflags
          Updating crates.io index
      warning: spurious network error (2 tries remaining): failed to send request: 操作超时 (operation timed out)
      ; class=Os (2)
      warning: spurious network error (1 tries remaining): failed to send request: 操作超时 (operation timed out)
      ; class=Os (2)
      error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3 (C:\Users\LENOVO\AppData\Local\Temp\pip-install-bw47mt33\tokenizers_3b1650888a634bdab4e4c98d147e7833)`

      Caused by:
        failed to load source for dependency `env_logger`

      Caused by:
        Unable to update registry `crates-io`

      Caused by:
        failed to fetch `https://github.com/rust-lang/crates.io-index`

      Caused by:
        failed to send request: 操作超时 (operation timed out)
        ; class=Os (2)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers

Later I searched for many answers on the Internet and found that they were not consistent at all. Then I asked ChatGPT, and its answer was as follows:

In plain terms: even ChatGPT doesn’t really know why this happens.

Note:

In fact, it is a network problem; it has nothing to do with your tool versions. Try a few more times, or look for a faster download channel.
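Since the failure is a transient network timeout, retrying mechanically can save some of those 3 to 4 hours. A sketch of a simple retry wrapper (the pip command in the comment is an example and assumes the venv path from the earlier steps):

```shell
# Retry a flaky command up to $1 times, pausing briefly between attempts.
retry() {
  local max=$1; shift
  local n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "failed after $n attempts" >&2
      return 1
    fi
    n=$((n + 1))
    sleep 2
  done
}

# Example usage (venv path from the earlier steps):
# retry 10 D:/SD/venv/Scripts/python.exe -m pip install transformers
```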


The error message indicates that building the wheel for the tokenizers package failed during installation. The error occurs while running cargo rustc, the command used to compile the Rust code.

The specific error message is as follows:

error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3`

This error indicates that the build process was unable to fetch env_logger, a dependency required by the tokenizers package.

The error further explains the following:

Caused by:
  Unable to update registry `crates-io`

Caused by:
  failed to fetch `https://github.com/rust-lang/crates.io-index`

Caused by:
  failed to send request: 操作超时 (operation timed out)
; class=Os (2)

These messages indicate a problem updating the registry and fetching the required dependencies. The message failed to send request: 操作超时 (operation timed out) indicates a network problem or a temporary problem with the crates.io index.

Here are a few steps to resolve the issue:

  1. Check your network connection: Make sure you have a stable internet connection and can access external resources.

  2. Retry the installation: Sometimes network problems can be temporary. Try running the installation command again to see if the problem is resolved.

  3. Clear the pip cache: pip caches packages to speed up installation, but cached files can sometimes cause problems. You can try clearing the pip cache by running the following command:

    pip cache purge
    
  4. Specify a different package source: you can try to install the package from a different package source. For example, you can use the --index-url option to specify another package index:

    pip install --index-url https://pypi.python.org/simple/ tokenizers
    
  5. Upgrade pip and setuptools: Outdated pip and setuptools versions can sometimes cause problems. Try running the following commands to upgrade them:

    pip install --upgrade pip
    pip install --upgrade setuptools
    
  6. Install Rust: the tokenizers package depends on Rust. Make sure Rust is installed on your system and up to date. You can install Rust by following the instructions on the official Rust website (https://www.rust-lang.org/tools/install).

If the problem persists after trying these steps, it is recommended to search for similar issues or error messages related to the tokenizers package to see if there are any known solutions or workarounds. You can look for information on forums or the project's issue tracker.

Origin blog.csdn.net/chaishen10000/article/details/134354776