[AMD card, Windows] Stable Diffusion WebUI download and installation pitfall-avoidance guide

Reminder

Everything in this article comes from my own experience, installed, downloaded, and tested step by step. If you would rather use an all-in-one integration package, skip this article.

That said, I don't actually recommend the approach in this article, because it took me almost an hour to generate one image. If I had an NVIDIA card I would use that; this is just a workaround. Discrete graphics cards are not that expensive these days.

For NVIDIA cards, you can refer to this article https://blog.csdn.net/binzai_16/article/details/130216343 , but still pay attention to the pitfalls below, otherwise you will spend extra time undoing earlier mistakes.

Main content

Download everything that needs to be downloaded, so that your directory matches the layout described below.

Pitfall avoidance 1: after the initial git clone, don't rush to run webui-user.bat

1.1 It is recommended to write the git clone command as follows

The reason is to avoid errors such as 443 or OpenSSL failures when you later cd into a folder and use git bash. Online discussions blame network speed or the network environment, but permission problems can also cause them.

git clone git address "copied to local path address\folder name"

For example mine:

git clone https://github.com/lshqqytiger/stable-diffusion-webui-directml.git "D:\SD"

1.2 Edit the third line of webui-user.bat as follows (because an AMD card is used)

set COMMANDLINE_ARGS=--medvram --skip-torch-cuda-test --no-half --precision full --use-cpu all
or
set COMMANDLINE_ARGS=--precision full --no-half --opt-sub-quad-attention --lowvram --disable-nan-check --autolaunch

1.3 Do not use the v1.5 SD model

Put a different model into the models\Stable-diffusion directory. This prevents the situation where, after a long installation, the script complains that there is no model and then downloads one itself.

Error message:

something went wrong

If you see that webui-user.bat still downloads the v1.5 SD model by itself while running, delete that model afterwards.

Instead, go to this download address https://cyberes.github.io/stable-diffusion-models/#stable-diffusion-1-4 . I recommend downloading it with Thunder (Xunlei), and after the download put the file in the folder mentioned above.
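To avoid the "no model" surprise, you can check whether a checkpoint is actually in place before launching. A minimal sketch, assuming the D:\SD layout from this guide (find_models is a hypothetical helper of mine, not part of webui):

```python
from pathlib import Path

def find_models(model_dir):
    """Return checkpoint filenames found in the models\\Stable-diffusion folder."""
    exts = {".ckpt", ".safetensors"}
    return sorted(p.name for p in Path(model_dir).iterdir() if p.suffix in exts)

# Warn before running webui-user.bat if nothing is there.
model_dir = Path(r"D:\SD\models\Stable-diffusion")
if model_dir.is_dir() and not find_models(model_dir):
    print("No model found - webui will try to download v1.5 by itself.")
```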

1.4 Download the Rust compiler

You may be confused here: aren't we supposed to be using Python?

I don't know why either; just install it, otherwise transformers cannot be installed later.

Since this is Windows, go to the official website and use the recommended download method: https://www.rust-lang.org/tools/install .

Run the downloaded installer and enter 1 (the default installation).

Test whether the installation succeeded (for example, run rustc --version in a new terminal).

Pitfall avoidance 2: everything must be downloaded with git clone; do not download archives manually

This is because git clone automatically records a commit hash when it downloads. I don't know exactly what it is used for; it works something like a key. Files downloaded manually won't have this hash, and webui-user.bat will report an error when it runs.

The error is similar to:

reference is not a tree: 24268930bf1dce879235a7fddd0b2355b84d7ea6
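If you do hit this error, the hash it names tells you which repository is out of sync, and re-cloning that repository fixes it. As an illustration, the hash can be pulled out of the log like this (missing_commit is a hypothetical helper of mine, not part of webui):

```python
import re

def missing_commit(log_text):
    """Extract the commit hash from a git 'reference is not a tree' error, if any."""
    m = re.search(r"reference is not a tree: ([0-9a-f]{7,40})", log_text)
    return m.group(1) if m else None

print(missing_commit("reference is not a tree: 24268930bf1dce879235a7fddd0b2355b84d7ea6"))
```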

2.1 After completing the steps above, run webui-user.bat

Wait for a while; when the command-line tool shows that it is installing GFPGAN, press Ctrl+C to stop the batch script. By this point a \venv\Scripts path will have been generated.

2.2 The git clone commands and corresponding operations are as follows (adjust for your own installation directory)

Use the git bash tool:

git clone https://github.com/TencentARC/GFPGAN.git "D:\SD\venv\Scripts\GFPGAN"

Then cd to D:\SD\venv\Scripts\GFPGAN on the command line and enter the following commands.

D:\SD\venv\Scripts\python.exe -m pip install basicsr facexlib 
D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
D:\SD\venv\Scripts\python.exe setup.py develop
D:\SD\venv\Scripts\python.exe -m pip install realesrgan

Then download open_clip, again using git bash:

git clone https://github.com/mlfoundations/open_clip "D:\SD\venv\Scripts\open_clip"

On the command line, cd to D:\SD\venv\Scripts\open_clip and enter the following command.

D:\SD\venv\Scripts\python.exe setup.py build install

The same operation applies to CLIP:

git clone https://github.com/openai/CLIP.git "D:\SD\venv\Scripts\CLIP"

Go to the corresponding directory and enter the following command

D:\SD\venv\Scripts\python.exe -m pip install ftfy regex tqdm
D:\SD\venv\Scripts\python.exe setup.py build install

2.3 git clone the repositories and install their dependencies

The operations are the same as in 2.2; the corresponding commands are listed below. If there is no repositories folder, create one yourself:

git clone https://github.com/Stability-AI/stablediffusion.git "D:\SD\repositories\stable-diffusion-stability-ai" 

git clone https://github.com/CompVis/taming-transformers.git "D:\SD\repositories\taming-transformers" 

git clone https://github.com/crowsonkb/k-diffusion.git "D:\SD\repositories\k-diffusion"

git clone https://github.com/sczhou/CodeFormer.git "D:\SD\repositories\CodeFormer" 

git clone https://github.com/salesforce/BLIP.git "D:\SD\repositories\BLIP"

Then go into each directory cloned above and check whether it contains a requirements.txt file. If it does, open a command line in that directory and execute the following command:

D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
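Rather than checking each directory by hand, that loop can be scripted. A sketch assuming the D:\SD layout from this guide (the helper names are mine):

```python
import subprocess
from pathlib import Path

def find_requirements(repos_dir):
    """List every <repo>/requirements.txt under the repositories folder."""
    return sorted(Path(repos_dir).glob("*/requirements.txt"))

def install_all(repos_dir, python_exe=r"D:\SD\venv\Scripts\python.exe"):
    """Run `pip install -r requirements.txt` inside each cloned repository."""
    for req in find_requirements(repos_dir):
        subprocess.check_call(
            [python_exe, "-m", "pip", "install", "-r", str(req)],
            cwd=req.parent,
        )
```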

2.4 A troublesome tokenizers error when pip installs transformers

Solution: it is a network problem; just retry several times (this step took me almost 3 to 4 hours).

Although the Rust compiler was already installed earlier, pip still reports various strange errors, such as the one below.

Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [62 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-310
      creating build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
      creating build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
      creating build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      creating build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      creating build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\visualizer.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\tools\visualizer-styles.css -> build\lib.win-amd64-cpython-310\tokenizers\tools
      running build_ext
      running build_rust
      cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --
      warning: unused manifest key: target.x86_64-apple-darwin.rustflags
          Updating crates.io index
      warning: spurious network error (2 tries remaining): failed to send request: 鎿嶄綔瓒呮椂
      ; class=Os (2)
      warning: spurious network error (1 tries remaining): failed to send request: 鎿嶄綔瓒呮椂
      ; class=Os (2)
      error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3 (C:\Users\LENOVO\AppData\Local\Temp\pip-install-bw47mt33\tokenizers_3b1650888a634bdab4e4c98d147e7833)`

      Caused by:
        failed to load source for dependency `env_logger`

      Caused by:
        Unable to update registry `crates-io`

      Caused by:
        failed to fetch `https://github.com/rust-lang/crates.io-index`

      Caused by:
        failed to send request: 鎿嶄綔瓒呮椂
        ; class=Os (2)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers

Later, I looked up many answers online and found they were not consistent at all. So I asked ChatGPT, and its answer is below:

In plain terms: ChatGPT doesn't know why this happens either.

Note:

In fact, it is a network problem. It has nothing to do with your tool versions. Try a few more times, or look for a faster download channel.
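"Try a few more times" can itself be automated. A minimal retry sketch (the helper name is mine; it wraps whatever install step keeps timing out):

```python
import subprocess
import sys
import time

def retry(fn, attempts=5, delay=10.0):
    """Call fn until it succeeds; network timeouts like the one above are often transient."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(delay * (i + 1))  # back off a little longer each time

# Example: keep retrying the pip install that fails on tokenizers.
# retry(lambda: subprocess.check_call(
#     [sys.executable, "-m", "pip", "install", "transformers"]))
```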


The error message indicates that building the wheel for the tokenizers package failed during installation. The failure occurs while running the cargo rustc command, which compiles the Rust code.

The specific error message is as follows:

error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3`

This error indicates that the build process was unable to fetch env_logger, a dependency required by the tokenizers package.

The error further states the following:

Caused by:
  Unable to update registry `crates-io`

Caused by:
  failed to fetch `https://github.com/rust-lang/crates.io-index`

Caused by:
  failed to send request: 鎿嶄綔瓒呮椂
; class=Os (2)

These messages indicate a problem updating the registry and fetching the required dependencies. The garbled text 鎿嶄綔瓒呮椂 in failed to send request is mojibake for the Chinese 操作超时 ("operation timed out"), pointing to a network problem or a temporary issue with the crates.io index.
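As a side note, those unreadable characters are not random: they are the UTF-8 bytes of a Chinese Windows error message misdecoded as GBK. Round-tripping them recovers the original text:

```python
# The cargo log came from a Chinese-locale Windows console, so UTF-8
# error text was misdecoded as GBK. Reversing the misdecoding:
garbled = "鎿嶄綔瓒呮椂"
recovered = garbled.encode("gbk").decode("utf-8")
print(recovered)  # 操作超时, i.e. "operation timed out"
```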

Here are a few steps to resolve this issue:

  1. Check your network connection: Make sure you have a stable internet connection and can access external resources.

  2. Retry the installation: Sometimes network problems can be temporary. Try running the install command again to see if the problem is resolved.

  3. Clear pip cache: pip caches packages to improve installation speed. However, cache files can sometimes cause problems. You can try clearing the pip cache by running:

    pip cache purge
    
  4. Specify another package source: you can try installing the package from a different index. For example, use the --index-url option to specify another package index:

    pip install --index-url https://pypi.python.org/simple/ tokenizers
    
  5. Upgrading pip and setuptools: Outdated versions of pip and setuptools can sometimes cause problems. Try running the following commands to upgrade them:

    pip install --upgrade pip
    pip install --upgrade setuptools
    
  6. Install Rust: tokenizersThe package depends on Rust. Make sure Rust is installed and up to date on your system. You can install Rust by following the instructions on the official Rust website (https://www.rust-lang.org/tools/install).

If the problem persists after trying these steps, it is recommended to search for similar issues or error messages related to the tokenizers package to see if there is a known solution or workaround. Check the forums or the project's issue tracker.

Pitfall avoidance 3: run webui-user.bat normally and test

If it inexplicably downloads the v1.5 SD model by itself, delete that model afterwards. Normally you will get output like the following.

Don't worry if it sits at Installing requirements; it will finish after a while.

Open the URL http://127.0.0.1:7860 to check.

If the page does not show something went wrong, then it's a success!

3.1 If you get the error AttributeError: 'NoneType' object has no attribute 'process_texts'

This issue answers the question: https://github.com/vladmandic/automatic/issues/382

In fact, you started generating images before the SD model finished loading; just wait a bit and try again.
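One way to avoid clicking too early is to wait until the server actually answers on port 7860 before generating. A small sketch (wait_for_webui is a hypothetical helper of mine, not part of webui):

```python
import time
import urllib.request

def wait_for_webui(url="http://127.0.0.1:7860", timeout=300):
    """Poll the webui URL until it responds, so generation isn't started too early."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except OSError:  # connection refused / not ready yet
            time.sleep(5)
    return False
```

Note that this only tells you the web server is up; the model itself may still take a little longer to load.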

3.2 If you get RuntimeError: Could not allocate tensor with 402653184 bytes. There is not enough GPU video memory available!

That is because on an AMD card, even with 16 GB of VRAM, the settings you chose may demand more memory than is available, so the program simply gives up. The solutions are as follows:

  • Add --medvram to the COMMANDLINE_ARGS in webui-user.bat, trading time for memory
  • Use a Linux virtual machine of your own
  • Use Google Colab's free GPU
  • Use certain SD plug-ins, which trade quality for time.
    For details, refer to this post https://www.zhihu.com/question/590045937


Origin: blog.csdn.net/kokool/article/details/130888083