Error installing Hugging Face tokenizers on macOS with an M1 chip

The error

Installing Hugging Face tokenizers on macOS with an M1 chip fails with the following output:

Building wheels for collected packages: pyyaml, tokenizers
  Building wheel for pyyaml (pyproject.toml) ... done
  Created wheel for pyyaml: filename=PyYAML-6.0-cp38-cp38-macosx_11_0_arm64.whl size=45335 sha256=e27236fa2771f8d6ffbba947c48931a8fcf95ad33d77b91d4c693d04d5344710
  Stored in directory: /Users/soulteary/Library/Caches/pip/wheels/fe/be/21/a238a4532fd03d32998d6a07c6b4f572ea8cb4eaa89ddc2a41
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [51 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.macosx-11.1-arm64-cpython-38
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers
      copying py_src/tokenizers/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/models
      copying py_src/tokenizers/models/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/models
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/decoders
      copying py_src/tokenizers/decoders/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/decoders
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/normalizers
      copying py_src/tokenizers/normalizers/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/normalizers
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/pre_tokenizers
      copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/pre_tokenizers
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/processors
      copying py_src/tokenizers/processors/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/processors
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/trainers
      copying py_src/tokenizers/trainers/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/trainers
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/implementations
      creating build/lib.macosx-11.1-arm64-cpython-38/tokenizers/tools
      copying py_src/tokenizers/tools/__init__.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/tools
      copying py_src/tokenizers/tools/visualizer.py -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/tools
      copying py_src/tokenizers/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers
      copying py_src/tokenizers/models/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/models
      copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/decoders
      copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/normalizers
      copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/pre_tokenizers
      copying py_src/tokenizers/processors/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/processors
      copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/trainers
      copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.macosx-11.1-arm64-cpython-38/tokenizers/tools
      running build_ext
      running build_rust
      error: can't find Rust compiler
      
      If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
      
      To update pip, run:
      
          pip install --upgrade pip
      
      and then retry package installation.
      
      If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Successfully built pyyaml
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
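The key line is error: can't find Rust compiler. As the message itself points out, an outdated pip can also be the cause, because it cannot pick up a prebuilt arm64 wheel; before building from source it is worth trying a pip upgrade first. The second command below assumes you were installing tokenizers directly, so substitute whatever command originally failed for you:

pip install --upgrade pip
pip install tokenizers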

Solution

The fix is simple: install Rust on macOS:

curl https://sh.rustup.rs -sSf | sh
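After the installer finishes, the toolchain lives under ~/.cargo. New shells normally pick it up automatically if you let the installer modify your shell profile, but in the terminal session you ran the installer from you may still need to load the Cargo environment file yourself; assuming the default rustup install location:

source "$HOME/.cargo/env"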

Once the installation is complete, you can run rustc --version as a quick check that the compiler is installed and executable from your PATH:
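rustc --version

If a version number prints, retry the original installation (shown here as installing tokenizers directly; substitute your own install command if tokenizers was being pulled in as a dependency):

pip install tokenizers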


Reposted from blog.csdn.net/mch2869253130/article/details/128336902