NLP deep learning: fixing the en_core_web_sm installation error

The allennlp package is widely used in natural language processing, but even after allennlp itself installs correctly, the spaCy model en_core_web_sm still has to be installed separately:
Normally, python -m spacy download en_core_web_sm installs it without trouble,
but it sometimes fails with a ConnectionError (433). This is caused by a bad mirror URL in the conda configuration.
How to fix it:
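Before retrying the download, it can help to confirm whether the model is already importable. A minimal check, assuming a python3 (or python) interpreter is on PATH:

```shell
# Check whether en_core_web_sm is already importable (a missing model is
# what forces the spacy download / conda install step described below).
py=$(command -v python3 || command -v python)
if "$py" -c "import en_core_web_sm" 2>/dev/null; then
  status=present
else
  status=missing
fi
echo "en_core_web_sm: $status"
```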

  • First
    find the .condarc file, which is usually a hidden file in your home directory. You can view it as follows

    1. Hiding a file or directory on Linux

    If the file does not exist yet, you can create one first:

    [root@localhost data]# touch a.txt
    Then rename it with a leading dot to make it hidden:

    [root@localhost data]# mv a.txt .a.txt
    2. Showing hidden files and directories

    You can use either of the following commands to list hidden files and directories:

    [root@localhost data]# l.
    .  ..  .201912110952  .a.txt

    [root@localhost data]# ls -d .*
    .  ..  .201912110952  .a.txt
    You can also use ll -a to list all files and directories:

    [root@localhost data]# ll -a
    total 0
    dr-xr-xr-x. 19 root root 248 Dec 11 09:50 .
    drwxr-xr-x.  2 root root   6 Dec 11 09:52 .201912110952
    -rw-r--r--.  1 root root   0 Dec 11 11:29 .a.txt

  • Then
    inspect the file with cat .condarc, and edit it (e.g. with vi) to delete the defaults line that contains the bad mirror URL.

  • Finally,
    run conda install -c conda-forge spacy-model-en_core_web_sm to install en_core_web_sm correctly.
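For reference, a cleaned-up .condarc after deleting the bad mirror line might look like the following; the channel list shown here is an assumption for illustration, so keep whatever working channels you already had:

```yaml
# ~/.condarc — hypothetical example after removing the broken mirror line.
channels:
  - conda-forge
show_channel_urls: true
```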
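The touch/mv trick from the first step above can be sketched end-to-end in a throwaway directory (the paths here are temporary and purely illustrative):

```shell
# Demonstrate hiding a file by renaming it with a leading dot.
tmpdir=$(mktemp -d)          # scratch directory so nothing real is touched
cd "$tmpdir"
touch a.txt                  # create an ordinary, visible file
mv a.txt .a.txt              # a leading dot hides it from plain `ls`
plain_ls=$(ls)               # should be empty: .a.txt is hidden
hidden_count=$(ls -a | grep -c '^\.a\.txt$')   # but `ls -a` still shows it
echo "plain ls: '$plain_ls'  hidden entries found: $hidden_count"
cd - >/dev/null && rm -rf "$tmpdir"
```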

Origin blog.csdn.net/liqiang12689/article/details/106417696