PrivateGPT (deployment and usage experience)


Foreword

An open-source project called PrivateGPT has recently appeared on GitHub. It aims to provide a new chat tool for users who deal with sensitive data, classified information, or personal privacy concerns. PrivateGPT gives users complete control over their data: they can interact with a powerful language model entirely in a local environment, without uploading data to the Internet or sharing it with anyone. Better still, PrivateGPT is free and open source, so anyone can download it from GitHub, benefit from its capabilities, and contribute to its improvement and development.

At present there are two similar projects on GitHub; their addresses are as follows:
imartinez/privateGPT
SamurAIGPT/privateGPT
Both projects use the same model, ggml-gpt4all-j-v1.3-groovy.bin, so in theory the results should be similar. The second one, however, comes with a visual interface, yes, an actual front end, so as a self-respecting front-end developer I naturally chose it.


1. Deployment

I deployed it on an Ubuntu 18.04 server. If you do not have a Python environment yet, you can read my article ChatGLM-6B (introducing related concepts, basic environment construction and deployment), which walks through setting up a Python environment in detail. Next, let's officially start building privateGPT.
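Before starting, it is worth confirming the basic tool versions. Below is a minimal sanity check I would run; the expected versions are simply the ones this walkthrough assumes (Ubuntu 18.04, Python 3.8, plus Node.js/npm for the client):

# Confirm OS, Python and Node versions before building
lsb_release -d          # expecting Ubuntu 18.04
python3 --version       # 3.8.x is recommended, see Section 2.1
node -v && npm -v       # required for the client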

1. Clone the project

git clone https://github.com/SamurAIGPT/privateGPT

2. Install dependencies

# Go to client folder and run the below commands
npm install
npm run dev
# Go to server folder and run the below commands 
# This step has pitfalls; see Section 2 below for how to get past them
pip install -r requirements.txt
python privateGPT.py

3. View the project

Open 127.0.0.1:3000 directly; the interface looks like this:
[Screenshot: the PrivateGPT web interface]
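If the page does not load, a quick check like the sketch below (port 3000 is just the client dev-server port used above) confirms whether the front end is actually listening:

# Verify that the client dev server responds on port 3000
curl -I http://127.0.0.1:3000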

2. Problems encountered in deployment

1. Python version selection

Stick to a Python 3.8 environment. Although the README.md says "Requirements Python 3.8 or later", the server throws an error on startup under Python 3.10.
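If your default interpreter is newer, a dedicated 3.8 environment avoids the problem. Here is a minimal sketch assuming conda is available (a plain venv on a system Python 3.8 works just as well):

# Create and activate a dedicated Python 3.8 environment for the server
conda create -n privategpt python=3.8 -y
conda activate privategpt
python --version   # should report 3.8.x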

2. Python package build errors

......
Failed to build hnswlib llama-cpp-python
ERROR: Could not build wheels for hnswlib, llama-cpp-python, which is required to install pyproject.toml-based projects

Solution:
Remove llama-cpp-python==0.1.50 and pyllamacpp==2.3.0 from requirements.txt, then run the pip install again.
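A sketch of that workaround (check the exact pins in your copy of requirements.txt before deleting anything):

# Drop the two packages that fail to build wheels, then reinstall the rest
sed -i '/^llama-cpp-python/d;/^pyllamacpp/d' requirements.txt
pip install -r requirements.txt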

3. Error when running python privateGPT.py

 /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found

Solution:
Install glibc 2.29. Be very careful here: glibc is a core Linux library, and one careless step can bring the whole system down. The steps below are for reference only.

# Download and build glibc 2.29
wget -c https://ftp.gnu.org/gnu/glibc/glibc-2.29.tar.gz
tar -zxvf glibc-2.29.tar.gz
mkdir glibc-2.29/build
cd glibc-2.29/build
../configure --prefix=/opt/glibc
# This step may fail with an error like "Makeconfig:42: *** missing separator.";
# that usually means the previous step failed because some packages are too old, and updating those two packages with apt fixes it
make
make install
# Create the symlink
cp /opt/glibc/lib/libm-2.29.so /lib/x86_64-linux-gnu/
cd /lib/x86_64-linux-gnu && ln -sf libm-2.29.so libm.so.6
# Check the result
strings /lib/x86_64-linux-gnu/libm.so.6 | grep GLIBC_

[Screenshot: strings output showing GLIBC_2.29 present in libm.so.6]
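Because this swaps out libm.so.6 directly, it is worth knowing how to back out. The rollback sketch below assumes the stock Ubuntu 18.04 libm-2.27.so is still in place (verify the exact filename on your own system first):

# See what the symlink points to and which libm versions are available
ls -l /lib/x86_64-linux-gnu/libm.so.6 /lib/x86_64-linux-gnu/libm-*.so
# If anything breaks, point the symlink back at the original library
cd /lib/x86_64-linux-gnu && ln -sf libm-2.27.so libm.so.6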

4. The front end cannot be accessed over the LAN

I put the project on a server and then opened the page from my local machine, only to find that the back-end API could not be reached.
Solution: replace http://localhost in client/components/MainContainer.js and client/components/ConfigSideNav.js with the server's IP address.
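The replacement can be done by hand, or with a one-liner like the sketch below (192.168.1.100 is just a placeholder for your server's IP):

# Point the client at the server's IP instead of localhost
sed -i 's|http://localhost|http://192.168.1.100|g' \
  client/components/MainContainer.js client/components/ConfigSideNav.js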


3. Usage experience

I uploaded the project's own README.md and then started asking questions. At first I asked in Chinese, but found that it is not supported; the server reported the following error:
[Screenshot: server error when asking in Chinese]
I then asked two questions drawn from the document:

1. how to run
2. what the Requirements in the docs

Both questions come straight from the document, whose relevant content is shown below:
[Screenshot: the relevant section of README.md]

The answers are shown below; judge for yourself:
[Screenshot: PrivateGPT's answers]
I am not sure whether I am using it the wrong way, but these answers feel way off the mark. If anyone knows what the problem is, I would appreciate some advice.


Summary

Overall, I think the current PrivateGPT does not perform well even on simple questions, far worse than ChatGPT. Still, the concept behind PrivateGPT is in line with the current trend, and with continued improvement it should make a qualitative leap.

You are welcome to correct me at any time.
