FastChat local installation and deployment: your own private ChatGPT

Foreword: this is my first post, written to record a few late nights of experimentation in the hope that it helps other AI enthusiasts. Ever since I saw the Stanford ALPACA project I have wanted to deploy an AI chat system locally, but none of the sites I searched through described the FastChat deployment end to end, so I wrote this article.

1 Prerequisites: an NVIDIA GPU with 24 GB of VRAM or more, 16 GB of RAM, and an Intel i5-class CPU or better

2 Operating system: Ubuntu 22.04 https://ubuntu.com/download/desktop/thank-you?version=22.04.2&architecture=amd64

3 Install the NVIDIA driver:

3.1 Select the NVIDIA driver that ships with the system (via Ubuntu's "Additional Drivers" tool), or

3.2 Download the latest official driver from NVIDIA's website

3.3 Run nvidia-smi to confirm that the driver installed successfully (see the sketch below)
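
A minimal sketch of doing the same from the terminal, assuming a standard Ubuntu 22.04 install where the ubuntu-drivers tool is available:

ubuntu-drivers devices            # list the drivers Ubuntu recommends for the detected GPU
sudo ubuntu-drivers autoinstall   # install the recommended proprietary driver, then reboot
nvidia-smi                        # should print the GPU model, driver version and VRAM usage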

4 Installation environment:

sudo apt update
sudo apt install tmux htop

wget https://repo.anaconda.com/archive/Anaconda3-2022.10-Linux-x86_64.sh
bash Anaconda3-2022.10-Linux-x86_64.sh
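
After the Anaconda installer finishes, the conda command may not be on the PATH of the current shell yet; a quick check, assuming the default install location and that you let the installer initialize your shell:

source ~/.bashrc      # reload the shell profile so conda is picked up
conda --version       # should print the installed conda version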

conda create -n fastchat python=3.9
conda activate fastchat


git clone https://github.com/lm-sys/FastChat.git   
# If the clone fails, use https://ghproxy.com/https://github.com/lm-sys/FastChat.git instead
cd FastChat
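
From here, the remaining steps are, as a rough sketch based on the FastChat README (the model name lmsys/vicuna-7b-v1.5 is only an example; substitute any weights that fit in your GPU's memory):

pip3 install --upgrade pip
pip3 install -e .     # install FastChat and its dependencies into the fastchat environment

# simplest test: single-GPU command-line chat (downloads the weights on first run)
python3 -m fastchat.serve.cli --model-path lmsys/vicuna-7b-v1.5

# for the ChatGPT-style web UI, run each of these in its own tmux pane:
python3 -m fastchat.serve.controller
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5
python3 -m fastchat.serve.gradio_web_server   # then open the printed local URL in a browser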
