Install the ChatGPT-pandora project with the Baota (Pagoda) panel and Docker: a tutorial on running it directly with your access token

This tutorial uses an Alibaba Cloud lightweight cloud server as an example.

First, open port 3002 in the firewall from the lightweight cloud server's management console.
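
If the operating system on the server also runs its own firewall, the port has to be opened there as well. A minimal sketch, assuming a firewalld-based system (skip this if only the cloud console firewall is in use):

# open 3002/tcp in the OS firewall, then reload the rules
firewall-cmd --permanent --add-port=3002/tcp
firewall-cmd --reload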

Then log in to the Baota (Pagoda) panel and click Docker; if the panel prompts that Docker is not installed, install it.
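
If you would rather install Docker from the command line than through the panel, a minimal sketch using Docker's official convenience script (assumes the server has outbound internet access):

# install Docker with the official convenience script and start the service
curl -fsSL https://get.docker.com | bash
systemctl enable --now docker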

Then click on the Compose template and write:

version: '3'

services:
  app:
    image: pengzhile/pandora  # always uses latest; to update, just pull this tag again
    ports:
      - 3002:3002
    environment:  # choose one of the two
      PANDORA_SERVER: 0.0.0.0:3002
      PANDORA_ACCESS_TOKEN: eyxxxx  # put your account's access_token here
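
As the comment in the template notes, the image tag is latest, so updating is just a matter of pulling again and recreating the container. A minimal sketch, assuming you run it from the directory where the panel stored this compose file (the exact path depends on your Baota setup):

# pull the newest pengzhile/pandora image and recreate the container
# (older installs use the docker-compose command instead of docker compose)
docker compose pull
docker compose up -d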

Note on the PANDORA_ACCESS_TOKEN parameter:

You need to log in to the OpenAI website in advance, then visit https://chat.openai.com/api/auth/session to get the access_token.
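
That page returns a block of JSON, and the token is the long string beginning with ey. A minimal sketch of extracting it, assuming the JSON was saved to a file named session.json (the file name is just an example), jq is installed, and the field is exposed as accessToken:

# print the access token from the saved session JSON
jq -r '.accessToken' session.json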



Add the template in the panel as shown.

Then click Compose and click Add Project. Select the Compose template (it is selected automatically), enter a name, and click Add; the image will be downloaded.

Next, add a container.

Start command:

docker run -e PANDORA_CLOUD=cloud -e PANDORA_SERVER=0.0.0.0:3002 -p 3002:3002 -d pengzhile/pandora
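
To confirm the container actually came up, a quick check from the shell using standard Docker commands (the container ID or name shown by docker ps will vary):

# list running containers, then follow the Pandora container's logs
docker ps
docker logs -f <container ID from docker ps>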

After installation, you can see in the container list that the project is running.

At this point, you can access and use it in a browser at http://<server IP>:3002.
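
A quick way to verify from the server itself that the service is listening, assuming curl is installed (adjust the address if you changed the port mapping):

# print the HTTP status code returned by the Pandora web service
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:3002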
