An introduction to ChatGPT-related open source projects

Category description

Small models: these can be trained and deployed on consumer GPUs such as the RTX 3090 and 4090, with capabilities that approach those of OpenAI's large ChatGPT model on some tasks.
Small models can be fine-tuned on your own domain-specific datasets to build distinctive, specialized capabilities rather than broad ones. Building vertical applications on top of small models is also a good approach.
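A quick back-of-the-envelope calculation shows why consumer GPUs like the 3090/4090 (24 GB of VRAM) can hold a small model's weights. The numbers below are illustrative assumptions (weights only, ignoring activations and optimizer state), not measurements:

```python
def weight_memory_gb(n_params_billion, bytes_per_param):
    """GiB needed just to hold the model weights in GPU memory."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model in fp16 (2 bytes per parameter):
fp16_gb = weight_memory_gb(7, 2)    # roughly 13 GiB, fits a 24 GB card
# The same model quantized to 4 bits (0.5 bytes per parameter):
int4_gb = weight_memory_gb(7, 0.5)  # roughly 3.3 GiB
```

Training needs considerably more memory than inference (gradients and optimizer state), which is why fine-tuning on a single consumer GPU usually relies on tricks like quantization or low-rank adapters.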

ChatGPT front-end alternatives: besides using ChatGPT on OpenAI's official website, some developers have built desktop and web clients. These are not simple copies: they add features such as speech recognition, voice playback, and prompt assistance. Some of these applications are built on OpenAI's official gpt-3.5-turbo API, while others are based on reverse engineering of the native ChatGPT web interface.

ChatGPT prompts: projects about prompt engineering; learning to write good prompts helps you communicate with ChatGPT more effectively.
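Clients built on the official gpt-3.5-turbo API essentially boil down to an HTTPS POST to OpenAI's chat completions endpoint. Below is a minimal, standard-library-only sketch; the helper names (`build_payload`, `ask`) are illustrative and not taken from any of the projects mentioned here:

```python
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(question, system_prompt="You are a helpful assistant."):
    """Assemble the JSON body the chat completions API expects."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    }

def ask(question, api_key):
    """Send one question and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(question)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Features such as voice input or prompt templates are layers on top of this same request; reverse-engineered clients instead mimic the browser traffic of the ChatGPT web page.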

Others: there are many other interesting open source projects on GitHub; some emulate ChatGPT's plugin functionality, and some can parse PDFs. New projects keep springing up, and this article will be updated continuously.

Small models

LLaMA

Project address:
https://github.com/facebookresearch/llama
LLaMA, an open-source language model from Facebook (Meta) research, outperforms GPT-3 on most tasks. Its model sizes range from a minimum of 7B parameters up to 65B, far smaller than GPT-3's 175B parameters.

Chinese-ChatLLaMA

Project address:
https://github.com/ydli-ai/Chinese-ChatLLaMA
This project provides the community with a Chinese dialogue model based on LLaMA.

Origin: blog.csdn.net/artistkeepmonkey/article/details/130215103