[Large language model] Quickly understand and deploy ChatGLM-6B in 10 minutes

Introduction to ChatGLM-6B

ChatGLM-6B is an open source dialogue language model released by the Knowledge Engineering Group (KEG) & Data Mining group (THUDM) at Tsinghua University. According to the official introduction, ChatGLM is a Chinese-English bilingual language model at the 100-billion-parameter scale, optimized for Chinese. This open source release is a smaller version with roughly 6.2 billion parameters, and local deployment requires as little as 6 GB of GPU memory at the INT4 quantization level.

ChatGLM-6B:https://github.com/THUDM/ChatGLM-6B
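For reference, the repository above shows that the model can be loaded and queried through Hugging Face transformers in a few lines. Below is a minimal sketch along the lines of the official usage example; the chat API and model IDs follow the README, and the INT4 checkpoint name refers to the pre-quantized variant published by the same organization:

```python
# Minimal local deployment sketch, following the usage example in the
# official ChatGLM-6B README (requires transformers and a CUDA GPU).
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "THUDM/chatglm-6b"          # full FP16 model (larger GPU memory footprint)
# MODEL_ID = "THUDM/chatglm-6b-int4"   # pre-quantized INT4 variant (roughly 6 GB of GPU memory)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True).half().cuda()
model = model.eval()

# Single-turn query; `chat` returns the reply plus the updated conversation history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```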

Capabilities of ChatGLM-6B:

  • Self-awareness: "Who are you?", "Introduce your strengths"
  • Outline writing: "Help me write a blog outline introducing ChatGLM"
  • Copywriting: "Write 10 pieces of trending review copy"
  • Information extraction: "Extract the people, time, and events from the above information" (see the multi-turn sketch after this list)
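Prompts like the information-extraction example depend on multi-turn context, which is carried by passing the returned history back into each call. A small sketch of that pattern, reusing the `model` and `tokenizer` loaded earlier (the prompt text here is purely illustrative):

```python
# Multi-turn sketch: feed the returned history back in so the model can
# refer to "the above information" in a follow-up request.
history = []
context = "ChatGLM-6B is an open source bilingual dialogue model released by THUDM at Tsinghua University."
response, history = model.chat(tokenizer, context, history=history)
response, history = model.chat(
    tokenizer,
    "Extract the organizations, time, and events from the above information.",
    history=history,
)
print(response)
```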


Original post: blog.csdn.net/ARPOSPF/article/details/131371090