langchain: Prompt in hand, I have the world

Introduction

Prompts are the input to large language models, and they are a powerful tool for building applications on top of them. As the saying goes: there are no bad large language models, only bad prompts.

Only by writing prompts well can you unleash the full power of a large language model.

In practice, writing good prompts is not that easy, but langchain makes it much more manageable. Let's take a look.

Writing good prompts

Sometimes it's not that the language model we use isn't good enough, but that the prompts we write aren't good enough.

Here are some principles for writing prompts for large language models:

  1. Specific and detailed: Prompts should pose a clear question or task and contain enough detail and background information for the large language model to understand and answer.

  2. Understandable and answerable: Prompts should be clear and unambiguous so that large language models can understand and answer them. Avoid language that is too abstract, vague, or offensive.

  3. Contextual and informative: Prompts should contain enough context and background information to let the large language model grasp the importance and meaning of the question and provide meaningful information in the answer.

  4. Goal-oriented: Prompts should make the goal and direction of the question or task explicit so that the large language model can provide clear, useful answers with the required information.

  5. Extensible and customizable: Prompts should be designed to be easily extended and customized to suit different application scenarios and user needs.

In many similar scenarios, the general structure of our prompts is the same; only the specific details differ. This is where prompt templates come in.

What is a prompt template?

A prompt template is a reusable skeleton for prompts. With a prompt template, we can quickly generate many concrete prompts from a single pattern.

Basically, the prompt template already describes the scene and the task for us. We just need to fill in the specifics.

Here is a simple example of a prompt template:

from langchain import PromptTemplate


template = """\
Suppose you are a wealth manager at a financial firm. Please analyze the stock {stock}.
"""

prompt = PromptTemplate.from_template(template)
prompt.format(stock="Tencent Holdings")

Suppose you are a wealth manager at a financial firm. Please analyze the stock Tencent Holdings.

This way, the user only needs to enter the name of the stock to ask about; the rest of the boilerplate text is supplied by the template, which greatly reduces the effort of constructing prompts.

Of course, this is just a very simple example. You can also fix the format of the answer in the prompt template, provide concrete examples, and so on, to get better replies.
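For instance, a template can pin down the answer format and even embed a worked example (few-shot prompting). The sketch below builds such a template with plain str.format so it runs without langchain installed; the same template string could be passed to PromptTemplate.from_template unchanged. The bullet-point format and the ACME example are illustrative assumptions, not from the original post.

```python
# A richer template: it fixes the answer format and embeds a worked example.
# Built with plain str.format so the sketch is self-contained; the identical
# string works with PromptTemplate.from_template.
template = (
    "Suppose you are a wealth manager at a financial firm. "
    "Please analyze the stock {stock}.\n"
    "Answer with exactly three bullet points: business, risks, outlook.\n"
    "Example answer for the stock ACME:\n"
    "- Business: cloud infrastructure provider\n"
    "- Risks: heavy capital expenditure\n"
    "- Outlook: stable\n"
)

prompt = template.format(stock="Tencent Holdings")
print(prompt)
```

The model now sees both the required structure and a sample answer, which typically yields far more consistent replies than the bare one-liner above.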

Create prompt template in langchain

To put it simply, a prompt template is a parameterized input format. In langchain, the corresponding tool class is called PromptTemplate.

In the simple example above, we have roughly seen how to use PromptTemplate.

In the above example, we called the PromptTemplate.from_template method and passed in a template string.

In the template string, we define a variable using curly braces. Finally, we call the prompt.format method with the variable's name and value to produce the final prompt.

In addition, multiple variables can be specified in the prompt template:

template = "Please tell me a {thingsB} about {personA}"

prompt_template = PromptTemplate.from_template(template)
prompt_template.format(personA="Xiao Zhang", thingsB="story")

Just specify the variable name in format.
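Note that from_template also infers input_variables automatically from the curly-brace placeholders. The stdlib sketch below mimics that inference so the behavior can be seen without langchain installed; it is an approximation for illustration, not langchain's own code.

```python
from string import Formatter

template = "Please tell me a {thingsB} about {personA}"

# Mimic how PromptTemplate.from_template discovers variable names:
# Formatter().parse yields (literal, field_name, format_spec, conversion)
# tuples, and the non-empty field names are exactly the template's variables.
variables = [field for _, field, _, _ in Formatter().parse(template) if field]
print(variables)  # → ['thingsB', 'personA']
```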

In addition to using the PromptTemplate.from_template method, we can also directly use the PromptTemplate constructor to create prompts.

The constructor of PromptTemplate accepts two parameters: input_variables and template.

input_variables is the list of variable names used in the template.

template is the template string itself.

For example, we can construct a variableless template:

no_input_prompt = PromptTemplate(input_variables=[], template="This is a template with no parameters.")
no_input_prompt.format()

We can also construct templates with parameters:

one_input_prompt = PromptTemplate(input_variables=["stock"], template="Suppose you are a wealth manager at a financial firm. Please analyze the stock {stock}.")
one_input_prompt.format(stock="Tencent Holdings")

There are also templates with multiple parameters:

multiple_input_prompt = PromptTemplate(
    input_variables=["personA", "thingsB"], 
    template="Please tell me a {thingsB} about {personA}"
)
multiple_input_prompt.format(personA="Xiao Zhang", thingsB="story")
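Because the template is fixed and only the variable values change, one template can mass-produce prompts, e.g. one per input row. A small sketch using plain str.format (the same substitution that format performs above); the input rows are made-up examples:

```python
template = "Please tell me a {thingsB} about {personA}"

# One template, many prompts: just vary the variable values per row.
inputs = [
    {"personA": "Xiao Zhang", "thingsB": "story"},
    {"personA": "Xiao Wang", "thingsB": "joke"},
]
prompts = [template.format(**row) for row in inputs]
print(prompts[1])  # → Please tell me a joke about Xiao Wang
```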

Chat-specific prompt template

I mentioned before, when introducing langchain, that although chat models are built on LLMs, they still differ from basic LLMs.

The main difference is that chat messages are tagged with roles. In openai, for example, a chat message's role can be AI, human, or system.

Although this adds a bit of complexity, it classifies the messages more cleanly.

Let’s take a look at the PromptTemplates for chat in langchain:

from langchain.prompts import (
    ChatPromptTemplate,
    PromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

Like ordinary prompt templates, we can call from_template on the various MessagePromptTemplate classes to create the corresponding prompts:

template="Your role is now {role}; please continue the conversation in that role."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template="{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

Of course, you can also create prompts through the constructor:

prompt=PromptTemplate(
    template="Your role is now {role}; please continue the conversation in that role.",
    input_variables=["role"],
)

Once you have one or more MessagePromptTemplates, you can use them to build a ChatPromptTemplate:

chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

chat_prompt.format_prompt(role="doctor", text="Can you take a look and tell me if I look good?").to_messages()
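to_messages() yields a system message followed by a human message. When such a list is sent to the OpenAI chat API, it is ultimately serialized into role-tagged dicts. The sketch below hand-builds that shape as an approximation for illustration; format_chat is a hypothetical helper, not a langchain or openai function.

```python
SYSTEM_TEMPLATE = "Your role is now {role}; please continue the conversation in that role."

def format_chat(role: str, text: str) -> list:
    # Approximates ChatPromptTemplate.format_prompt(...).to_messages(),
    # rendered as the role-tagged dicts the OpenAI chat API expects.
    return [
        {"role": "system", "content": SYSTEM_TEMPLATE.format(role=role)},
        {"role": "user", "content": text},
    ]

messages = format_chat("doctor", "Can you take a look and tell me if I look good?")
print(messages[0]["role"])  # → system
```

The system message carries the role-playing instruction while the user message carries the actual question, which is exactly the separation the two MessagePromptTemplates above encode.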

Summary

Okay, that covers the basic prompt templates in langchain. Go ahead and give them a try.

Origin blog.csdn.net/superfjj/article/details/131653231