Using the OpenAI Chat Model with LangChain

This notebook describes how to get started with the OpenAI chat model.

Sample code:

from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.schema import AIMessage, HumanMessage, SystemMessage
# temperature=0 makes the model's replies as deterministic as possible
chat = ChatOpenAI(temperature=0)

The sample code above assumes that your OpenAI API key is already set in the OPENAI_API_KEY environment variable. If you want to specify the API key and/or organization ID explicitly, use the following code:

chat = ChatOpenAI(temperature=0, openai_api_key="YOUR_API_KEY", openai_organization="YOUR_ORGANIZATION_ID")

If you are not part of an OpenAI organization, simply omit the openai_organization parameter.
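
If you prefer to keep credentials out of your code entirely, you can set them through environment variables before constructing the model. A minimal sketch, assuming the standard OPENAI_API_KEY and OPENAI_ORGANIZATION variable names:

import os

# Assumes the standard OpenAI environment variable names; adjust to your setup.
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"
os.environ["OPENAI_ORGANIZATION"] = "YOUR_ORGANIZATION_ID"  # optional
chat = ChatOpenAI(temperature=0)  # picks the key up from the environment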

messages = [
    SystemMessage(
        content="You are a helpful assistant that translates English to French."
    ),
    HumanMessage(
        content="Translate this sentence from English to French. I love programming."
    ),
]
chat(messages)
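
The call returns an AIMessage containing the model's reply. A minimal sketch of reading it (the result name is introduced here just for illustration, and the exact French wording may vary):

result = chat(messages)
print(result.content)  # e.g. "J'adore la programmation."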

You can make use of templating by using a MessagePromptTemplate.

You can build a ChatPromptTemplate from one or more MessagePromptTemplates.

You can use ChatPromptTemplate's format_prompt method; this returns a PromptValue, which you can convert to a string or to Message objects, depending on whether you want to use the formatted value as input to an LLM or to a chat model (both conversions are shown below).

For convenience, a from_template method is exposed on the template. If you were to use this template, it would look like this:

template = (
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, human_message_prompt]
)

# get a chat completion from the formatted messages
chat(
    chat_prompt.format_prompt(
        input_language="English", output_language="French", text="I love programming."
    ).to_messages()
)
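
Because format_prompt returns a PromptValue, the same prompt can also be rendered as a plain string (useful as input to a completion-style LLM) rather than as chat messages. A minimal sketch, using a prompt_value variable introduced here just for illustration:

prompt_value = chat_prompt.format_prompt(
    input_language="English", output_language="French", text="I love programming."
)
print(prompt_value.to_string())    # flattened single-string form
print(prompt_value.to_messages())  # list of SystemMessage / HumanMessage objects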

end!

Origin blog.csdn.net/engchina/article/details/131873420