Main modules of LangChain (4): Memory


1. Concept

What is LangChain?

Origin: LangChain began when Harrison Chase, talking with practitioners building complex LLM applications, noticed recurring patterns that could be abstracted. An application may need to prompt an LLM multiple times and parse its output, which otherwise means writing a lot of repetitive glue code.

LangChain makes this development process easier. Once released as open source, it was widely adopted by the community, attracting not only many users but also many contributors.

There is also a limitation of large models themselves: on their own they cannot perceive real-time data or interact with the outside world.

LangChain is a framework for developing applications powered by large language models.

Main features:

1. Data awareness: the ability to connect a language model to other data sources.

2. Agency: allows the language model to interact with its environment, calling tools to perform actions and to read and write data.

Main values:

1. It packages the functions needed for LLM development into modular components and provides many tools that are easy to use.

2. It ships ready-made chains that accomplish specific tasks, which further improves ease of use.

2. Main modules


LangChain provides standard, extensible interfaces and external integrations for the following modules, listed from least to most complex:

Model I/O

Interface with language models

Data connection

Interface with application-specific data

Chain assembly (Chains)

Construct sequences of calls

Agents

Let chains choose which tools to use based on high-level instructions

Memory

Persist application state across multiple runs of a chain

Callbacks

Record and stream the intermediate steps of any chain assembly

3. Memory

By default, chains and agents are stateless: they handle each incoming query independently (just like the underlying LLMs and chat models themselves). In some applications, such as chatbots, remembering previous interactions, whether short-term or long-term, is crucial. The Memory class does exactly this. LangChain provides memory components in two forms. First, it provides helper utilities for managing and manipulating previous chat messages; these are designed to be modular and useful regardless of how you use them. Second, it provides easy ways to plug these utilities into chains.

from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

# `model` is assumed to be an already-initialized chat model, e.g. ChatOpenAI(temperature=0)
conversation = ConversationChain(
    llm=model,
    verbose=True,
    memory=ConversationBufferMemory()
)
conversation.predict(input="Hello! I am Zhang San")


conversation.predict(input="Do you know my name?")


ConversationBufferMemory

memory = ConversationBufferMemory()
memory.save_context({"input": "Hello! I am Zhang San"},
                    {"output": "Hello, Zhang San! Nice to meet you! Do you have any questions for me?"})
memory.save_context({"input": "Do you know my name?"},
                    {"output": "Of course! Your name is Zhang San. Is there anything I can help you with?"})
memory.load_memory_variables({})

{'history': 'Human: Hello!, I am Zhang San\nAI: Hello, Zhang San! Nice to meet you! Do you have any questions for me? \nHuman: Do you know my name\nAI: Of course! Your name is Zhang San. Is there anything I can do to help you? '}

ConversationBufferWindowMemory

The parameter k controls how many recent exchanges are kept in memory.

from langchain.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(k=1)

memory.save_context({"input": "Hello! I am Zhang San"},
                    {"output": "Hello, Zhang San! Nice to meet you! Do you have any questions for me?"})
memory.save_context({"input": "Do you know my name?"},
                    {"output": "Of course! Your name is Zhang San. Is there anything I can help you with?"})
memory.load_memory_variables({})

{'history': 'Human: Do you know my name\nAI: Of course! Your name is Zhang San. Is there anything I can do to help you? '}
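The windowing behavior above can be sketched in plain Python. This is only an illustration of the idea, not LangChain's actual implementation: the last k (human, ai) exchanges are rendered into the history and everything older is dropped.

```python
def format_history(exchanges, k):
    """Render the last k (human, ai) exchanges as a Human/AI transcript."""
    recent = exchanges[-k:] if k > 0 else []
    lines = []
    for human, ai in recent:
        lines.append(f"Human: {human}")
        lines.append(f"AI: {ai}")
    return "\n".join(lines)

exchanges = [
    ("Hello! I am Zhang San", "Hello, Zhang San! Nice to meet you!"),
    ("Do you know my name?", "Of course! Your name is Zhang San."),
]
# With k=1 only the most recent exchange survives, as in the output above.
print(format_history(exchanges, k=1))
```

With k=1, the first exchange (where the name was introduced) is gone, which is why a model backed by this memory can forget earlier facts.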

ConversationTokenBufferMemory

The parameter max_token_limit caps how many tokens of recent history are retained.

from langchain.memory import ConversationTokenBufferMemory

# `llm` is assumed to be an already-initialized model; the memory uses it to count tokens
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=20)
memory.save_context({"input": "AI is what?!"},
                    {"output": "Amazing!"})
memory.save_context({"input": "Backpropagation is what?"},
                    {"output": "Beautiful!"})
memory.save_context({"input": "Chatbots are what?"},
                    {"output": "Charming!"})
memory.load_memory_variables({})

{'history': 'AI: Beautiful!\nHuman: Chatbots are what?\nAI: Charming!'}
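The eviction rule can be sketched in plain Python. This is a simplification: the real memory delegates token counting to the llm's tokenizer, which word count only stands in for here. The oldest messages are dropped until the history fits the budget.

```python
def count_tokens(text):
    return len(text.split())  # crude stand-in for the model's real tokenizer

def prune_to_budget(messages, max_tokens):
    """Drop the oldest messages until the total token count fits the budget."""
    messages = list(messages)
    while messages and sum(count_tokens(m) for m in messages) > max_tokens:
        messages.pop(0)  # evict the oldest message first
    return messages

history = [
    "Human: AI is what?!", "AI: Amazing!",
    "Human: Backpropagation is what?", "AI: Beautiful!",
    "Human: Chatbots are what?", "AI: Charming!",
]
# With a budget of 10 word-tokens only the tail of the conversation remains.
print(prune_to_budget(history, max_tokens=10))
```

Note that eviction is message-by-message from the front, which is why the surviving history can start mid-exchange with an AI message, as in the sample output above.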

ConversationSummaryBufferMemory

Summarizes older parts of the conversation once the history exceeds the token limit, keeping recent messages verbatim.

from langchain.memory import ConversationSummaryBufferMemory

# create a long string
schedule = "There is a meeting at 8am with your product team. \
You will need your powerpoint presentation prepared. \
9am-12pm have time to work on your LangChain \
project which will go quickly because Langchain is such a powerful tool. \
At Noon, lunch at the italian resturant with a customer who is driving \
from over an hour away to meet you to understand the latest in AI. \
Be sure to bring your laptop to show the latest LLM demo."

memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)
memory.save_context({"input": "Hello"}, {"output": "What's up"})
memory.save_context({"input": "Not much, just hanging"},
                    {"output": "Cool"})
memory.save_context({"input": "What is on the schedule today?"},
                    {"output": f"{schedule}"})

memory.load_memory_variables({})

{'history': 'System: The human asks the AI what is on the schedule today. The AI responds that it is not currently set up to provide a schedule.\nAI: There is a meeting at 8am with your product team. You will need your powerpoint presentation prepared. 9am-12pm have time to work on your LangChain project which will go quickly because Langchain is such a powerful tool. At Noon, lunch at the italian resturant with a customer who is driving from over an hour away to meet you to understand the latest in AI. Be sure to bring your laptop to show the latest LLM demo.'}
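The summary-buffer idea can be sketched in plain Python. Both helpers below are stand-ins: LangChain asks the llm to write the actual summary, and token counting uses the model's tokenizer rather than word count. When the raw transcript overflows the budget, the oldest messages are folded into a running summary surfaced as a System line ahead of the verbatim tail.

```python
def count_tokens(text):
    return len(text.split())  # crude stand-in for the model's tokenizer

def summarize(old_summary, dropped):
    """Stub for the LLM call that condenses evicted messages into a summary."""
    text = " ".join(dropped)
    return (old_summary + " " + text) if old_summary else text

def load_history(buffer, summary):
    """Render the running summary (if any) ahead of the verbatim tail."""
    lines = ([f"System: {summary}"] if summary else []) + buffer
    return "\n".join(lines)

buffer, summary = [], ""
for msg in ["Human: Hello", "AI: What's up",
            "Human: Not much, just hanging", "AI: Cool",
            "Human: What is on the schedule today?"]:
    buffer.append(msg)
    while sum(count_tokens(m) for m in buffer) > 12:  # 12-token budget
        summary = summarize(summary, [buffer.pop(0)])

print(load_history(buffer, summary))
```

The recent messages stay word-for-word while everything older lives only in the summary, which is the same shape as the System line in the sample output above.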


Source: blog.csdn.net/qq128252/article/details/132847259