Notes on Prompt Learning
Why prompt learning? The gap between the objective of a downstream task and the pre-training objective is large, so fine-tuning brings only limited improvement, and the fine-tuning process relies on a large amount of supervised corpus. Prompt learning helps on two fronts:
Reduce the semantic gap: pre-training is mainly based on masked language modeling (MLM), while fine-tuning for downstream tasks introduces new trainable parameters, so the objectives of the two stages usually differ considerably. Prompting reformulates the downstream task in the MLM's own cloze form.
Avoid over-fitting: because fine-tuning introduces additional parameters to fit the target task, the model tends to over-fit when labeled samples are scarce, which hurts its generalization ability. Prompting can reuse the pre-trained head instead.
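The reformulation idea above can be sketched in a few lines. This is a hypothetical illustration, not any specific library's API: `mlm_fill_probs` stands in for a real masked language model's output distribution at the mask position, and the template and verbalizer words are invented for the example. Note that no new task-specific parameters are introduced.

```python
# Sketch: recasting sentiment classification as a cloze (MLM) task.
# mlm_fill_probs is a stand-in for a real masked LM's probabilities
# at the [MASK] position; no new task-specific parameters are added.

def build_cloze(text: str) -> str:
    """Wrap the input in a hand-written template ending in a mask slot."""
    return f"{text} Overall, it was [MASK]."

# Verbalizer: map label words in the MLM's vocabulary to task labels.
VERBALIZER = {"great": "positive", "terrible": "negative"}

def classify(text: str, mlm_fill_probs) -> str:
    """Pick the label whose verbalizer word the MLM rates most probable."""
    prompt = build_cloze(text)
    probs = mlm_fill_probs(prompt)  # {token: probability} at [MASK]
    best = max(VERBALIZER, key=lambda w: probs.get(w, 0.0))
    return VERBALIZER[best]

# Stubbed model output, for demonstration only.
fake_probs = lambda prompt: {"great": 0.71, "terrible": 0.05}
print(classify("The movie was a delight.", fake_probs))  # -> positive
```

Because the prediction is read off the MLM head itself, the downstream task now looks exactly like the pre-training task, which is the point of the two bullets above.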
1. Prompt working principle
2. Prompt learning components
How should prompts be designed for other NLP tasks? Liu Pengfei and his co-authors offer some useful reference points in their survey paper.
3. Prompt learning design
1. Artificial design template
Prompt templates were originally designed by hand. Manual design draws on human knowledge of natural language and aims to produce templates that are semantically fluent and effective. Its advantage is that it is intuitive; its disadvantage is that it requires extensive experimentation, experience, and linguistic expertise. The figure below shows an experimental result from the paper GPT Understands, Too.
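The trial-and-error this implies can be sketched as a simple selection loop: write several candidate templates by hand, evaluate each on a small development set, and keep the best. Everything here is hypothetical: `evaluate` is a stub (in practice it would query a PLM), and the templates and accuracy numbers are invented to illustrate how much wording alone can matter.

```python
# Sketch of manual template design as a selection loop: evaluate several
# hand-written templates on a dev set and keep the best. evaluate() is a
# stub; in practice it would run a PLM over labeled dev examples.

CANDIDATE_TEMPLATES = [
    "[X] It was [Z].",
    "[X] All in all, [Z].",
    "Just [Z]! [X]",
]

def pick_best_template(templates, evaluate):
    """Return the template with the highest dev-set accuracy."""
    scored = [(evaluate(t), t) for t in templates]
    scored.sort(reverse=True)
    return scored[0][1]

# Stubbed accuracies, invented for illustration only.
fake_dev_acc = {"[X] It was [Z].": 0.81,
                "[X] All in all, [Z].": 0.86,
                "Just [Z]! [X]": 0.74}
best = pick_best_template(CANDIDATE_TEMPLATES, fake_dev_acc.get)
print(best)  # -> [X] All in all, [Z].
```

Automatic template learning, discussed next, replaces this manual search with optimization.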
2. Automatically learned templates
4. Why introduce prompt
Challenges and Prospects of Prompt
Prompt design. Most current work on prompting focuses on classification and generation tasks, with far less on other task types, because how to effectively connect pre-training tasks with prompts remains an open question. In addition, the interaction between the template and the answer is unresolved: searching for, or jointly learning, the best combination of the two is still very challenging.
Theoretical analysis and interpretability of prompting. Although prompting has succeeded in many scenarios, there are still few theoretical analyses or guarantees for prompt-based learning. This makes it hard to understand why prompts can achieve good results, and why prompts with similar meanings in natural language can sometimes behave very differently.
Application of prompting to PLM debiasing. Because a PLM sees a huge amount of human-written text during pre-training, it is inevitably affected by it. For example, the training corpus contains many sentences like "The capital of China is Beijing.", which leads the model to predict "Beijing" whenever it sees "capital", instead of attending to which country's capital is being asked about. In practical applications, prompting has also exposed many other biases learned by PLMs, such as racial discrimination, terrorism-related bias, and gender antagonism.