Getting Started with Prompt Engineering: Basic Principles and Practice (1)

Imagine you are decorating a room. You can choose a standard set of furniture, which is quick and convenient but may not fit your personal style or needs. Alternatively, you can have your furniture custom-made, selecting specific colors, materials, and designs so that each piece fits your preferences and space requirements.

For example, when choosing a custom-made sofa, you can decide its size, fabric type, and even the style of the armrests. You can also choose colors to match the theme of the room and even add some unique touches like embroidery or special stitching.

This is the idea behind prompt engineering. Just as you customize furniture by selecting different design elements, prompt engineering involves fine-tuning the inputs to an AI to get more tailored results. By changing, adding, or refining the input cues, you can guide the AI to produce output that better suits a specific request or style, like picking and customizing the right furniture for a room.

Many people's understanding of large language models is limited to intelligent question answering, positioning them as chatbots that replace search engines. However, as large models are applied in more fields and more enterprise use cases are explored, people will gradually realize that the capabilities of large models go far beyond question answering. If we want to study the application of large models formally and in depth, the first step is to learn prompt engineering. This series of articles explains prompt engineering from the basics through to practical implementation, covering applications in media operations, AIGC copywriting generation, and voice and video synthesis.


1.What is prompt engineering?

In the context of AI, a "prompt" usually refers to a request or question put to the model; its form and content affect the model's output. For example, in a text generation model, the prompt can be a question, a topic, or a description, and the model generates the corresponding text based on it. Prompt engineering is the process of refining the prompts fed into a generative artificial intelligence (AI) service to produce text or images. Anyone can drive generators like Wenxin Yiyan (ERNIE Bot) and DALL-E through natural language, and prompt engineering is also a technique AI engineers use when refining large language models (LLMs) with specific or recommended prompts.

For example, if you were using Wenxin Yiyan to brainstorm a professional overview for your resume, you might write: "Write a sample professional overview for a market analyst." On its answer, you might give feedback like "too formal" or "shorten it to no more than 100 words." Prompt engineering is the continual organization and optimization of each prompt so that the LLM produces the most effective answer for our needs. In some cases, tuning prompts may be the only way to improve the quality of model output, especially when using pre-trained models whose internal mechanisms cannot be modified directly.


Let’s take a look at the examples given by a ChatGPT prompt engineer:

  • For text models like ChatGPT:

    • What’s the difference between a professional summary and an executive summary?
    • Write a professional summary for a marketing analyst looking for a marketing manager job.
    • Now trim it down to less than 60 words.
    • Rewrite it with a less formal tone.
  • For image models like DALL-E:

    • A painting of a cat.
    • A painting of a cat chasing a mouse in Impressionist style.
    • Now use only warm tones in the painting.

    The ChatGPT and DALL-E examples follow the same pattern. Taking the text model ChatGPT as an example, four prompts are proposed:

    The first raises a question that distinguishes two related concepts (a professional summary versus an executive summary);

    The second specifies the scenario and the intended audience;

    The third sets a length requirement for the generated content;

    The fourth adjusts the tone of the generated text.

    What we need to know here is that generative AI is trained on data generated by humans and machines; it cannot infer what you actually mean beyond what you write, which means you get exactly what you ask for. When entering queries into an AI generator, it’s best to use specific, simple language with no unnecessary filler. For example, query "Write an outline for an academic research proposal, including a title, abstract, and next steps" rather than "Write an outline that includes a title, abstract, and next steps," which omits what the outline is for. Accordingly, there are some techniques and principles for building prompts, which can be summed up simply as follows:

    Question Prompt

    Question-based prompts guide the model to generate an answer by asking a question. This type of prompt is the most direct and simple, and is suitable for scenarios where specific information is needed, like the first example above: What's the difference between a professional summary and an executive summary?

    Declarative Prompt

    Declarative prompts guide the model to generate relevant content by describing a scenario or state. This type of prompt pays more attention to the creativity and expressiveness of the model, and is suitable for scenarios that require the model to generate descriptions, stories, or opinions. For example, Write a professional summary for a marketing analyst looking for a marketing manager job.

    Imperative Prompt

    Imperative prompts guide the model to complete specific tasks by giving clear instructions. This type of prompt requires the model to have strong execution capabilities and is suitable for scenarios that require the model to perform tasks such as translation, error correction, and programming. For example: Now trim it down to less than 60 words.

    Emotional Prompt

    Under certain conditions, we may need to generate text with strong emotion and subjective opinions, for example to praise or criticize something. To generate such content, we need to let the language model know the intended context and emotional tone. If the prompt provides no emotional cues, the AI will always generate content with a neutral tone.

    Of course, the above is only a surface-level classification. To handle real scenarios, prompt engineers must tailor prompts to the specific case. Microsoft's examples of increasing prompt complexity help us better understand prompt engineering:

    Instruction Type: Simple instruction
    Prompt: Write an introduction for a weekly newsletter.
    Completion: Welcome to our weekly newsletter. This week we have a special focus on productivity. We’ll be sharing tips and tricks on how to be productive in all areas of your life. We hope you enjoy and please share with your friends!

    Instruction Type: Complex instruction
    Prompt: Write an introduction for a weekly newsletter to my company named Contoso. It should mention the great all hands meeting we had last week and thank the team for all their hard work over the tough past few months.
    Completion: Welcome to Contoso’s weekly newsletter! We had a great all hands meeting last week and I want to thank the team for all their hard work over the tough past few months. We’re looking forward to a bright future and I’m confident that with our team’s dedication and commitment, we will achieve great things. Thank you for being a part of our team and I look forward to working with you all in the coming weeks and months. Sincerely, [Your name].

    Instruction Type: Complex instruction
    Prompt: Write an introduction for a weekly newsletter, including the following:
    - My company is named contoso
    - An exuberant greeting
    - Thanking the team for their hard work over a couple tough months
    - Positive outlook for the coming quarter
    - Signed by the SLT
    Completion: Hello Contoso team! I wanted to take a moment to thank you for all your hard work over the past few months. It’s been a tough couple of months, but you’ve all pulled together and done an amazing job. I’m really proud of you all. Looking ahead to the next quarter, I’m feeling really positive. I think we’re going to achieve some great things. Thanks again, The Senior Leadership Team.
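The last of these examples can be assembled programmatically. Below is a minimal sketch, assuming a hypothetical helper `build_newsletter_prompt` (not part of any API) that joins a base instruction with bullet-point requirements:

```python
def build_newsletter_prompt(requirements):
    """Assemble a complex-instruction prompt: a base instruction
    followed by one bullet line per requirement."""
    lines = ["Write an introduction for a weekly newsletter, including the following:"]
    lines += [f"- {req}" for req in requirements]
    return "\n".join(lines)

prompt = build_newsletter_prompt([
    "My company is named contoso",
    "An exuberant greeting",
    "Thanking the team for their hard work over a couple tough months",
    "Positive outlook for the coming quarter",
    "Signed by the SLT",
])
```

Keeping each constraint on its own line makes the prompt easy to extend or reorder as requirements change.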

2.What does a prompt engineer mainly do?

The main job of prompt engineers is to design, optimize, and refine prompts for generative artificial intelligence systems. These prompts are inputs, such as text, images, or code, that guide the AI system to generate specific output. The role of prompt engineers differs across AI application fields, but it centers on building prompts tailored to the business scenario. Prompt engineers help AI engineers create better services, such as chatbots that handle customer service tasks or generate legal contracts. Ensuring that a generative AI service like Wenxin Yiyan delivers useful output requires engineers to build the code and train the AI on a wide range of accurate data.

In a big-data-driven world, we have reached a point where training AI models can help deliver solutions more efficiently without having to manually sort through large amounts of data. Skilled prompt engineers can also identify and mitigate prompt-injection attacks (malicious manipulation of the logic behind ChatGPT or other chatbots) to ensure companies provide consistent and accurate service. Coursera’s requirements for prompt engineers are:

Prompt engineers need to be skilled in the fundamentals of natural language processing (NLP), including libraries and frameworks, the Python programming language, generative AI models, and contributing to open-source projects.

Successful prompts often rely on the practice of "one-shot" or "few-shot" learning: the prompt includes one or more examples of the desired model behavior, usually as input/output pairs. This is not learning in the sense that the model is permanently changed; rather, the examples condition the model so that it responds appropriately to the current request. Prompting without examples is sometimes called "zero-shot" learning. With the Chat Completion API, few-shot learning examples are typically added to the message array in the form of sample user/assistant interactions after the initial system message:

Learning Type: Zero-shot
Prompt: Headline: Coach confident injury won’t derail Warriors
Topic:
Completion: The coach is confident that the injury won’t derail the Warriors’ season. The team is still focused on their goals and that they will continue to work hard to achieve them.

Learning Type: Few-shot
Prompt: Headline: Twins’ Correa to use opt-out, test free agency
Topic: Baseball
Headline: Qatar World Cup to have zones for sobering up
Topic: Soccer
Headline: Yates: Fantasy football intel for Week 6
Topic: Football
Headline: Coach confident injury won’t derail Warriors
Topic:
Completion: Basketball
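As described above, the few-shot examples sit in the message array as sample user/assistant turns after the system message. The sketch below builds such an array in Python; no API call is made, and the system instruction wording is illustrative:

```python
# Few-shot classification via a Chat Completions-style message array:
# three demonstration pairs, then the headline we actually want classified.
few_shot_messages = [
    {"role": "system", "content": "Classify each headline into a sports topic."},
    {"role": "user", "content": "Headline: Twins' Correa to use opt-out, test free agency"},
    {"role": "assistant", "content": "Topic: Baseball"},
    {"role": "user", "content": "Headline: Qatar World Cup to have zones for sobering up"},
    {"role": "assistant", "content": "Topic: Soccer"},
    {"role": "user", "content": "Headline: Yates: Fantasy football intel for Week 6"},
    {"role": "assistant", "content": "Topic: Football"},
    # The real query comes last; the model should answer "Topic: Basketball".
    {"role": "user", "content": "Headline: Coach confident injury won't derail Warriors"},
]
```

The demonstration pairs fix both the output format ("Topic: …") and the granularity of the labels, which is usually more reliable than describing the format in prose alone.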

Prompt engineers generally divide prompts into several main categories based on their purpose and how they are designed:

Zero-shot Prompting:

  • In this case, the prompt is designed to directly ask the model a question or request without providing any specific training examples.
  • This approach relies on the knowledge that the model has already acquired through pre-training.

One-shot or Few-shot Prompting:

  • This approach includes one or several examples in the prompt to guide the model on how to respond.
  • These examples serve as demonstrations of how models should handle similar situations.

Chain-of-thought Prompting:

  • In this approach, prompts are designed to guide the model to demonstrate its thinking process, especially when solving complex problems.
  • This helps improve transparency and interpretability of model output.
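A minimal sketch of a chain-of-thought prompt, assuming the common "think step by step" phrasing; the arithmetic question is invented for illustration:

```python
# Chain-of-thought prompting: append an instruction asking the model to
# show its intermediate reasoning before giving the final answer.
question = "A store sells pens at 3 yuan each. How much do 7 pens cost?"
cot_prompt = (
    f"Question: {question}\n"
    "Let's think step by step, showing each intermediate calculation, "
    "then state the final answer on its own line."
)
```

For multi-step problems, asking for visible intermediate steps both improves answer quality and lets you inspect where the reasoning went wrong.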

Template-based Prompting:

  • In this case, prompts are designed according to specific templates or structures that are designed to effectively stimulate the correct response of the model.
  • This method is usually used in specific application scenarios, such as text classification or entity recognition.

Conversational Prompting:

  • This prompt is designed in the form of a conversation, simulating natural language conversation scenarios to guide the model to respond in the conversation environment.
  • This approach works well for chatbots and interactive applications.
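Conversational prompting can be sketched as a running dialogue where each new user turn is appended to the history, so the model answers in context. The assistant persona and helper function below are illustrative, not a real API, and no model call is made:

```python
# A conversation history in the usual role/content form; earlier turns
# provide the context for each new user message.
conversation = [
    {"role": "system", "content": "You are a friendly travel assistant."},
    {"role": "user", "content": "I want to spend a weekend in Chengdu."},
    {"role": "assistant", "content": "Great choice! Do you prefer food, pandas, or history?"},
]

def add_user_turn(history, text):
    """Append the next user message, keeping earlier turns as context."""
    history.append({"role": "user", "content": text})
    return history

conversation = add_user_turn(conversation, "Mostly food. Any must-try dishes?")
```

Because the full history is resent on every turn, the model can resolve references like "any must-try dishes?" against the earlier mention of Chengdu.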

Task-specific Prompting:

  • This type of prompt is specifically tailored for specific tasks or application scenarios, such as translation, summarization, question answering, etc.
  • This method emphasizes optimizing the design of prompts based on task requirements.

Here we demonstrate only one of these types rather than all of them; subsequent articles will explain the rest in detail. Take template-based prompting as an example:

Task : Perform sentiment analysis on a given text passage to determine whether it is positive, negative, or neutral.

Template Prompt :

  • "Text: [Text paragraph to be classified]
  • Sentiment analysis result: The sentiment of this text is [positive/negative/neutral]."

In this example, the template consists of two parts:

  1. "Text:" is followed by the text paragraph to be classified.
  2. "Sentiment analysis result:" guides the model to make a sentiment classification based on the provided text.

A specific example of using templated prompts might be:

  • "Text: I had a great day today, the weather was nice and I spent a great day with my friends.
  • Sentiment analysis result: The sentiment of this text is [positive/negative/neutral]."
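Filling the template is just string substitution. Below is a minimal sketch; the template wording follows the example above, and the function name is illustrative:

```python
# Template-based prompting: a fixed template with a slot for the
# passage to classify.
TEMPLATE = (
    "Text: {text}\n"
    "Sentiment analysis result: The sentiment of this text is "
    "[positive/negative/neutral]."
)

def make_sentiment_prompt(text: str) -> str:
    """Insert the passage to classify into the fixed template."""
    return TEMPLATE.format(text=text)

prompt = make_sentiment_prompt(
    "I had a great day today, the weather was nice and I spent it with my friends."
)
```

Keeping the template in one place means every passage is classified with an identical prompt structure, which makes the model's outputs easier to parse downstream.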


That's enough content for this chapter. The next article will expand on constructing and using prompts in different scenarios.

If you want more content, feel free to message me privately and follow so you don't lose track of the series. If there are any mistakes, please leave a comment; thank you very much.



Reprinted from: blog.csdn.net/master_hunter/article/details/135405734