A step-by-step Java + ChatGPT integration tutorial: build your own AI chat assistant

1. Introduction

Hello everyone, I am Wang Laoshi. OpenAI has recently opened up gpt-3.5-turbo, the latest ChatGPT model, which is reportedly the same model used on the official ChatGPT website. If you have not experienced ChatGPT yet, today I will show you how to get past the network restrictions and build your own smart assistant. This article covers applying for an API key and setting up a network proxy, so without further ado, let's get started.

2. Integration process

2.1. Obtaining an API Key

The first step is to obtain an API key for the OpenAI interface. This key is the token you use when calling the API and is mainly used for authentication. To get a key you first need an OpenAI account; for details, see my other article, the ChatGPT nanny-level registration tutorial.

  1. Open the https://platform.openai.com/ website and click View API keys.

image.png

  1. Click to create a new key.

image.png

  1. A pop-up window shows the generated key. Copy it right away; you will not be able to view it again later and would have to create a new one.

image.png

Save the API Key for future use

2.2. Check API usage

Here you can check your API usage. New accounts currently get a $5 trial quota; it used to be $18, but after the API price was cut, the trial quota was slashed as well, haha.

image.png

2.3. Core code implementation

2.3.1. pom dependencies

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.webtap</groupId>
    <artifactId>webtap</artifactId>
    <version>0.0.1</version>
    <packaging>jar</packaging>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.2.RELEASE</version>
    </parent>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>nz.net.ultraq.thymeleaf</groupId>
            <artifactId>thymeleaf-layout-dialect</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-mail</artifactId>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.4</version>
        </dependency>
        <dependency>
            <groupId>commons-codec</groupId>
            <artifactId>commons-codec</artifactId>
        </dependency>
        <dependency>
            <groupId>org.jsoup</groupId>
            <artifactId>jsoup</artifactId>
            <version>1.9.2</version>
        </dependency>
        <!-- alibaba.fastjson -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.56</version>
        </dependency>
        <dependency>
            <groupId>net.sourceforge.nekohtml</groupId>
            <artifactId>nekohtml</artifactId>
            <version>1.9.22</version>
        </dependency>
        <dependency>
            <groupId>com.github.pagehelper</groupId>
            <artifactId>pagehelper-spring-boot-starter</artifactId>
            <version>1.4.1</version>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpasyncclient</artifactId>
            <version>4.0.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore-nio</artifactId>
            <version>4.3.2</version>
        </dependency>

        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.3.5</version>
            <exclusions>
                <exclusion>
                    <artifactId>commons-codec</artifactId>
                    <groupId>commons-codec</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>commons-httpclient</groupId>
            <artifactId>commons-httpclient</artifactId>
            <version>3.1</version>
            <exclusions>
                <exclusion>
                    <artifactId>commons-codec</artifactId>
                    <groupId>commons-codec</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>com.github.ulisesbocchio</groupId>
            <artifactId>jasypt-spring-boot-starter</artifactId>
            <version>2.0.0</version>
        </dependency>

    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

2.3.2. Entity class ChatMessage.java

This class stores the message to be sent. The annotations are from Lombok, which automatically generates the constructors and the getter/setter methods; if you don't use Lombok, you can write them yourself.

@Data
@NoArgsConstructor
@AllArgsConstructor
public class ChatMessage {
    //message role
    String role;
    //message content
    String content;
}

2.3.3. Entity class ChatCompletionRequest.java

The parameter entity class used to build the request. The parameters are explained as follows:

model: the model to use, such as gpt-3.5-turbo

messages: the list of messages to send

temperature: sampling temperature, ranging from 0 to 2; lower values make the answer more focused and deterministic, higher values make it more random and less repetitive

n: the number of replies to generate for one conversation turn

stream: whether to stream the response, so that, like the ChatGPT web page, the answer comes back incrementally

stop: up to four sequences at which the API stops generating further tokens

max_tokens: the maximum number of tokens allowed for the generated answer

user: an identifier for the end user of the dialogue

@Data
@Builder
public class ChatCompletionRequest {

    String model;

    List<ChatMessage> messages;

    Double temperature;

    Integer n;

    Boolean stream;

    List<String> stop;

    Integer max_tokens;

    String user;
}
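
For reference, a request built from this class serializes (via fastjson, as in the test later on) into a JSON body roughly like the following; the message content here is just an example, and fields left null (n, stream, stop) are omitted:

{
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "Hello"}
    ],
    "temperature": 1.0,
    "max_tokens": 500,
    "user": "testing"
}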

2.3.4. Entity class ExecuteRet.java

Wraps the result of an API call: whether it succeeded, the returned content, the executed request, and the status code.


/**
 * Result of an API call
 */
public class ExecuteRet {

    /**
     * whether the call succeeded
     */
    private final boolean success;

    /**
     * the response body
     */
    private final String respStr;

    /**
     * the executed request method
     */
    private final HttpMethod method;

    /**
     * statusCode
     */
    private final int statusCode;

    public ExecuteRet(boolean success, String respStr, HttpMethod method, int statusCode) {
        this.success = success;
        this.respStr = respStr;
        this.method = method;
        this.statusCode = statusCode;
    }

    @Override
    public String toString() {
        return String.format("[success:%s,respStr:%s,statusCode:%s]", success, respStr, statusCode);
    }

    /**
     * @return the isSuccess
     */
    public boolean isSuccess() {
        return success;
    }

    /**
     * @return the !isSuccess
     */
    public boolean isNotSuccess() {
        return !success;
    }

    /**
     * @return the respStr
     */
    public String getRespStr() {
        return respStr;
    }

    /**
     * @return the statusCode
     */
    public int getStatusCode() {
        return statusCode;
    }

    /**
     * @return the method
     */
    public HttpMethod getMethod() {
        return method;
    }
}

2.3.5. Entity class ChatCompletionChoice.java

Used to receive the data returned by ChatGPT

@Data
public class ChatCompletionChoice {

    Integer index;

    ChatMessage message;

    String finishReason;
}

2.3.6. The core API-calling class OpenAiApi.java

HttpClient is used to call the API, and both POST and GET requests are supported.

url is the value of the configuration property open.ai.url, i.e. the API base address https://api.openai.com/; token is the API key obtained earlier.
When executing a POST or GET request, the header headers.put("Authorization", "Bearer " + token); is added so that the call is authenticated by the API.
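
For reference, a minimal application.properties for these two properties might look like the following (the key value is a placeholder; you can also encrypt it, e.g. with the jasypt starter already declared in the pom):

# application.properties (placeholder values)
open.ai.url=https://api.openai.com
open.ai.token=sk-xxxxxxxxxxxxxxxxxxxx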


@Slf4j
@Component
public class OpenAiApi {

    @Value("${open.ai.url}")
    private String url;
    @Value("${open.ai.token}")
    private String token;

    private static final MultiThreadedHttpConnectionManager CONNECTION_MANAGER = new MultiThreadedHttpConnectionManager();

    static {
        // default maximum number of connections per host
        CONNECTION_MANAGER.getParams().setDefaultMaxConnectionsPerHost(
                Integer.valueOf(20));
        // maximum total number of connections, default 20
        CONNECTION_MANAGER.getParams()
                .setMaxTotalConnections(20);
        // connection timeout
        CONNECTION_MANAGER.getParams()
                .setConnectionTimeout(60000);
        // socket read timeout
        CONNECTION_MANAGER.getParams().setSoTimeout(60000);
    }

    public ExecuteRet get(String path, Map<String, String> headers) {
        GetMethod method = new GetMethod(url + path);
        if (headers == null) {
            headers = new HashMap<>();
        }
        headers.put("Authorization", "Bearer " + token);
        for (Map.Entry<String, String> h : headers.entrySet()) {
            method.setRequestHeader(h.getKey(), h.getValue());
        }
        return execute(method);
    }

    public ExecuteRet post(String path, String json, Map<String, String> headers) {
        try {
            PostMethod method = new PostMethod(url + path);
            //log.info("POST Url is {} ", url + path);
            // log the request body
            log.info(String.format("POST JSON HttpMethod's Params = %s", json));
            StringRequestEntity entity = new StringRequestEntity(json, "application/json", "UTF-8");
            method.setRequestEntity(entity);
            if (headers == null) {
                headers = new HashMap<>();
            }
            headers.put("Authorization", "Bearer " + token);
            for (Map.Entry<String, String> h : headers.entrySet()) {
                method.setRequestHeader(h.getKey(), h.getValue());
            }
            return execute(method);
        } catch (UnsupportedEncodingException ex) {
            log.error(ex.getMessage(), ex);
        }
        return new ExecuteRet(false, "", null, -1);
    }

    public ExecuteRet execute(HttpMethod method) {
        HttpClient client = new HttpClient(CONNECTION_MANAGER);
        int statusCode = -1;
        String respStr = null;
        boolean isSuccess = false;
        try {
            client.getParams().setParameter(HttpMethodParams.HTTP_CONTENT_CHARSET, "UTF8");
            statusCode = client.executeMethod(method);

            // log.info("statusCode = " + statusCode);
            // read the response body line by line
            InputStreamReader inputStreamReader = new InputStreamReader(method.getResponseBodyAsStream(), "UTF-8");
            BufferedReader reader = new BufferedReader(inputStreamReader);
            StringBuilder stringBuffer = new StringBuilder(100);
            String str;
            while ((str = reader.readLine()) != null) {
                log.debug("read line = " + str);
                stringBuffer.append(str.trim());
            }
            respStr = stringBuffer.toString();
            if (respStr != null) {
                log.info(String.format("Response String = %s, Length = %d", respStr, respStr.length()));
            }
            inputStreamReader.close();
            reader.close();
            // a 200 status code means the API call succeeded
            isSuccess = (statusCode == HttpStatus.SC_OK);
        } catch (IOException ex) {
            log.error(ex.getMessage(), ex);
        } finally {
            method.releaseConnection();
        }
        return new ExecuteRet(isSuccess, respStr, method, statusCode);
    }

}

2.3.7. Define the API path constant class PathConstant.java

Maintains the list of supported API endpoint paths.

public class PathConstant {
    public static class MODEL {
        // list the available models
        public static String MODEL_LIST = "/v1/models";
    }

    public static class COMPLETIONS {
        public static String CREATE_COMPLETION = "/v1/completions";
        // create a chat completion (conversation)
        public static String CREATE_CHAT_COMPLETION = "/v1/chat/completions";
    }
}

2.3.8. Unit test class OpenAiApplicationTests.java for debugging the API call

The core code is ready; now let's write a unit test to check how the API call behaves.


@SpringBootTest
@RunWith(SpringRunner.class)
public class OpenAiApplicationTests {

    @Autowired
    private OpenAiApi openAiApi;
    @Test
    public void createChatCompletion2() {
        List<ChatMessage> messages = new ArrayList<>();
        Scanner in = new Scanner(System.in);
        String input = in.next();
        ChatMessage systemMessage = new ChatMessage("user", input);
        messages.add(systemMessage);
        ChatCompletionRequest chatCompletionRequest = ChatCompletionRequest.builder()
                .model("gpt-3.5-turbo-0301")
                .messages(messages)
                .user("testing")
                .max_tokens(500)
                .temperature(1.0)
                .build();
        ExecuteRet executeRet = openAiApi.post(PathConstant.COMPLETIONS.CREATE_CHAT_COMPLETION, JSONObject.toJSONString(chatCompletionRequest),
                null);
        JSONObject result = JSONObject.parseObject(executeRet.getRespStr());
        List<ChatCompletionChoice> choices = result.getJSONArray("choices").toJavaList(ChatCompletionChoice.class);
        System.out.println(choices.get(0).getMessage().getContent());
        ChatMessage context = new ChatMessage(choices.get(0).getMessage().getRole(), choices.get(0).getMessage().getContent());
        System.out.println(context.getContent());
    }

}
  • Scanner is used to read input from the console. If the console does not accept input while running unit tests in IDEA, go to the IDEA installation directory, edit the file shown below, and append -Deditable.java.test.console=true as the last line.

image.png
image.png

  • Create a ChatMessage object to hold the message parameters. The available roles are user, system and assistant; the replies returned by the API use the assistant role, while the messages we send generally use user.

  • Build the request parameters with ChatCompletionRequest. Here we use gpt-3.5-turbo-0301, the latest model released on March 1. To see which models are available, you can call the /v1/models interface and view the supported models.

  • Then call openAiApi.post to send the request, convert the response into a JSON object, take the choices field and convert it into ChatCompletionChoice objects, which hold the actual content returned by the API.

    The format of the interface return information is as follows:

    {
        "id": "chatcmpl-6rNPw1hqm5xMVMsyf6PXClRHtNQAI",
        "object": "chat.completion",
        "created": 1678179420,
        "model": "gpt-3.5-turbo-0301",
        "usage": {
            "prompt_tokens": 16,
            "completion_tokens": 339,
            "total_tokens": 355
        },
        "choices": [{
            "message": {
                "role": "assistant",
                "content": "\n\nI. 介绍数字孪生的概念和背景\n    A. 数字孪生的定义和意义\n    B. 数字孪生的发展历程\n    C. 数字孪生在现代工业的应用\n\nII. 数字孪生的构建方法\n    A. 数字孪生的数据采集和处理\n    B. 数字孪生的建模和仿真\n    C. 数字孪生的验证和测试\n\nIII. 数字孪生的应用领域和案例分析\n    A. 制造业领域中的数字孪生应用\n    B. 建筑和城市领域中的数字孪生应用\n    C. 医疗和健康领域中的数字孪生应用\n\nIV. 数字孪生的挑战和发展趋势\n    A. 数字孪生的技术挑战\n    B. 数字孪生的实践难点\n    C. 数字孪生的未来发展趋势\n\nV. 结论和展望\n    A. 总结数字孪生的意义和价值\n    B. 展望数字孪生的未来发展趋势和研究方向"
            },
            "finish_reason": "stop",
            "index": 0
        }]
    }
    
  • Output the corresponding information.

2.3.9. Result presentation

image.png

2.4. Implementing continuous dialogue

2.4.1 Implementation of continuous dialogue

Once the basic call works, you will notice that each run is a single-turn conversation: if the answer gets cut off and you type something to continue, a brand-new conversation is started. So how do we link up the context? Only a small change is needed: keep every message of the conversation in a message list, so that the questions and answers carry the context. The code is as follows:

List<ChatMessage> messages = new ArrayList<>();
@Test
public void createChatCompletion() {
    Scanner in = new Scanner(System.in);
    String input = in.next();
    while (!"exit".equals(input)) {
        ChatMessage systemMessage = new ChatMessage("user", input);
        messages.add(systemMessage);
        ChatCompletionRequest chatCompletionRequest = ChatCompletionRequest.builder()
                .model("gpt-3.5-turbo-0301")
                .messages(messages)
                .user("testing")
                .max_tokens(500)
                .temperature(1.0)
                .build();
        ExecuteRet executeRet = openAiApi.post(PathConstant.COMPLETIONS.CREATE_CHAT_COMPLETION, JSONObject.toJSONString(chatCompletionRequest),
                null);
        JSONObject result = JSONObject.parseObject(executeRet.getRespStr());
        List<ChatCompletionChoice> choices = result.getJSONArray("choices").toJavaList(ChatCompletionChoice.class);
        System.out.println(choices.get(0).getMessage().getContent());
        ChatMessage context = new ChatMessage(choices.get(0).getMessage().getRole(), choices.get(0).getMessage().getContent());
        messages.add(context);
        in = new Scanner(System.in);
        input = in.next();
    }
}

Because the messages parameter of OpenAI's /v1/chat/completions interface is a list that carries our context, we only need to keep appending the content of each conversation turn to that list.
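
To illustrate (the contents below are made-up placeholders), after one question and one answer, the messages list sent with the next request looks roughly like this:

[
    {"role": "user", "content": "first question"},
    {"role": "assistant", "content": "the first answer returned by the API"},
    {"role": "user", "content": "follow-up question"}
]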

2.4.2 The results are as follows:

image.png

image.png

4. Frequently Asked Questions

4.1. The OpenAI interface cannot be called

Access to https://api.openai.com/ is restricted in some regions, but the interface itself does not verify the region, so you can set up a proxy of your own to reach it.
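
If you route the calls through a forward proxy of your own, a minimal sketch based on the commons-httpclient already used above would be the following (the host and port are placeholders for your proxy):

HttpClient client = new HttpClient(CONNECTION_MANAGER);
// route requests through your own proxy; replace host and port with your proxy's address
client.getHostConfiguration().setProxy("127.0.0.1", 7890);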

4.2. The interface returns 401

Check whether the request adds the Authorization header with the token, and whether the key itself is correct.
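
A quick way to sanity-check the key is to call the model-list interface with the OpenAiApi class above and look at the status code:

ExecuteRet ret = openAiApi.get(PathConstant.MODEL.MODEL_LIST, null);
// 200 means the key works; 401 means the Authorization header or the key is wrong
System.out.println(ret.getStatusCode() + " " + ret.getRespStr());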

5. Summary

At this point, the Java integration with OpenAI is complete, and continuous dialogue is supported as well. On this basis you can keep improving it, connect it to a web service, and customize your own ChatGPT assistant. I have also built a platform myself, which is still being improved; you can see it in the picture below, and it will be open-sourced later. If you want to try it, send me a private message to get the address and an account.

image.png


Origin blog.csdn.net/b379685397/article/details/129459154