Notes on calling the GPT API through Microsoft Azure as a compatible replacement for the official OpenAI service

As we all know, the official OpenAI service cannot be accessed directly from within China, but the API can be reached through a proxy or a third-party relay.

New regulations now prohibit offering overseas large-model APIs to domestic customers, so a domestic large-model API has to be used instead.

The domestic models still perform noticeably worse, so if you want to keep using a GPT model, Microsoft Azure's OpenAI service is an option.

Responsible AI

At Microsoft, we are committed to human-centered principles that drive the advancement of AI. Generative models such as those available in Azure OpenAI offer significant potential benefits, but without careful design and thoughtful mitigations they can generate incorrect or even harmful content. Microsoft has made significant investments to help prevent abuse and accidental harm, including requiring applicants to demonstrate a well-defined use case, incorporating Microsoft's principles for responsible AI use, building content filters for customers, and providing customers with responsible AI implementation guidance.

Microsoft operates a compliant entity in China, and the content is filtered, so its service can be considered a replacement for the official OpenAI API.

Azure GPT interface specification

The REST calls below show the required parameters.

After the service has been created on Azure, you get two values: the ENDPOINT and the API-KEY.
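
For example, the two values can be exported as environment variables for the curl calls below (placeholder values; substitute your own resource name and key):

export AZURE_OPENAI_ENDPOINT="https://YOUR_RESOURCE_NAME.openai.azure.com"
export AZURE_OPENAI_KEY="YOUR_API_KEY"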

Chat completion interface

curl $AZURE_OPENAI_ENDPOINT/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15 \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_KEY" \
  -d '{"messages":[{"role": "system", "content": "You are a helpful assistant."},{"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},{"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},{"role": "user", "content": "Do other Azure Cognitive Services support this too?"}]}'

Embeddings interface

curl https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings?api-version=2023-05-15 \
  -H "Content-Type: application/json" \
  -H "api-key: YOUR_API_KEY" \
  -d "{\"input\": \"The food was delicious and the waiter...\"}"

Differences from the official OpenAI API

The endpoint address is different, and the API key is passed in a different header (api-key instead of Authorization: Bearer).
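
For reference, the same kind of chat completion against the official OpenAI API looks like the sketch below: a fixed public host, the model name in the request body, and the key in an Authorization header rather than an api-key header.

curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'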

In addition to these two points, there is another very important thing.

Key point

Look closely at the path of the two URLs above, in particular the segment after /deployments/ (gpt-35-turbo and YOUR_DEPLOYMENT_NAME in the examples):

$AZURE_OPENAI_ENDPOINT/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15

$AZURE_OPENAI_ENDPOINT/openai/deployments/YOUR_DEPLOYMENT_NAME/embeddings?api-version=2023-05-15

That segment is the deployment name: the model must first be deployed on Azure, and the deployment is given a name. So if we want to switch seamlessly between the official OpenAI API and Microsoft Azure OpenAI, we have to keep the deployment name identical to the model name; only then will existing calls keep working.
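
A minimal sketch of the idea, assuming a deployment deliberately named after the model it serves (the names here are placeholders): client code that already keeps the model name in a single variable can reuse that same variable as the deployment segment of the Azure URL, so nothing else changes when switching providers.

# The Azure deployment is named exactly like the model the code already uses
MODEL="gpt-35-turbo"

curl "$AZURE_OPENAI_ENDPOINT/openai/deployments/$MODEL/chat/completions?api-version=2023-05-15" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_KEY" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'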

Customer service system configuration

Our system is already compatible with the Microsoft Azure interface: gofly.v1kf.com

Go to [Menu] → [Robot Settings] → [Vector Knowledge Base AI Configuration], enter $AZURE_OPENAI_ENDPOINT as the interface address and your API key as the interface key.

Source: blog.csdn.net/taoshihan/article/details/132332524