ChatGPT application development (1): Calling the OpenAI API without a proxy (via Cloudflare's AI Gateway)

Preface

When developing ChatGPT applications, I think the most important thing is being able to call the ChatGPT API. First of all, I need to be able to access it successfully myself. That part is no problem: with a VPN (the "magic" everyone talks about), I can call it locally.

But how do my application's users call the API? My understanding is that their requests should reach OpenAI through a relay server that they can access directly, so I need to set up a proxy.

I am currently using Cloudflare for this. Cloudflare has launched an AI Gateway feature for routing requests to the OpenAI API from anywhere in the world. It appears to still be free while in beta, and Cloudflare handles up to 100,000 requests per day at no charge.

This is just my humble opinion. If you guys have a better way, you are welcome to criticize and correct me!

How to call the ChatGPT API

First of all, let's not rush to get everything right in one step. Let's first try calling the ChatGPT API over the public internet to confirm we can access it at all.

I won't go into detail about registering a ChatGPT account or setting up a VPN; I believe most readers who want to build applications already have both ready.

To use the API, you need to create an API key for your account and top it up with a starting amount ($20). The official tutorial is here: Quickstart tutorial - OpenAI API. If you can already run a local program that calls the API, you can skip this part.

Points to note:

  1. Creating a key often fails. I tried frantically, maybe dozens of times, before one was finally generated. Then, the first time it succeeded, I forgot to save it and closed the page. Like a GitHub SSH key, the API key is displayed in full only once, right after it is created, and cannot be viewed later, so I had to delete the old key and create a new one several times.

    (screenshot)

  2. You can test whether you can call the API yourself. There is no need to run the official example; it uses a lot of tokens and costs money. I usually just send "hello" and check whether I get a reply. (There is also a short note after this list on where the client gets the key from.)

    from openai import OpenAI
    client = OpenAI()

    completion = client.chat.completions.create(
      model="gpt-3.5-turbo",
      messages=[
        {"role": "user", "content": "hello"}
      ]
    )

    print(completion.choices[0].message)

    This is what a poor student's account looks like:

    (screenshot)
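
As promised above, a quick note on where the key comes from in that snippet. Calling OpenAI() with no arguments makes the SDK read the key from the OPENAI_API_KEY environment variable; you can also pass it explicitly. A minimal sketch (the key value is of course a placeholder):

import os
from openai import OpenAI

# Option 1: let the SDK read the key from the OPENAI_API_KEY environment
# variable, e.g. after running `export OPENAI_API_KEY="sk-..."` in your shell.
client = OpenAI()

# Option 2: pass the key explicitly (here taken from the same variable).
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

Either way, avoid hard-coding the key into code you plan to publish.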

Cloudflare AI Gateway configuration

The ChatGPT API works now, but the target users of the application may not be able to reach it directly (if you require users to turn on a VPN just to use your software, you will lose a large number of them). So we use Cloudflare to proxy the requests.

Cloudflare console: https://dash.cloudflare.com/

After registering an account, you will see an AI section in the left sidebar (you can ignore my jingqinggpts.com; it does not need any configuration, it was just me fumbling around).

(screenshot)

Click ‘AI’ - ‘AI Gateway’ and create a gateway.

Once the gateway is created, you can see API usage examples under ‘$UserName API Endpoints’.

(screenshots)

curl -X POST https://gateway.ai.cloudflare.com/v1/9f02226921e1ee7cd9adb9c655bb2883/jingqinggpts/openai/chat/completions \
  -H 'Authorization: Bearer XXX' \
  -H 'Content-Type: application/json' \
  -d ' {
      "model": "gpt-3.5-turbo",
      "messages": [
        {
          "role": "user",
          "content": "What is Cloudflare?"
        }
      ]
    }
'

Note that the XXX after Bearer must be replaced with your own OpenAI API key.

The value of the content field can also be changed to any prompt you like.
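
For reference, the URL follows the pattern https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_name}/{provider}/..., so the long hex string is my account ID and jingqinggpts is my gateway name; yours will differ. If you would rather keep using the official openai Python package than raw curl, my understanding is that you can simply point its base URL at the gateway's OpenAI endpoint. A minimal sketch under that assumption (swap in your own account ID, gateway name, and key):

from openai import OpenAI

# Point the official SDK at the Cloudflare AI Gateway instead of api.openai.com.
# The account ID and gateway name below are the ones from the curl example;
# replace them with your own.
client = OpenAI(
    api_key="sk-XXX",  # your OpenAI API key
    base_url="https://gateway.ai.cloudflare.com/v1/9f02226921e1ee7cd9adb9c655bb2883/jingqinggpts/openai",
)

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
print(completion.choices[0].message)

The SDK then appends paths such as /chat/completions to that base URL, which is exactly the endpoint the curl command above calls.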

Calling the API with a POST request from Postman

Next, we can try calling the API without a VPN by sending an HTTP POST request to Cloudflare's AI Gateway endpoint.

Postman is my personal choice for HTTP testing, but you are certainly not limited to it; curl and other tools work just as well.

The configuration is as follows (you can paste the curl command above into the address bar and the URL and headers will be parsed automatically; the JSON body needs to be filled in by hand; also make sure the request method is set to POST):

(screenshots)

[
  {
    "provider": "openai",
    "endpoint": "chat/completions",
    "headers": {
      "authorization": "Bearer XXX",
      "content-type": "application/json"
    },
    "query": {
      "model": "gpt-3.5-turbo",
      "messages": [
        {
          "role": "user",
          "content": "hello"
        }
      ]
    }
  }
]

Click Send and check whether a response comes back.

(screenshot)

Calling the API with a POST request from Python

This is essentially the same as above, except that Python uses the requests package to send the POST request. Here is my approach for reference.

import requests
import json

ALLOWED_PROVIDERS = ["openai", "azure-openai", "huggingface"]

def send_request(python_data):
    # Optional sanity check on the provider of the first request in the batch.
    # provider = python_data[0]["provider"]
    # if provider not in ALLOWED_PROVIDERS:
    #     raise ValueError(f"Provider '{provider}' is not allowed.")

    # Universal Endpoint of my AI Gateway (account ID and gateway name are mine;
    # use your own).
    url = "https://gateway.ai.cloudflare.com/v1/9f02226921e1ee7cd9adb9c655bb2883/jingqinggpts"
    headers = {
        'Content-Type': 'application/json',
    }

    response = requests.post(url, json=python_data, headers=headers)
    return response.json()

# The request body expected by the Universal Endpoint: a list of requests,
# each specifying the provider, endpoint, headers and query.
json_data = """
[
  {
    "provider": "openai",
    "endpoint": "chat/completions",
    "headers": {
      "authorization": "Bearer XXX",
      "content-type": "application/json"
    },
    "query": {
      "model": "gpt-3.5-turbo",
      "messages": [
        {
          "role": "user",
          "content": "hello"
        }
      ]
    }
  }
]
"""

python_data = json.loads(json_data)
print(python_data)

try:
    response = send_request(python_data)
    print(response)
except Exception as e:
    print(f"Error sending request: {e}")
    print(f"Request content: {python_data}")

The response is as follows:

(screenshot)
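
For completeness, here is how I would pull the assistant's reply out of that response. This assumes the Universal Endpoint passes the standard OpenAI chat-completion JSON through unchanged, so check the actual shape of your own response first:

# response is the dict returned by send_request() above. Assuming the gateway
# returns the standard OpenAI chat completion structure:
reply = response["choices"][0]["message"]["content"]
print(reply)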

Origin blog.csdn.net/jtwqwq/article/details/134865483