Front end - implementing OpenAI API streaming responses with requests.post()

Please use requests.post() to send a request to the OpenAI API with "stream": True, parse the streaming response, and wrap it all in a function. Provide complete example code.

Example code:

import requests
import json

def openai_api_streaming(api_key, prompt, model, temperature=0.5, max_tokens=50):
    # Legacy per-engine completions endpoint; the model name is part of the URL
    url = "https://api.openai.com/v1/engines/" + model + "/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key
    }
    data = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
        "stream": True
    }
    # stream=True tells requests not to download the whole body at once
    response = requests.post(url, headers=headers, data=json.dumps(data), stream=True)
    if response.status_code != 200:
        raise ValueError("Failed to generate response: " + response.text)

    result = []
    for line in response.iter_lines():
        if not line:
            continue  # skip the blank keep-alive lines between SSE events
        line = line.decode("utf-8")
        if line.startswith("data: "):
            line = line[len("data: "):]  # each event is prefixed with "data: "
        if line.strip() == "[DONE]":
            break  # end-of-stream sentinel sent by the API
        chunk = json.loads(line)
        if "choices" in chunk and len(chunk["choices"]) > 0:
            choice = chunk["choices"][0]
            if "text" in choice:
                result.append(choice["text"])
    return "".join(result)


# Example call
api_key = "YOUR_API_KEY"
prompt = "Hello, "
model = "davinci"
response = openai_api_streaming(api_key, prompt, model)
print(response)

Note that the stream=True argument to requests.post() keeps the connection open so the response is delivered incrementally. We iterate over it with iter_lines(), skip the blank keep-alive lines, strip the "data: " prefix from each server-sent event, stop at the [DONE] sentinel, load each JSON payload into a Python object, and finally join the collected text fragments into a single string.
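The function above buffers the whole answer and returns it only at the end. If you want to display text as it arrives, or to target the newer /v1/chat/completions endpoint, a generator works well. The sketch below is a minimal example under the same SSE parsing assumptions; the delta structure follows the publicly documented chat completions streaming format, and the model name "gpt-3.5-turbo" is only a placeholder.

import requests
import json

def openai_chat_streaming(api_key, messages, model="gpt-3.5-turbo", temperature=0.5):
    """Yield content fragments from the chat completions streaming API as they arrive."""
    url = "https://api.openai.com/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key
    }
    data = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "stream": True
    }
    response = requests.post(url, headers=headers, data=json.dumps(data), stream=True)
    if response.status_code != 200:
        raise ValueError("Failed to generate response: " + response.text)

    for line in response.iter_lines():
        if not line:
            continue                      # skip SSE keep-alive blank lines
        line = line.decode("utf-8")
        if line.startswith("data: "):
            line = line[len("data: "):]
        if line.strip() == "[DONE]":
            break                         # end-of-stream sentinel
        chunk = json.loads(line)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]        # partial text for this event


# Example call: print tokens as they stream in
for fragment in openai_chat_streaming("YOUR_API_KEY",
                                      [{"role": "user", "content": "Hello"}]):
    print(fragment, end="", flush=True)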
