Asked by: N.M Asked: 5/23/2023 Last edited by: N.M Updated: 5/23/2023 Views: 741
openai.createChatCompletion() does not return events when I set stream to true in an AWS Lambda function
Q:
I created a Python function that returns streamed data from openai. The code snippet below works perfectly fine:
def get_data():
    trained_model = train_model(relevent_data, question)
    response = openai.ChatCompletion.create(
        messages=trained_model,
        model=GENERATIVE_MODEL,
        max_tokens=OPENAI_MAX_TOKEN,
        temperature=OPENAI_TEMPERATURE,
        stream=True,
    )
    complete_answer = ""
    for event in response:
        event_text = event["choices"][0]["delta"]
        answer = event_text.get("content", "")  # RETRIEVE CONTENT
        complete_answer += answer
        yield answer

return Response(get_data(), mimetype="text/event-stream")
Since Lambda does not support streaming with Python but does support it with Node.js, I switched to Node.js and openai.createChatCompletion. However, the problem I am facing is that the response is not the expected events. Below is my Node.js code, followed by the response I receive.
const trained_model = await train_model(relevent_data, question);
const response = await openai.createChatCompletion({
  messages: trained_model,
  model: GENERATIVE_MODEL,
  max_tokens: OPENAI_MAX_TOKEN,
  temperature: OPENAI_TEMPERATURE,
  stream: true,
});
The response I receive:
'strict-transport-security': 'max-age=15724800; includeSubDomain Authorization:
'Bearer my-openai-key data: `{"messages":[{"role":"system","content":"My trained model" Context:"my content" Answer:"}],"model":"gpt-3.5-turbo","max_tokens":1000,"temperature 'Authorization: Bearer my-openai-key data: 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQr7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_ 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"Hello"},"index":0,"finish_reaso 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"!"},"index":0,"finish_reason":n 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":" I"},"index":0,"finish_reason": 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":" am"},"index":0,"finish_reason" 'data: {"id":"chatcmpl- 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":" The"},"index":0,"finish_reason 'data: {"id":"chatcmpl-7JFj3EkpA02LMyxmiQrM6njD4mxh7","object":"chat.completion.chunk","created":1684823037,"mode'... 1568 more char}
As you can see from the snippet above, the response is an object. If I use response.on("data", (chunk) => { console.log("====Chunk=====: ", chunk.toString()); }); it throws an error, because the response is not an event emitter but a plain object. However, response.data does contain the delta fields. I logged the type of response.data and it is a String, so I then tried JSON.parse(response.data), which also threw an error.
I am confused and not sure whether I am missing something; I have already gone through the official documentation for openai.createChatCompletion.
Can you help me understand why this issue occurs? Thanks in advance for your help.
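In case it is useful, here is a minimal sketch of how I would expect the raw SSE payload to be consumed with the v3 SDK. It assumes that passing axios's responseType: "stream" as the second argument to createChatCompletion exposes response.data as a readable stream instead of a string; this option is not in my code above, so treat it as an untested guess:

// Sketch only, not my working code: assumes `responseType: "stream"` makes
// `response.data` a readable stream of SSE lines rather than a plain string.
const response = await openai.createChatCompletion(
  {
    messages: trained_model,
    model: GENERATIVE_MODEL,
    max_tokens: OPENAI_MAX_TOKEN,
    temperature: OPENAI_TEMPERATURE,
    stream: true,
  },
  { responseType: "stream" } // axios request option, second argument
);

let completeAnswer = "";
response.data.on("data", (chunk) => {
  // Each chunk may carry several "data: {...}" lines plus a final "data: [DONE]".
  // A real implementation should also buffer lines that are split across chunks.
  const lines = chunk
    .toString()
    .split("\n")
    .filter((line) => line.trim().startsWith("data:"));

  for (const line of lines) {
    const payload = line.replace(/^data:\s*/, "").trim();
    if (payload === "[DONE]") return;
    const parsed = JSON.parse(payload);
    completeAnswer += parsed.choices[0].delta?.content || "";
  }
});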
Edit and quick solution:
I came across a discussion here and found an npm package, openai-ext, which works for now.
Here is my updated code:
const trained_model = await train_model(relevent_data, question);

const streamConfig = {
  openai: openai,
  handler: {
    onContent(content, isFinal, stream) {
      console.log("====content===: ", content);
    },
    onDone(stream) {
      console.log("Done!");
    },
    onError(error, stream) {
      console.error(error);
    },
  },
};

await OpenAIExt.streamServerChatCompletion(
  {
    model: GENERATIVE_MODEL,
    messages: trained_model,
    max_tokens: OPENAI_MAX_TOKEN,
    temperature: OPENAI_TEMPERATURE,
  },
  streamConfig
);
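To get a single awaitable result out of these handler callbacks (similar to complete_answer in the Python version), the stream can be wrapped in a Promise. This is only a sketch: streamCompletion and finalAnswer are my own names, and I am assuming onContent delivers the content received so far rather than a per-chunk delta.

// Sketch: wrap the openai-ext callbacks in a Promise so a Lambda handler can await the result.
function streamCompletion(messages) {
  return new Promise((resolve, reject) => {
    let finalAnswer = "";
    OpenAIExt.streamServerChatCompletion(
      {
        model: GENERATIVE_MODEL,
        messages: messages,
        max_tokens: OPENAI_MAX_TOKEN,
        temperature: OPENAI_TEMPERATURE,
      },
      {
        openai: openai,
        handler: {
          onContent(content, isFinal, stream) {
            // Assumption: `content` is the accumulated content so far; if it is a
            // per-chunk delta instead, use `finalAnswer += content`.
            finalAnswer = content;
          },
          onDone(stream) {
            resolve(finalAnswer);
          },
          onError(error, stream) {
            reject(error);
          },
        },
      }
    );
  });
}

The Lambda handler could then simply do const answer = await streamCompletion(trained_model);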
A: No answers yet
Comments