[Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
Hey, Martin!
It seems like OpenAI has changed their endpoints, or there is an issue in the software.
ChatGPT replies in TwitterDub work like a charm with the davinci model. But when I set the model to "gpt-3.5-turbo", further completion requests fail with the following error:
chatgpt http: 404
https://api.openai.com/v1/completions
{
"error": {
"message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?",
"type": "invalid_request_error",
"param": "model",
"code": null
}
}
ERROR: chatgpt api request failed
Please consider adding proper endpoint routing, or at least the ability to set the API endpoint manually. <3
- martin@rootjazz
- Site Admin
- Posts: 34696
- Joined: Fri Jan 25, 2013 10:06 pm
- Location: The Funk
- Contact:
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
let me check. tbh I know nothing about openAI API options / settings.
I wasn't aware that changing the model changes the required payload...
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
ok yes, it needs different endpoints depending on the model
/v1/chat/completions gpt-4, gpt-4-0314, gpt-4-32k, gpt-4-32k-0314, gpt-3.5-turbo, gpt-3.5-turbo-0301
/v1/completions text-davinci-003, text-davinci-002, text-curie-001, text-babbage-001, text-ada-001
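The mapping above can be sketched as a small routing helper. This is a minimal sketch, not the actual TwitterDub code; it assumes the chat endpoint is needed exactly for the gpt-3.5/gpt-4 family and the legacy endpoint for everything else:

```python
# Sketch: pick the OpenAI endpoint from the model name.
# Assumption: gpt-3.5-* and gpt-4* models use /v1/chat/completions;
# text-davinci-003, text-curie-001 etc. use the legacy /v1/completions.
CHAT_ENDPOINT = "https://api.openai.com/v1/chat/completions"
COMPLETIONS_ENDPOINT = "https://api.openai.com/v1/completions"

def endpoint_for_model(model: str) -> str:
    if model.startswith(("gpt-3.5", "gpt-4")):
        return CHAT_ENDPOINT
    return COMPLETIONS_ENDPOINT
```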
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
The next update will fix this. I shall let you know when it is ready.
Regards,
Martin
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
Already fixed? wow cool! thank you
I'll test and report back if I see any problems.
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
Hey,
I've tried the testing build you mentioned above.
Looks like messages are being sent to the proper endpoint, but the request itself causes the following error:
Code: Select all
{
"error": {
"message": "'messages' is a required property",
"type": "invalid_request_error",
"param": null,
"code": null
}
}
Kindly add the respective request format for ChatGPT completions. Reference:
https://stackoverflow.com/questions/759 ... y-when-tes
Example:
Code: Select all
{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Prompt + tweet message here"}],
"max_tokens": 512,
"top_p": 1,
"temperature": 0.5,
"frequency_penalty": 0,
"presence_penalty": 0
}
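The difference between the two request bodies can be sketched as a single builder function. This is a hypothetical helper, not TwitterDub's actual code; it assumes chat models want a `messages` list of role/content objects, while the legacy models take a flat `prompt` string, with the sampling parameters shared by both:

```python
def build_payload(model: str, prompt_text: str) -> dict:
    """Build the OpenAI request body for either API style.

    Chat models (gpt-3.5-*, gpt-4*) require a `messages` list;
    legacy completion models take a plain `prompt` string.
    """
    body = {
        "model": model,
        "max_tokens": 512,
        "top_p": 1,
        "temperature": 0.5,
        "frequency_penalty": 0,
        "presence_penalty": 0,
    }
    if model.startswith(("gpt-3.5", "gpt-4")):
        body["messages"] = [{"role": "user", "content": prompt_text}]
    else:
        body["prompt"] = prompt_text
    return body
```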
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
:facepalm:
My apologies. I didn't think to check if the request body would need to be formatted differently, I should have expected this. Let me investigate and get it into the next update.
Sorry it's unlikely to be today but should be tomorrow
Regards,
Martin
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
How are you using ChatGPT with the bot??
Re: [Issue] ChatGPT reply function doesn't work with gpt-3.5-turbo model
Hehe
martin@rootjazz wrote: ↑Tue Jun 06, 2023 11:18 pm :facepalm:
My apologies. I didn't think to check if the request body would need to be formatted differently, I should have expected this. Let me investigate and get it into the next update.
Sorry it's unlikely to be today but should be tomorrow
Regards,
Martin
No problem! Just ping me once it's been added to the testing build.
I won't share the business scenario.
In case you were asking how to actually make TwitterDub use GPT completions, this is how:
1. Add an OpenAI API key, a prompt ("make an excited response to a tweet:") and a model ("text-davinci-003", for example) to the global ChatGPT settings (check the help menu)
2. Use the ChatGPT token while sending tweets (it should work for DMs as well, but I haven't tested it)
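Putting the two steps together, the reply call can be sketched like this. The function name, the way the prompt and tweet are joined, and the endpoint choice are my assumptions, not TwitterDub's actual internals; it just builds the HTTP request (without sending it) in the format the chat endpoint expects:

```python
import json
import urllib.request

def reply_request(api_key: str, prompt: str, tweet: str,
                  model: str = "gpt-3.5-turbo") -> urllib.request.Request:
    """Build (but don't send) a ChatGPT reply request.

    Assumption: the configured prompt and the tweet text are simply
    concatenated into one user message, as step 1 above suggests.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": f"{prompt} {tweet}"}],
        "max_tokens": 512,
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

Sending the built request with `urllib.request.urlopen` (and a valid key) should then return a normal chat completion instead of the 404 / "'messages' is a required property" errors above.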