I saw that gpt-3.5-turbo-instruct was listed among the models for the ChatGPT piece, so I was trying it out to see how it fared. However, on running the sample data test, I get the following error each time:
{
  "token": 404,
  "error": {
    "message": "This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?",
    "type": "invalid_request_error",
    "param": "model",
    "code": null
  }
}
I assume this is a bug, or the model is not set up correctly in Activepieces. It looks like it is calling the wrong API endpoint.
Hey all, this is not an Activepieces bug. According to the models page in OpenAI's docs, gpt-3.5-turbo-instruct is only compatible with the legacy text completion endpoint (v1/completions), not the chat endpoint (v1/chat/completions).
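To make the distinction concrete, here is a minimal Python sketch of routing a request to the right endpoint based on the model name. The helper names and the model list are illustrative assumptions, not Activepieces or OpenAI code; check OpenAI's model docs for the current model-to-endpoint mapping.

```python
OPENAI_BASE = "https://api.openai.com"

# Models that only work with the legacy text-completion endpoint.
# Illustrative list -- not exhaustive.
LEGACY_COMPLETION_MODELS = {"gpt-3.5-turbo-instruct"}

def endpoint_for(model: str) -> str:
    """Pick the endpoint path appropriate for the given model name."""
    if model in LEGACY_COMPLETION_MODELS:
        return "/v1/completions"        # legacy: takes a "prompt" string
    return "/v1/chat/completions"       # chat: takes a "messages" array

def build_request(model: str, text: str):
    """Build the URL and JSON payload for the model's endpoint."""
    path = endpoint_for(model)
    if path == "/v1/completions":
        payload = {"model": model, "prompt": text}
    else:
        payload = {"model": model,
                   "messages": [{"role": "user", "content": text}]}
    return OPENAI_BASE + path, payload
```

With this sketch, `build_request("gpt-3.5-turbo-instruct", "Hi")` goes to `/v1/completions` with a `prompt`, while `build_request("gpt-3.5-turbo", "Hi")` goes to `/v1/chat/completions` with a `messages` array. Sending the instruct model to the chat endpoint is exactly what triggers the `invalid_request_error` above.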