I got a 400 error.
The response body is below.
{
  "error": {
    "message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 4363 tokens. Please reduce the length of the messages.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}
It looks like the request exceeded the maximum context length because chat-code sends the entire chat history every time we send a new message.
It'd be better if we could avoid resending the full history, even when we stay in the same chat thread.
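One way to stay under the limit is to trim the history before each request, keeping only the most recent messages that fit the budget. Below is a minimal sketch of that idea; `trim_history` and `estimate_tokens` are hypothetical helpers, and the ~4-characters-per-token estimate is a rough heuristic (a real implementation would count tokens with a proper tokenizer such as tiktoken).

```python
def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token.
    return max(1, len(text) // 4)

def trim_history(messages, max_tokens=4097, reserve=512):
    """Drop the oldest messages until the rest fits under the budget,
    leaving `reserve` tokens free for the model's reply."""
    budget = max_tokens - reserve
    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

# Example: a long thread gets cut down before sending.
history = [{"role": "user", "content": "x" * 4000} for _ in range(30)]
trimmed = trim_history(history)
```

This keeps the thread's recent context while guaranteeing the request fits; a fancier version could summarize the dropped messages instead of discarding them.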