Ask ChatGPT model and token setup

I got the following error. How do I set the Ask ChatGPT model and the number of tokens to deal with this?

400 This model’s maximum context length is 16385 tokens. However, your messages resulted in 31022 tokens. Please reduce the length of the messages.

What are you asking it? Can you show an example of your prompt and settings inside the piece?

@Takeru, it might be worth putting your initial prompt in the tokenizer tool to understand how many tokens it will cost.

Quoted from the tokenizer tool link above:

A helpful rule of thumb is that one token generally corresponds to ~4 characters of text for common English text. This translates to roughly ¾ of a word (so 100 tokens ~= 75 words).
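That rule of thumb is easy to turn into a quick sanity check before sending a prompt. A minimal sketch (this is only the ~4-characters-per-token heuristic, not a substitute for the real tokenizer, and `estimate_tokens` is just an illustrative name):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token rule of thumb."""
    # Integer division by 4; clamp to at least 1 so empty-ish input isn't 0.
    return max(1, len(text) // 4)


# The error in this thread reported 31022 tokens against a 16385-token limit,
# so a prompt would need to be well under ~65k characters by this estimate.
prompt = "A helpful rule of thumb is that one token is about four characters."
print(f"~{estimate_tokens(prompt)} tokens")
```

This will be off for code, non-English text, or unusual formatting, but it is usually close enough to tell you whether you are anywhere near the model's context limit.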


@ashrafsam, it might be worth seeing if there is a way to incorporate a token count into the prompt input window.

Also quoted from the tokenizer tool link above:

If you need a programmatic interface for tokenizing text, check out our tiktoken package for Python. For JavaScript, the community-supported @dqbd/tiktoken package works with most GPT models.


Kind regards,
GunnerJnr

This topic was automatically closed 24 hours after the last reply. New replies are no longer allowed.