@Takeru, it might be worth putting your initial prompt in the tokenizer tool to understand how many tokens it will cost.
quoted from the above tokenizer tool link:
A helpful rule of thumb is that one token generally corresponds to ~4 characters of text for common English text. This translates to roughly ¾ of a word (so 100 tokens ~= 75 words).
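If a precise count isn't needed, the rule of thumb above can be applied directly. A minimal sketch in Python (the `estimate_tokens` helper is hypothetical, not part of any OpenAI library):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token rule of thumb.

    This is only an approximation for common English text; use tiktoken
    for an exact count.
    """
    return max(1, round(len(text) / 4))


# 400 characters of English text ~= 100 tokens ~= 75 words
sample = "word " * 80  # 400 characters
print(estimate_tokens(sample))  # 100
```

This kind of estimate is cheap enough to run on every keystroke, which makes it a reasonable first pass for a live counter in an input window.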
@ashrafsam, it might be worth seeing if there is a way to incorporate a token count into the prompt input window:
Also quoted from the above tokenizer tool link:
If you need a programmatic interface for tokenizing text, check out our tiktoken package for Python. For JavaScript, the community-supported @dqbd/tiktoken package works with most GPT models.