@Bjorn_Beheydt you could use the tokenizer on OpenAI's site.
Paste your prompt in and see how many tokens it uses, then add a sample tweet of approximately the length you want the output to be.
Take the total number of tokens, and in your ChatGPT piece set the max tokens to around that amount. Then it should give you output closer to what you are looking for.
You have to include the prompt in that total as well, since it counts towards the token usage.
You may need to do a few test runs and tweak it a little.
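If it helps, here is a rough sketch of that budgeting in Python. Note this is just an illustration of the arithmetic described above, not OpenAI's actual tokenizer: the ~4 characters per token figure is a common rule-of-thumb approximation for English text, and the function names and the 280-character tweet length are my own assumptions. For exact counts you would still use OpenAI's tokenizer.

```python
# Rough token budgeting for a ChatGPT call, following the steps above:
# prompt tokens + room for a tweet-length reply = max tokens setting.
# ASSUMPTION: ~4 characters per token is only a rule-of-thumb estimate;
# use OpenAI's tokenizer for real counts.

def estimate_tokens(text: str) -> int:
    """Very rough estimate: about 4 characters per token for English."""
    return max(1, len(text) // 4)

def suggest_max_tokens(prompt: str, desired_output_chars: int = 280) -> int:
    """Prompt tokens plus a tweet-length reply, with a small buffer."""
    prompt_tokens = estimate_tokens(prompt)
    output_tokens = estimate_tokens("x" * desired_output_chars)
    return prompt_tokens + output_tokens + 10  # buffer for variance

prompt = "Write a tweet announcing our new product launch."
print(suggest_max_tokens(prompt))
```

A few test runs against real outputs will tell you how far off the estimate is, so you can nudge the number up or down from there.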
Hope this helps.
Kind regards,
GunnerJnr
EDIT:
Forgot to add: the reason it changed for you recently is likely that the new update to the ChatGPT piece now enforces a token limit and sets one by default.