Token limit setting?

I’m dipping my toes into agents, skills, and skill sets. While testing my work through a conversation with Ed, I’ve received this error a few times: “Exception thrown during processing. The output token limit has been exceeded. The maximum number of tokens allowed is 4096. Please reduce the size of your request and try again.”

Is there a setting to increase the token limit?

@nate what LLM model are you using? This is what got mentioned internally:

The short answer is that the maximum output token count is set by the LLM provider (e.g., OpenAI or Anthropic), not by Nextworld internally. Swapping to a model with a higher output token limit is the easiest solution.
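To illustrate the point above: the output cap is effectively a per-request parameter bounded by each model's own ceiling, so requesting more tokens than the model supports gets clamped (or rejected). This is a minimal, hypothetical sketch; the model names, limits, and `build_request` helper are illustrative, not Nextworld's or any provider's actual API.

```python
# Hypothetical per-model output ceilings (illustrative values only).
MODEL_OUTPUT_LIMITS = {
    "gpt-4o": 16384,
    "claude-3-5-sonnet": 8192,
}

def build_request(model: str, prompt: str, max_tokens: int) -> dict:
    """Clamp the requested output tokens to the model's ceiling."""
    ceiling = MODEL_OUTPUT_LIMITS.get(model, 4096)  # fall back to 4096
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": min(max_tokens, ceiling),
    }

# Asking for 32000 output tokens from a model capped at 16384
# gets clamped down to the model's limit.
req = build_request("gpt-4o", "Summarize the quarterly report.", 32000)
print(req["max_tokens"])
```

So rather than raising a setting on the platform side, the practical fix is to pick a model whose ceiling covers the output size you need.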

CC: @jeremy.smolens