Using fireworks through LiteLLM.
Model: fireworks_ai/accounts/fireworks/models/kimi-k2-instruct
litellm.BadRequestError: Fireworks_aiException - {"error":{"object":"error","type":"invalid_request_error","code":"invalid_request_error","message":"The prompt is too long: 40177, model maximum context length: 32767"}}
This is inconsistent: Kimi K2's context window is documented as 128k/131k tokens, yet the server rejects the request at a 32,767-token limit.
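For anyone hitting this, a rough client-side check can flag a prompt that will trip the 32,767-token server limit before the request is sent. This is only a sketch: the ~4-characters-per-token ratio is an assumption for English text, not Kimi K2's actual tokenizer, so exact counts will differ.

```python
def likely_exceeds_context(prompt: str,
                           max_tokens: int = 32767,
                           chars_per_token: float = 4.0) -> bool:
    """Heuristic pre-flight check against a server-side context limit.

    Assumes ~4 characters per token (rough for English); the real
    tokenizer for Kimi K2 may produce a different count.
    """
    estimated_tokens = len(prompt) / chars_per_token
    return estimated_tokens > max_tokens


# A ~200k-character prompt estimates to ~50k tokens, well over 32,767.
print(likely_exceeds_context("x" * 200_000))  # True
```

This only guards against the enforced 32k limit; it does not explain the mismatch with the advertised 128k/131k window, which looks like a deployment-side configuration issue.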