I wanted to test this extension locally; however, I encountered the following error.
I also tested with different open pages such as Google, Bing, GitHub, etc.
Error:
This model's maximum context length is 128000 tokens. However, your messages resulted in 133758 tokens. Please reduce the length of the messages.
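I'm not sure how the extension assembles its prompt, but in case it helps, here is a minimal sketch of how the captured page text could be trimmed before it is sent to the model so the request stays under the 128k context limit. The 4-characters-per-token estimate and the `truncateForContext` helper are my own assumptions, not part of the extension's code:

```ts
// Rough heuristic: OpenAI-style tokenizers average ~4 characters per token
// for English text, so keep some headroom below the 128k context limit.
const MAX_CONTEXT_TOKENS = 128_000;
const RESERVED_FOR_REPLY = 4_000;   // leave room for the model's completion
const CHARS_PER_TOKEN = 4;          // coarse approximation only

/** Truncate page text so the estimated token count stays under the limit. */
function truncateForContext(pageText: string): string {
  const maxChars = (MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY) * CHARS_PER_TOKEN;
  return pageText.length > maxChars ? pageText.slice(0, maxChars) : pageText;
}

// Hypothetical usage: trim the extracted page content before building the request.
const safePageText = truncateForContext(document.body.innerText);
```

Large pages (search results, GitHub file views, etc.) can easily push the extracted text past the limit, which would explain why the error appears regardless of which site is open.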
