Niraj Posted June 25

Hi Experts, I was able to configure Copilot successfully on the Azure platform. The only issue I see is that the first two or three queries I post to Copilot get responses without any problem, but after that it starts returning this error:

"I'm afraid there's been an error posting your question to Copilot. The leading cause for this is that we have hit the rate limit for the underlying LLM. You may wish to wait a few minutes and retry your request."

I tried increasing the rate-limit quota in Azure OpenAI, but it did not help. After investigating further in the container logs, it seems I am getting a 200 response from Azure but a 500 error at the orchestrator level (logs attached). Is this expected, or is there a workaround for this issue?

log.txt
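The error message suggests waiting and retrying. On the client side, that pattern looks roughly like the sketch below; this is a hypothetical helper for illustration, not part of the Copilot orchestrator:

```python
import time

def with_backoff(call, max_retries=4, base_delay=2.0):
    """Retry a callable with exponential backoff after failures
    (e.g. an HTTP 429/500 raised as an exception by `call`).
    Delays grow as base_delay, 2*base_delay, 4*base_delay, ..."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Usage: wrap whatever function posts the question, e.g.
#   answer = with_backoff(lambda: post_question_to_copilot(q))
# where post_question_to_copilot is your own request function.
```

Note that client-side retries only mask the symptom; they do not explain why the quota is exhausted after only two or three prompts.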
Marcelo Gallardo Posted June 26

Hello Niraj, sorry you are having issues with Copilot. Could you please provide the following information:
- What version of the Copilot orchestrator are you using?
- Are you connecting to Azure OpenAI or OpenAI?
- Can you access the Cognitive Search URL from the machine running the orchestrator?

Thanks, Marcelo Gallardo
Niraj Posted June 27

@Marcelo Gallardo Thanks for the quick response. Please see the details inline below:
- What version of the Copilot orchestrator are you using? Spotfire_Copilot_1.1.0
- Are you connecting to Azure OpenAI or OpenAI? Azure OpenAI
- Can you access the Cognitive Search URL from the machine running the orchestrator? Yes

FYI: The setup works fine, but this error appears after two or three prompts. If I wait another 3-4 minutes, close Copilot, and retry, it works again, but only for another 2-3 prompts. This is very unusual, and I am hesitant to demo it to the customer with this behavior. Please let me know if you need any more information. Regards, Niraj
Marcelo Gallardo Posted June 27

Niraj, we have observed these issues in the past, and they were resolved by increasing the rate limit on the models used by the Orchestrator. Also, do you have access to OpenAI? You could try that to see whether you get the same rate-limit issues. Thanks, Marcelo
Niraj Posted June 28

@Marcelo Gallardo Yes, I do have access to Azure OpenAI. I had already tried increasing the rate limit from the default 1K to 12K, but it did not help. After your input, I increased it to 72K, but I am still facing the same issue. Attached are screenshots from Azure OpenAI for your reference. If you look at the container log shared in the previous conversation, we are getting a 200 OK response from Azure, but the orchestrator returns a 500. Let me know if you need any more info. Best Regards, Niraj
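One way to narrow down a 200-from-Azure vs 500-at-orchestrator discrepancy is to call the Azure OpenAI chat-completions endpoint directly, bypassing the orchestrator. A minimal sketch follows; the resource name, deployment name, API version, and key are placeholders, not values from this thread:

```python
import json

def build_chat_request(resource, deployment, api_version, prompt):
    """Build the URL, headers, and JSON body for an Azure OpenAI
    chat-completions call, so the endpoint can be exercised directly
    (e.g. requests.post(url, headers=headers, data=body)) without
    going through the orchestrator."""
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/chat/completions?api-version={api_version}")
    headers = {
        "api-key": "<your-azure-openai-key>",  # placeholder
        "Content-Type": "application/json",
    }
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]})
    return url, headers, body

# Placeholder resource/deployment names -- substitute your own.
url, headers, body = build_chat_request(
    "my-resource", "gpt-35-turbo", "2024-02-01", "ping")
```

If repeated direct calls succeed while the orchestrator still fails, the rate limit on the deployment is unlikely to be the cause, and the problem sits in how the orchestrator talks to that model/version.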
Niraj Posted July 2

@Marcelo Gallardo Any comments or feedback?
Marcelo Gallardo Posted July 11

@Niraj Could you please share the logs for the Orchestrator? Thanks, Marcelo
Solution — Niraj Posted July 18

Hi @Marcelo Gallardo, I was able to fix the issue by using a different GPT model version in Azure OpenAI.
Marcelo Gallardo Posted July 25

@Niraj Thanks for the feedback, and glad you resolved the issue.