Feedback on Auto model selection for vscode copilot agent mode #168899
Replies: 3 comments
-
💬 Your Product Feedback Has Been Submitted 🎉 Thank you for taking the time to share your insights with us! Your feedback is invaluable as we build a better GitHub experience for all our users. Here's what you can expect moving forward ⏩
Where to look to see what's shipping 👀
What you can do in the meantime 💻
As a member of the GitHub community, your participation is essential. While we can't promise that every suggestion will be implemented, we want to emphasize that your feedback is instrumental in guiding our decisions and priorities. Thank you once again for your contribution to making GitHub even better! We're grateful for your ongoing support and collaboration in shaping the future of our platform. ⭐
-
I might add that Auto selecting a premium model for this is ... insane 🤑 "Hello! I'm GitHub Copilot, your AI programming assistant. I'm ready to help you with your coding tasks. How can I assist you today?"
-
You can see which model and model multiplier are used by hovering over the chat response. https://code.visualstudio.com/blogs/2025/09/15/autoModelSelection
-
Select Topic Area: Product Feedback
Copilot Feature Area: Copilot Agent Mode
Body
General background and setup ...
So, I'm testing various models and how well they respond to various coding and research tasks. I decided to try selecting "Auto", and that is what my feedback focuses on.
My conversation started with a very broad, unspecific request, "I want to refactor my code", fully aware that this would most likely trigger a premium model selection.
The codebase is Python, and it is really small and not complex. The repo is here if you want to get an idea of the size and complexity:
Mode Manager MCP
So there the adventure started ...
Make Copilot aware of the model it is using
I tried to force this out of it, but it refused to answer and claimed it was secret. First it told me not to worry, because money is not an issue since "Copilot is flat-rate". Well, I corrected it, it apologized (of course), and then it told me the answer was beyond its knowledge. But it did advise me to trace the API calls to the service, because the information would be in the headers. So I realized it would be in the debug output, and sure enough, there it was.
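For anyone who wants to do the same kind of check, here is a minimal Python sketch of the idea: call an OpenAI-compatible chat endpoint and look at the response metadata to see which model actually served the request. The endpoint URL, token, and model names below are placeholders for illustration, not Copilot's actual API.

```python
# Hypothetical illustration: find out which model actually served a request
# by inspecting the response of an OpenAI-compatible chat endpoint.
# The URL, token, and model names are placeholders, not Copilot's real API.
import requests

resp = requests.post(
    "https://example-llm-gateway/v1/chat/completions",  # placeholder endpoint
    headers={"Authorization": "Bearer <token>"},         # placeholder token
    json={
        "model": "auto",  # let the service pick the model
        "messages": [{"role": "user", "content": "Which model are you?"}],
    },
    timeout=30,
)

# The resolved model often shows up in the response body and/or headers.
print(resp.json().get("model"))
print({k: v for k, v in resp.headers.items() if "model" in k.lower()})
```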
So my feedback, and my request, is that you make the chat aware of the model it is using, so it can tell me when I ask. Because when using Auto, I would prompt it to tell me which model it is currently using, every single turn.
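To make the request concrete, here is a rough sketch of what I mean, assuming the client already knows the resolved model name (for example from the response metadata above): inject it into the system prompt so the assistant can answer truthfully. The function and model name are hypothetical, just to illustrate the idea.

```python
# Sketch of the requested behaviour: tell the assistant which model it is
# running on, so it can answer "which model are you?" without guessing.
def build_system_prompt(resolved_model: str) -> str:
    return (
        "You are a coding assistant. "
        f"You are currently running on the model '{resolved_model}'. "
        "If the user asks which model is in use, state it plainly."
    )

print(build_system_prompt("gpt-4.1"))  # illustrative model name
```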
Make Auto cheaper, like a 0.75x multiplier for premium models
I believe this is in your interest, since it is you orchestrating the load balancing and directing us toward the models you want runtime on. I bet you have different deals for the different models, with higher margins on some than on others.
The feedback here, and the request, is that you lower the price for the Auto model. It's in everyone's interest. The model it chose for me is not one I would have chosen myself, but it was still suitable and I could live with it, if there were an incentive to do so. Instead, I turned off Auto and selected a model I know and trust, because the cost to me is the same either way.
Cheers and thanks for listening
-- Niclas