Mirror of https://github.com/Mintplex-Labs/anything-llm.git (synced 2026-03-02 22:57:05 -05:00)
[FEAT]: Easily switch LLM models for each request #1375
Originally created by @ajmurmann on GitHub (Aug 12, 2024).
Originally assigned to: @timothycarambat on GitHub.
What would you like to see?
Prices and capabilities vary greatly between providers and models. It would be fabulous to have a dropdown in the "Send a message" control that lets me choose a provider. For simple requests I might use my local LLM or one of the cheaper commercial models; for harder questions I'd want one of the more powerful, but pricier, models.
Ideally, I'd also be able to select "regenerate with model X" on an already generated response if the model I used didn't provide a satisfying answer.
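A per-request override like this is often modeled as optional fields on the chat payload that fall back to the workspace defaults when absent. A minimal sketch of that resolution logic (all names here are hypothetical, not AnythingLLM's actual API):

```python
# Hypothetical sketch: resolve the (provider, model) pair for a single
# request, falling back to workspace defaults when no override is given.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class WorkspaceDefaults:
    provider: str
    model: str


def resolve_llm(
    defaults: WorkspaceDefaults,
    provider_override: Optional[str] = None,
    model_override: Optional[str] = None,
) -> Tuple[str, str]:
    """Return the (provider, model) pair to use for this request."""
    provider = provider_override or defaults.provider
    model = model_override or defaults.model
    return provider, model


defaults = WorkspaceDefaults(provider="ollama", model="llama3")
print(resolve_llm(defaults))                              # workspace default
print(resolve_llm(defaults, model_override="llama3:70b"))  # model-only swap
```

With this shape, "regenerate with model X" is just the same request replayed with a different `model_override`, which is why model-only swapping is the smaller lift discussed below.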
@buildgreatthings commented on GitHub (Jun 5, 2025):
Curious why this hasn't been tackled? It's a pretty easy change and very high value for users.
@timothycarambat commented on GitHub (Jun 6, 2025):
Maybe from a user perspective! Right now we don't have stored connections, so the easiest lift is being able to swap the model without swapping the provider, since we don't have previously stored/used connections to easily swap between models and providers. Then, of course, we need designs for this that fit into the UI, which we haven't tasked yet: design is fully committed to another project for at least the next week and I can't pull them off that.
The second we merge model-only swapping, the next issue will be swapping the provider as well, since it would be better to do both at once rather than do the work twice. So this feature should support swapping both model and provider easily.
There is also a minor bit of complexity with loading/unloading models, since that takes time. Most people use AnythingLLM with local models, so we always need to consider that overhead; swapping cloud models is trivial.
I don't disagree that this would be a great feature for users, and we can see how to fit it in the near term since it's the #3 most-reacted request! It's pretty clear we should at least be looking at getting this done soon. I did not notice it was so high up, so thanks for bringing this to my attention!
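Since loading a local model can take many seconds, one common way to soften that swap overhead is to keep a small number of recently used models resident and evict the least recently used one when capacity runs out. A minimal LRU sketch (hypothetical, under the assumption of a pluggable `loader`; not AnythingLLM's actual model management):

```python
# Hypothetical sketch: keep the N most recently used local models resident
# so that swapping back to a recent model avoids paying the full load cost.
from collections import OrderedDict
from typing import Callable


class ModelCache:
    def __init__(self, capacity: int = 2):
        self.capacity = capacity
        self._loaded: "OrderedDict[str, object]" = OrderedDict()

    def get(self, name: str, loader: Callable[[str], object]) -> object:
        """Return a loaded model, loading (and possibly evicting) as needed."""
        if name in self._loaded:
            self._loaded.move_to_end(name)  # cache hit: mark most recently used
            return self._loaded[name]
        if len(self._loaded) >= self.capacity:
            self._loaded.popitem(last=False)  # unload least recently used model
        self._loaded[name] = loader(name)     # expensive load happens here
        return self._loaded[name]


cache = ModelCache(capacity=2)
loads = []  # track which models actually got (re)loaded
loader = lambda n: loads.append(n) or f"<{n} weights>"
cache.get("llama3", loader)
cache.get("mistral", loader)
cache.get("llama3", loader)  # resident: no reload
```

Cloud providers don't need any of this, which matches the point above that swapping cloud models is trivial while local swaps carry load latency.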
@buildgreatthings commented on GitHub (Jun 6, 2025):
In our setup, we'd be happy with "just being able to swap model w/o swapping providers". Thank you!