Mirror of https://github.com/Mintplex-Labs/anything-llm.git (synced 2026-03-02 22:57:05 -05:00)
[BUG]: Recognize LiteLLM model's max context and remove max_tokens #2226
Originally created by @ringge on GitHub (Mar 8, 2025).
Originally assigned to: @angelplusultra on GitHub.
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
I'm using document pinning for my workspace.
When using a model directly (for example, Gemini 2.0, or any other model), document pinning works as expected.
However, when using the same model (e.g. Gemini 2.0) through LiteLLM, AnythingLLM does not appear to detect the model's max context window via LiteLLM, so it automatically truncates the pinned document.
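For context, a minimal sketch of how a client could read a model's context window from a LiteLLM proxy. LiteLLM's proxy exposes a `GET /model/info` endpoint whose entries carry a `model_info` object with token limits; the exact field names (`max_input_tokens`, `max_tokens`) and the sample payload below are assumptions to verify against your LiteLLM version, not AnythingLLM's actual implementation.

```python
# Hedged sketch: extract a model's max context from a LiteLLM-style
# GET /model/info response. Field names (max_input_tokens, max_tokens)
# are assumed from LiteLLM proxy docs; verify against your deployment.

def max_context_from_model_info(payload: dict, model_name: str):
    """Return the max input context for `model_name`, or None if unknown."""
    for entry in payload.get("data", []):
        if entry.get("model_name") != model_name:
            continue
        info = entry.get("model_info") or {}
        # Prefer the input-token limit; fall back to the overall limit.
        return info.get("max_input_tokens") or info.get("max_tokens")
    return None


# Example payload shaped like a LiteLLM /model/info response
# (hypothetical values for illustration):
sample = {
    "data": [
        {
            "model_name": "gemini-2.0-flash",
            "model_info": {"max_input_tokens": 1048576, "max_tokens": 8192},
        }
    ]
}

print(max_context_from_model_info(sample, "gemini-2.0-flash"))  # 1048576
print(max_context_from_model_info(sample, "unknown-model"))     # None
```

If LiteLLM reports the limit this way, a client could use it instead of a hard-coded default and avoid truncating pinned documents prematurely.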
Are there known steps to reproduce?