mirror of
https://github.com/Mintplex-Labs/anything-llm.git
synced 2026-03-02 22:57:05 -05:00
Anthropic Claude 2 API Integration #101
No description provided.
Originally created by @jignnsd on GitHub (Jul 22, 2023).
Hey Tim, thank you for this work; it has worked very well so far in my tests.
I have one question: would it be possible to support the just-introduced Anthropic Claude 2 (using the same DB and embeddings API) as an alternative to the OpenAI chat API in an easy way, or would it require a lot of changes?
I'm curious because I just compared the answers between OpenAI (in anything-llm) and Claude 2 on the same PDF file, and the answer was much better with Claude 2, maybe because of the bigger token limit. So it might be worth trying if it doesn't require a huge change.
Also, the price drop from using Claude 2 instead of OpenAI is significant, roughly half the price.
Again, many thanks for the good work, and hopefully you can add a donation link to compensate this great effort.
Javier
@timothycarambat commented on GitHub (Jul 25, 2023):
Oh, that sounds very cool. The LLM side of things needs to diversify (even beyond Azure, which is just OpenAI anyway).
If they have a supported JS client, it should keep things even simpler. Otherwise, rolling our own internal API client might be a pain. I do think embedding will still have to be done with OpenAI, though, as I don't think Anthropic has an embeddings model yet?
Chat should be straightforward.
@jignnsd commented on GitHub (Jul 25, 2023):
You are right, we should keep OpenAI for embeddings and the same vector DB; only the chat part would change to offer the Anthropic option.
@timothycarambat commented on GitHub (Jul 25, 2023):
For Implementation:
Node client: https://www.npmjs.com/package/@anthropic-ai/sdk
API: https://docs.anthropic.com/claude/reference/getting-started-with-the-api
Will require some new ENV keys in settings to support this config.
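Based on the linked Node SDK and API docs, the chat side might look roughly like the sketch below. At the time, Claude 2 used a single completions-style prompt string rather than OpenAI's message array, so the main work is converting the chat history. The function name `buildClaudePrompt` and the ENV variable shown are hypothetical, not the final implementation.

```javascript
// Anthropic's completion prompt markers (Claude 2, completions-style API).
const HUMAN = "\n\nHuman:";
const AI = "\n\nAssistant:";

// Convert an OpenAI-style message history into Anthropic's single
// prompt string. Hypothetical helper, not anything-llm's actual code.
function buildClaudePrompt(messages) {
  let prompt = "";
  for (const m of messages) {
    prompt += m.role === "assistant" ? `${AI} ${m.content}` : `${HUMAN} ${m.content}`;
  }
  return prompt + AI; // trailing "Assistant:" asks Claude to respond
}

// With the SDK installed, the call might then look like (sketch only,
// requires ANTHROPIC_API_KEY and the @anthropic-ai/sdk package):
//
//   const Anthropic = require("@anthropic-ai/sdk");
//   const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
//   const res = await client.completions.create({
//     model: "claude-2",
//     max_tokens_to_sample: 300,
//     prompt: buildClaudePrompt(history),
//   });
```

The prompt-building part is pure and easy to unit test; only the final SDK call needs the new ENV key.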
@madruga8 commented on GitHub (Jul 30, 2023):
The big 100k-token context of Claude 2 really makes a huge difference...
@timothycarambat commented on GitHub (Aug 3, 2023):
Mintplex Labs has applied for Anthropic Claude 2 API access so we can integrate it fully. This issue is blocked until we or someone else can provide an API key for development (please do not share the key in this issue).
@ishaan-jaff commented on GitHub (Sep 21, 2023):
Hi @jignnsd @timothycarambat, I believe I can help with this issue. I'm the maintainer of LiteLLM (https://github.com/BerriAI/litellm); we let you use any LLM as a drop-in replacement for gpt-3.5-turbo. You can use LiteLLM in the following ways:
1. With your own API key: this calls the provider's API directly.
2. Using the LiteLLM proxy with a LiteLLM key: this is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it.
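Since the LiteLLM proxy exposes an OpenAI-compatible wire format, a Node client could keep its existing gpt-3.5-turbo-shaped request and just point it at the proxy. The sketch below only builds the request; the base URL, key, and path are placeholder assumptions, not documented LiteLLM endpoints.

```javascript
// Build an OpenAI-style chat-completions request aimed at a proxy.
// Hypothetical helper for illustration; URL/key values are placeholders.
function buildProxyRequest(baseUrl, apiKey, messages) {
  return {
    url: `${baseUrl}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      // Same body shape as a gpt-3.5-turbo call; the proxy would route
      // the "claude-2" model name to Anthropic behind the scenes.
      body: JSON.stringify({ model: "claude-2", messages }),
    },
  };
}

// Usage (network call commented out; requires a running proxy):
// const { url, options } = buildProxyRequest("http://localhost:8000", key, msgs);
// const res = await fetch(url, options);
```

The appeal of this route is that anything-llm's existing OpenAI chat code would need almost no changes, only a configurable base URL and model name.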