
Add support for Palm, Claude-2, Llama2, CodeLlama (100+LLMs) #129

Open
ishaan-jaff wants to merge 1 commit into grumpyp:main from ishaan-jaff:main

Conversation

@ishaan-jaff

This PR adds support for the above-mentioned LLMs using LiteLLM: https://github.com/BerriAI/litellm/

Example

import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
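Whichever provider is called, LiteLLM returns the reply in the OpenAI response format, so downstream code can extract the text the same way regardless of model. A minimal sketch, using a plain dict to stand in for the response (the real return value is a response object with the same layout):

```python
# Illustrative only: a hand-built dict mimicking the OpenAI-style
# response shape that LiteLLM normalizes every provider's reply into.
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "I'm doing well, thanks!"}}
    ],
}

# The reply text lives at the same path for every provider.
content = response["choices"][0]["message"]["content"]
print(content)
```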


@ishaan-jaff
Author

addressing: #128

@ishaan-jaff
Author

@grumpyp can you take a look at this PR when possible? Thanks!

@grumpyp
Owner

grumpyp commented Sep 9, 2023

Thanks for the contribution. Would you also implement it on the frontend side? Does this implementation download the LLMs to your machine?

If so, did you have a look at how the other LLMs are currently stored? It would make sense to do it the same way :)

Thx!

@ishaan-jaff
Author

  • Can we address the frontend in a separate PR?
  • This does not download any LLMs to your machine.

@grumpyp
Owner

grumpyp commented Sep 11, 2023

Why should we separate it into another PR?

Ok, as far as I understand it, it just uses the LLMs provided by third-party APIs?

That's fine, as long as it is compatible with our current implementation. Is there a list of all available LLMs which can be used with litellm? Then we could think about how to implement it nicely on the frontend side.

Or at least add some tests to this PR, please.
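Tests for this don't need live API keys: the completion call can be injected and replaced with a stub. A sketch of that pattern (the `ask_llm` wrapper is hypothetical, not code from this PR):

```python
# Hypothetical wrapper around litellm.completion. The completion function
# is injectable so tests can swap in a stub instead of a network call.
def ask_llm(prompt, model="gpt-3.5-turbo", completion_fn=None):
    if completion_fn is None:
        # Real call in production (requires litellm and an API key).
        from litellm import completion as completion_fn
    response = completion_fn(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]

# Stub mimicking the OpenAI-style response shape LiteLLM returns.
def fake_completion(model, messages):
    return {
        "choices": [
            {"message": {"role": "assistant", "content": "stubbed reply"}}
        ]
    }

# Test without any API key or network access.
assert ask_llm("Hello?", completion_fn=fake_completion) == "stubbed reply"
print("test passed")
```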

@ghost

ghost commented Oct 4, 2023

@grumpyp

re: provider list,

yes here are the docs - https://docs.litellm.ai/docs/providers

via code:

import litellm

print(litellm.provider_list)

@grumpyp
Owner

grumpyp commented Oct 5, 2023


Feel free to test everything.

If it doesn't break anything, I'll be happy to merge it.
