litellm

a simple & light ~100-line package for calling the OpenAI, Azure, Cohere, and Anthropic API endpoints

litellm manages:

  • translating inputs to each provider's completion and embedding endpoints
  • guaranteeing consistent output: the text response is always available at ['choices'][0]['message']['content']

usage

import os

from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion("command-nightly", messages)

# azure openai call
response = completion("chatgpt-test", messages, azure=True)

installation

pip install litellm

why did I build this

  • Need for simplicity: my code was getting extremely complicated managing & translating calls between Azure, OpenAI, and Cohere.
