Fixed the assignment of custom_llm_provider argument (#11628)

- **Description:** Assign `custom_llm_provider` in the default params of `ChatLiteLLM` so that it is passed through to litellm.
- **Issue:** Although the `custom_llm_provider` argument is defined on `ChatLiteLLM`, it is never assigned anywhere in the code, so it is not passed to litellm. As a result, any litellm call that requires `custom_llm_provider` fails. This parameter is mainly used by litellm when running inference via a custom API server (see the usage sketch after this list):
https://docs.litellm.ai/docs/providers/custom_openai_proxy
- **Dependencies:** None required.
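
For context, a minimal usage sketch of what this fix enables. This is not part of the PR; the proxy URL and model name are placeholders, and the import path assumes the `langchain` package layout at the time of this change:

```python
from langchain.chat_models import ChatLiteLLM

# Route requests through a custom OpenAI-compatible proxy.
# The URL and model name below are placeholders.
chat = ChatLiteLLM(
    model="command-nightly",
    api_base="https://my-openai-proxy.example.com",
    custom_llm_provider="openai",  # tell litellm to speak the OpenAI API spec
)

print(chat.predict("Say hello"))
```

Before this fix, the `custom_llm_provider="openai"` value above was silently dropped, so litellm could not route the request to the proxy.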

@krrishdholakia , @baskaryan

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>

```diff
@@ -207,6 +207,7 @@ class ChatLiteLLM(BaseChatModel):
             "stream": self.streaming,
             "n": self.n,
             "temperature": self.temperature,
+            "custom_llm_provider": self.custom_llm_provider,
             **self.model_kwargs,
         }
```
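
With this line in place, the default params dictionary that `ChatLiteLLM` builds now carries `custom_llm_provider` into the litellm call. A rough sketch of the equivalent direct litellm invocation, assuming illustrative parameter values and a placeholder proxy URL:

```python
import litellm

# Sketch: the merged default params, now including
# custom_llm_provider, are forwarded to litellm.completion.
params = {
    "model": "command-nightly",  # placeholder model
    "stream": False,
    "n": 1,
    "temperature": 0.7,
    "custom_llm_provider": "openai",  # the newly forwarded field
    "api_base": "https://my-openai-proxy.example.com",  # placeholder
}
response = litellm.completion(
    messages=[{"role": "user", "content": "Say hello"}],
    **params,
)
print(response["choices"][0]["message"]["content"])
```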
