mirror of https://github.com/hwchase17/langchain
Fixed the assignment of custom_llm_provider argument (#11628)
- **Description:** Assign `custom_llm_provider` in the default params function so that it is passed through to litellm.
- **Issue:** Although the `custom_llm_provider` argument is defined, it is never assigned anywhere in the code, so it is not passed to litellm; any litellm call that requires `custom_llm_provider` therefore fails. This parameter is mainly used by litellm when doing inference via a custom API server. https://docs.litellm.ai/docs/providers/custom_openai_proxy
- **Dependencies:** None.

@krrishdholakia, @baskaryan

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
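The shape of the fix can be sketched as follows. This is an illustrative, self-contained mock (the class and attribute names here are assumptions, not LangChain's actual implementation): the bug was that `custom_llm_provider` was accepted at construction time but never merged into the default params dict handed to litellm, so the fix is to forward it there.

```python
from typing import Any, Dict, Optional


class ChatLiteLLMSketch:
    """Hypothetical stand-in for a ChatLiteLLM-style wrapper."""

    def __init__(
        self,
        model: str = "gpt-3.5-turbo",
        custom_llm_provider: Optional[str] = None,
        temperature: float = 0.7,
    ):
        self.model = model
        self.custom_llm_provider = custom_llm_provider
        self.temperature = temperature

    @property
    def _default_params(self) -> Dict[str, Any]:
        params: Dict[str, Any] = {
            "model": self.model,
            "temperature": self.temperature,
        }
        # The fix: include custom_llm_provider in the params that reach
        # litellm, so requests can be routed to a custom OpenAI-compatible
        # API server when one is configured.
        if self.custom_llm_provider is not None:
            params["custom_llm_provider"] = self.custom_llm_provider
        return params


llm = ChatLiteLLMSketch(model="my-model", custom_llm_provider="custom_openai")
print(llm._default_params["custom_llm_provider"])  # custom_openai
```

Before the change, the equivalent of the `if self.custom_llm_provider is not None` branch was missing, so the key never appeared in the params dict.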
parent db67ccb0bb
commit c9d4d53545