Mirror of https://github.com/hwchase17/langchain
Synced 2024-11-10 01:10:59 +00:00, commit 181dfef118
This PR:

1. Adds `.get_ls_params` to `BaseChatModel`, which returns

   ```python
   class LangSmithParams(TypedDict, total=False):
       ls_provider: str
       ls_model_name: str
       ls_model_type: Literal["chat"]
       ls_temperature: Optional[float]
       ls_max_tokens: Optional[int]
       ls_stop: Optional[List[str]]
   ```

   By default it will only return

   ```python
   {"ls_model_type": "chat", "ls_stop": stop}
   ```

2. Adds these params to the inheritable metadata in `CallbackManager.configure`.
3. Implements `.get_ls_params` and populates all params for Anthropic and all subclasses of `BaseChatOpenAI`.

Sample trace: https://smith.langchain.com/public/d2962673-4c83-47c7-b51e-61d07aaffb1b/r

**OpenAI**:

<img width="984" alt="Screenshot 2024-05-17 at 10 03 35 AM" src="https://github.com/langchain-ai/langchain/assets/26529506/2ef41f74-a9df-4e0e-905d-da74fa82a910">

**Anthropic**:

<img width="978" alt="Screenshot 2024-05-17 at 10 06 07 AM" src="https://github.com/langchain-ai/langchain/assets/26529506/39701c9f-7da5-4f1a-ab14-84e9169d63e7">

**Mistral** (and all others for which params are not yet populated):

<img width="977" alt="Screenshot 2024-05-17 at 10 08 43 AM" src="https://github.com/langchain-ai/langchain/assets/26529506/37d7d894-fec2-4300-986f-49a5f0191b03">
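As a rough illustration of how an integration might populate these params, here is a minimal, self-contained sketch. `MyChatModel`, its fields, and the `"my_provider"` value are hypothetical, not LangChain's actual classes:

```python
from typing import List, Literal, Optional, TypedDict


class LangSmithParams(TypedDict, total=False):
    ls_provider: str
    ls_model_name: str
    ls_model_type: Literal["chat"]
    ls_temperature: Optional[float]
    ls_max_tokens: Optional[int]
    ls_stop: Optional[List[str]]


class MyChatModel:
    """Hypothetical chat model that populates tracing params."""

    def __init__(self, model: str, temperature: float = 0.7):
        self.model = model
        self.temperature = temperature

    def get_ls_params(self, stop: Optional[List[str]] = None) -> LangSmithParams:
        # Because the TypedDict is declared with total=False, any key an
        # integration cannot populate can simply be omitted.
        return LangSmithParams(
            ls_provider="my_provider",
            ls_model_name=self.model,
            ls_model_type="chat",
            ls_temperature=self.temperature,
            ls_stop=stop,
        )


params = MyChatModel("my-model").get_ls_params(stop=["\n"])
```

Declaring the TypedDict with `total=False` is what lets the base class return only `ls_model_type` and `ls_stop` while subclasses add the rest.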
# langchain-openai

This package contains the LangChain integrations for OpenAI through their `openai` SDK.
## Installation and Setup

- Install the LangChain partner package:

  ```bash
  pip install langchain-openai
  ```

- Get an OpenAI API key and set it as an environment variable (`OPENAI_API_KEY`).
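One common pattern for making the key available to a Python process is shown below. This is a sketch only; the placeholder value is not a real key, and in practice you would export the variable in your shell or load it from a secrets manager:

```python
import os

# setdefault leaves an already-exported key untouched; the placeholder
# below only illustrates the pattern and will not authenticate.
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")

key = os.environ["OPENAI_API_KEY"]
```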
## LLM

See a usage example.

```python
from langchain_openai import OpenAI
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here.
## Chat model

See a usage example.

```python
from langchain_openai import ChatOpenAI
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureChatOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here.
## Text Embedding Model

See a usage example.

```python
from langchain_openai import OpenAIEmbeddings
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAIEmbeddings
```

For a more detailed walkthrough of the Azure wrapper, see here.
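Embedding vectors are typically compared with cosine similarity. A minimal, self-contained sketch using toy 3-dimensional vectors (not actual `OpenAIEmbeddings` output, which has far higher dimensionality):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy "embeddings" for illustration only.
doc_vec = [0.1, 0.3, 0.5]
query_vec = [0.2, 0.6, 1.0]  # doc_vec scaled by 2: same direction

similarity = cosine_similarity(doc_vec, query_vec)
```

Because cosine similarity ignores magnitude, a vector and any positive scaling of it score 1.0; this is why it is the usual choice for ranking documents against a query embedding.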