
# langchain-openai

This package contains the LangChain integrations for OpenAI through their `openai` SDK.

## Installation and Setup

- Install the LangChain partner package:

  ```bash
  pip install langchain-openai
  ```

- Get an OpenAI API key and set it as an environment variable (`OPENAI_API_KEY`), as in the sketch below.
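A minimal sketch of setting the key from within Python (exporting `OPENAI_API_KEY` in your shell works equally well):

```python
import getpass
import os

# Prompt for the key only if it is not already present in the environment.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
```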

## LLM

See a usage example.

```python
from langchain_openai import OpenAI
```
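For example, a minimal sketch of calling the completions-style LLM (the model name below is an example; substitute any completions model you have access to):

```python
from langchain_openai import OpenAI

# "gpt-3.5-turbo-instruct" is an example model name, not a requirement.
llm = OpenAI(model="gpt-3.5-turbo-instruct", temperature=0)
print(llm.invoke("Say hello in one short sentence."))
```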

If you are using a model hosted on Azure, use the Azure wrapper instead:

```python
from langchain_openai import AzureOpenAI
```
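A minimal sketch of the Azure wrapper, assuming `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` are set in the environment; the deployment name and API version are placeholders, not values from this repo:

```python
from langchain_openai import AzureOpenAI

# Placeholder deployment name and API version; use the values from your Azure resource.
llm = AzureOpenAI(
    azure_deployment="my-completions-deployment",
    openai_api_version="2023-12-01-preview",
)
print(llm.invoke("Say hello in one short sentence."))
```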

For a more detailed walkthrough of the Azure wrapper, see here.

## Chat model

See a usage example.

```python
from langchain_openai import ChatOpenAI
```
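For example, a minimal sketch of a chat completion (the model name below is an example; substitute any chat model you have access to):

```python
from langchain_openai import ChatOpenAI

# "gpt-3.5-turbo" is an example model name, not a requirement.
chat = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
response = chat.invoke("Say hello in one short sentence.")
print(response.content)
```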

If you are using a model hosted on Azure, use the Azure wrapper instead:

```python
from langchain_openai import AzureChatOpenAI
```
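A minimal sketch of the Azure chat wrapper, again assuming `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` are set in the environment; the deployment name and API version are placeholders:

```python
from langchain_openai import AzureChatOpenAI

# Placeholder deployment name and API version; use the values from your Azure resource.
chat = AzureChatOpenAI(
    azure_deployment="my-chat-deployment",
    openai_api_version="2023-12-01-preview",
)
print(chat.invoke("Say hello in one short sentence.").content)
```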

For a more detailed walkthrough of the Azure wrapper, see here.

## Text Embedding Model

See a usage example.

```python
from langchain_openai import OpenAIEmbeddings
```
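For example, a minimal sketch of embedding a single query string (the model name below is an example; omit it to use the package default):

```python
from langchain_openai import OpenAIEmbeddings

# "text-embedding-ada-002" is an example model name, not a requirement.
embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")
vector = embeddings.embed_query("Hello, world!")
print(len(vector))  # dimensionality of the returned embedding
```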

If you are using a model hosted on Azure, use the Azure wrapper instead:

```python
from langchain_openai import AzureOpenAIEmbeddings
```
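A minimal sketch of the Azure embeddings wrapper, assuming `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` are set in the environment; the deployment name and API version are placeholders:

```python
from langchain_openai import AzureOpenAIEmbeddings

# Placeholder deployment name and API version; use the values from your Azure resource.
embeddings = AzureOpenAIEmbeddings(
    azure_deployment="my-embeddings-deployment",
    openai_api_version="2023-12-01-preview",
)
vector = embeddings.embed_query("Hello, world!")
```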

For a more detailed walkthrough of the Azure wrapper, see here.