langchain-openai

This package contains the LangChain integrations for OpenAI through their openai SDK.

Installation and Setup

  • Install the LangChain partner package:
pip install langchain-openai
  • Get an OpenAI API key and set it as an environment variable (OPENAI_API_KEY), as in the sketch below
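
For example, one minimal way to set the key from Python before constructing any models (the key value below is a placeholder, not a real key):

import os

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder; substitute your real key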

LLM

See a usage example.

from langchain_openai import OpenAI
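
A minimal usage sketch, assuming access to a completions-style model (the model name below is an assumption and may differ for your account):

from langchain_openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-instruct")  # model name is an assumption
print(llm.invoke("Tell me a one-sentence joke."))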

If you are using a model hosted on Azure, you should use a different wrapper:

from langchain_openai import AzureOpenAI
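
A rough sketch of configuring the Azure wrapper; the endpoint, deployment name, and API version below are placeholders for your own Azure OpenAI resource, and the key is read from the AZURE_OPENAI_API_KEY environment variable:

from langchain_openai import AzureOpenAI

# all values below are placeholders for your own Azure resource
llm = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_deployment="<your-deployment-name>",
    api_version="2024-02-01",
)
print(llm.invoke("Tell me a one-sentence joke."))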

For a more detailed walkthrough of the Azure wrapper, see here.

Chat model

See a usage example.

from langchain_openai import ChatOpenAI
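
A minimal usage sketch of the chat model (the model name is an assumption):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # model name is an assumption
response = llm.invoke("Say hello in one sentence.")
print(response.content)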

If you are using a model hosted on Azure, you should use a different wrapper:

from langchain_openai import AzureChatOpenAI
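
As with the LLM above, a rough sketch with placeholder Azure settings (the key is read from the AZURE_OPENAI_API_KEY environment variable):

from langchain_openai import AzureChatOpenAI

chat = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    azure_deployment="<your-chat-deployment>",                   # placeholder
    api_version="2024-02-01",                                     # placeholder
)
print(chat.invoke("Say hello in one sentence.").content)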

For a more detailed walkthrough of the Azure wrapper, see here.

Text Embedding Model

See a usage example.

from langchain_openai import OpenAIEmbeddings
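
A minimal sketch of embedding a query string (the model name is an assumption):

from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")  # model name is an assumption
vector = embeddings.embed_query("hello world")
print(len(vector))  # length of the embedding vector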

If you are using a model hosted on Azure, you should use a different wrapper:

from langchain_openai import AzureOpenAIEmbeddings
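
Again a rough sketch with placeholder Azure settings (the key is read from the AZURE_OPENAI_API_KEY environment variable):

from langchain_openai import AzureOpenAIEmbeddings

embeddings = AzureOpenAIEmbeddings(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    azure_deployment="<your-embeddings-deployment>",             # placeholder
    api_version="2024-02-01",                                    # placeholder
)
vector = embeddings.embed_query("hello world")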

For a more detailed walkthrough of the Azure wrapper, see here.