# langchain-openai

This package contains the LangChain integrations for OpenAI through their `openai` SDK.

## Installation and Setup

- Install the LangChain partner package:

  ```bash
  pip install langchain-openai
  ```

- Get an OpenAI API key and set it as an environment variable (`OPENAI_API_KEY`).
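If you prefer not to export the variable in your shell, you can also set it from Python before creating any model objects (the key string below is a placeholder):

```python
import os

# Placeholder value; substitute your real OpenAI API key.
os.environ["OPENAI_API_KEY"] = "sk-..."
```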

## LLM

See a usage example.

```python
from langchain_openai import OpenAI
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here.

## Chat model

See a usage example.

```python
from langchain_openai import ChatOpenAI
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureChatOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here.

## Text Embedding Model

See a usage example.

```python
from langchain_openai import OpenAIEmbeddings
```

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAIEmbeddings
```

For a more detailed walkthrough of the Azure wrapper, see here.