# langchain/libs/partners/openai
**Latest commit:** Bagatur `8461934c2b`: core[patch], integrations[patch]: convert TypedDict to tool schema support (#24641), 2024-07-31 18:27:24 +00:00

Supports the following UX:

```python
from typing import Annotated, Any, Dict, List, Literal, Optional, Union

from typing_extensions import TypedDict


class SubTool(TypedDict):
    """Subtool docstring"""

    args: Annotated[Dict[str, Any], {}, "this does bar"]


class Tool(TypedDict):
    """Docstring

    Args:
        arg1: foo
    """

    arg1: str
    arg2: Union[int, str]
    arg3: Optional[List[SubTool]]
    arg4: Annotated[Literal["bar", "baz"], ..., "this does foo"]
    arg5: Annotated[Optional[float], None]
```

- can parse Google-style docstrings
- can use `Annotated` to specify a default value (second arg)
- can use `Annotated` to specify an arg description (third arg)
- can have nested complex types
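As a purely illustrative sketch (the model name and prompt below are assumptions, not part of the PR), the `Tool` TypedDict above could be bound to a chat model like this:

```python
# Hypothetical usage of the new TypedDict -> tool schema conversion.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # model name is an assumption

# The `Tool` TypedDict defined above is converted to an OpenAI tool schema;
# its docstring and Annotated metadata supply argument descriptions/defaults.
llm_with_tools = llm.bind_tools([Tool])

msg = llm_with_tools.invoke("Call the tool with arg1='foo' and arg4='bar'.")
print(msg.tool_calls)
```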
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| langchain_openai | core[patch], integrations[patch]: convert TypedDict to tool schema support (#24641) | 2024-07-31 18:27:24 +00:00 |
| scripts | infra: add print rule to ruff (#16221) | 2024-02-09 16:13:30 -08:00 |
| tests | openai[patch]: move test (#24552) | 2024-07-23 10:22:22 -04:00 |
| .gitignore | openai[minor]: implement langchain-openai package (#15503) | 2024-01-05 15:03:28 -08:00 |
| LICENSE | openai[minor]: implement langchain-openai package (#15503) | 2024-01-05 15:03:28 -08:00 |
| Makefile | infra: update mypy 1.10, ruff 0.5 (#23721) | 2024-07-03 10:33:27 -07:00 |
| poetry.lock | integration releases (#24725) | 2024-07-26 12:30:10 -07:00 |
| pyproject.toml | integration releases (#24725) | 2024-07-26 12:30:10 -07:00 |
| README.md | | |

# langchain-openai

This package contains the LangChain integrations for OpenAI through their `openai` SDK.

## Installation and Setup

- Install the LangChain partner package:

  ```bash
  pip install langchain-openai
  ```

- Get an OpenAI API key and set it as an environment variable (`OPENAI_API_KEY`); a minimal sketch follows.
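A minimal sketch of setting the key from Python (the prompt text is illustrative):

```python
import getpass
import os

# Prompt for the key only if it is not already set in the environment.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```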

## LLM

See a usage example.

```python
from langchain_openai import OpenAI
```
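A minimal usage sketch, assuming access to a completion model (the model name below is an assumption):

```python
from langchain_openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-instruct", temperature=0)  # assumed model name
print(llm.invoke("Say hello in one short sentence."))
```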

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here

## Chat model

See a usage example.

```python
from langchain_openai import ChatOpenAI
```
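A minimal usage sketch, assuming access to a chat model (the model name below is an assumption):

```python
from langchain_openai import ChatOpenAI

chat = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # assumed model name
response = chat.invoke("Translate 'good morning' to French.")
print(response.content)
```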

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureChatOpenAI
```

For a more detailed walkthrough of the Azure wrapper, see here
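For orientation, a minimal configuration sketch for the Azure wrapper; the endpoint, deployment name, and API version below are placeholders, and `AZURE_OPENAI_API_KEY` is assumed to be set in the environment:

```python
import os

from langchain_openai import AzureChatOpenAI

# Placeholder endpoint; AZURE_OPENAI_API_KEY must also be set in the environment.
os.environ.setdefault("AZURE_OPENAI_ENDPOINT", "https://<your-resource>.openai.azure.com/")

chat = AzureChatOpenAI(
    azure_deployment="<your-deployment-name>",  # placeholder deployment
    api_version="2024-02-01",                   # assumed API version
)
print(chat.invoke("Hello!").content)
```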

## Text Embedding Model

See a usage example.

```python
from langchain_openai import OpenAIEmbeddings
```
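A minimal usage sketch (the embedding model name below is an assumption):

```python
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")  # assumed model name
vector = embeddings.embed_query("LangChain makes it easy to work with LLMs.")
print(len(vector))  # dimensionality of the embedding vector
```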

If you are using a model hosted on Azure, you should use a different wrapper for that:

```python
from langchain_openai import AzureOpenAIEmbeddings
```

For a more detailed walkthrough of the Azure wrapper, see here