langchain/libs/partners/openai/langchain_openai
Simon Kelly a682f0d12b
openai[patch]: wrap stream code in context manager blocks (#18013)
**Description:**
Use the `Stream` context managers in the `ChatOpenAI` `stream` and `astream`
methods.

Using the context manager returned by the OpenAI client makes it
possible to terminate the stream early, since the response connection
is closed when the context manager exits.
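
The pattern above can be sketched with a toy stand-in for the client's `Stream` object (all names below are illustrative, not the openai library's API): wrapping iteration in a `with` block guarantees the connection is released even when the consumer breaks out of the loop early.

```python
# Minimal sketch, assuming a stream object that implements the context
# manager protocol the way openai's `Stream` does. `FakeStream` is a
# hypothetical stand-in that records whether close() was called.
class FakeStream:
    def __init__(self, chunks):
        self._chunks = chunks
        self.closed = False

    def __iter__(self):
        return iter(self._chunks)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Closing here is what makes early termination safe: the
        # response connection is released as soon as the block exits.
        self.close()

    def close(self):
        self.closed = True


stream = FakeStream(["a", "b", "c"])
received = []
with stream:
    for chunk in stream:
        received.append(chunk)
        if chunk == "b":
            break  # terminate early; __exit__ still runs

print(received, stream.closed)  # → ['a', 'b'] True
```

Without the `with` block, breaking out of the loop would leave the connection open until garbage collection; the context manager makes the cleanup deterministic.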

**Issue:** #5340
**Twitter handle:** @snopoke

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
2024-04-09 17:40:16 +00:00
chat_models openai[patch]: wrap stream code in context manager blocks (#18013) 2024-04-09 17:40:16 +00:00
embeddings openai[patch]: remove openai chunk size validation (#19878) 2024-04-01 18:26:06 +00:00
llms openai[patch]: add checking codes for calling AI model get error (#17909) 2024-03-29 00:17:32 +00:00
output_parsers langchain[patch], core[patch], openai[patch], fireworks[minor]: ChatFireworks.with_structured_output (#18078) 2024-02-26 12:46:39 -08:00
__init__.py
py.typed