Commit Graph

53 Commits (7d83189b198dea6cb228e2092f7697d380051052)

Author SHA1 Message Date
Erick Friis 81639243e2
openai: release 0.1.17 (#24361) 2 months ago
Bagatur 13b0d7ec8f
openai[patch]: Release 0.1.16 (#24202) 2 months ago
Bagatur cb5031f22f
integrations[patch]: require core >=0.2.17 (#24207) 2 months ago
Erick Friis 71c2221f8c
openai: release 0.1.15 (#24097) 2 months ago
Bagatur a0c2281540
infra: update mypy 1.10, ruff 0.5 (#23721)
```python
"""python scripts/update_mypy_ruff.py"""
import glob
import tomllib
from pathlib import Path

import toml
import subprocess
import re

ROOT_DIR = Path(__file__).parents[1]


def main():
    for path in glob.glob(str(ROOT_DIR / "libs/**/pyproject.toml"), recursive=True):
        print(path)
        with open(path, "rb") as f:
            pyproject = tomllib.load(f)
        try:
            pyproject["tool"]["poetry"]["group"]["typing"]["dependencies"]["mypy"] = (
                "^1.10"
            )
            pyproject["tool"]["poetry"]["group"]["lint"]["dependencies"]["ruff"] = (
                "^0.5"
            )
        except KeyError:
            continue
        with open(path, "w") as f:
            toml.dump(pyproject, f)
        cwd = "/".join(path.split("/")[:-1])
        completed = subprocess.run(
            "poetry lock --no-update; poetry install --with typing; poetry run mypy . --no-color",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )
        logs = completed.stdout.split("\n")

        to_ignore = {}
        for l in logs:
            if re.match("^(.*)\:(\d+)\: error:.*\[(.*)\]", l):
                path, line_no, error_type = re.match(
                    "^(.*)\:(\d+)\: error:.*\[(.*)\]", l
                ).groups()
                if (path, line_no) in to_ignore:
                    to_ignore[(path, line_no)].append(error_type)
                else:
                    to_ignore[(path, line_no)] = [error_type]
        print(len(to_ignore))
        for (error_path, line_no), error_types in to_ignore.items():
            all_errors = ", ".join(error_types)
            full_path = f"{cwd}/{error_path}"
            try:
                with open(full_path, "r") as f:
                    file_lines = f.readlines()
            except FileNotFoundError:
                continue
            file_lines[int(line_no) - 1] = (
                file_lines[int(line_no) - 1][:-1] + f"  # type: ignore[{all_errors}]\n"
            )
            with open(full_path, "w") as f:
                f.write("".join(file_lines))

        # Reformat and re-sort imports with the new ruff version
        # (with ruff 0.5 the lint command is `ruff check`).
        subprocess.run(
            "poetry run ruff format .; poetry run ruff check --select I --fix .",
            cwd=cwd,
            shell=True,
            capture_output=True,
            text=True,
        )


if __name__ == "__main__":
    main()

```
3 months ago
Bagatur 6168c846b2
openai[patch]: Release 0.1.14 (#23782) 3 months ago
Bagatur af2c05e5f3
openai[patch]: Release 0.1.13 (#23651) 3 months ago
ccurme 5d93916665
openai[patch]: release 0.1.12 (#23641) 3 months ago
ccurme bffc3c24a0
openai[patch]: release 0.1.11 (#23596) 3 months ago
ccurme 5bfcb898ad
openai[patch]: bump sdk version (#23592)
Tests failing with `TypeError: Completions.create() got an unexpected
keyword argument 'parallel_tool_calls'`
3 months ago
Bagatur 92ac0fc9bd
openai[patch]: Release 0.1.10 (#23410) 3 months ago
ccurme 75c7c3a1a7
openai: release 0.1.9 (#23263) 3 months ago
Bagatur 8698cb9b28
infra: add more formatter rules to openai (#23189)
Turns on
https://docs.astral.sh/ruff/settings/#format_docstring-code-format and
https://docs.astral.sh/ruff/settings/#format_skip-magic-trailing-comma

```toml
[tool.ruff.format]
docstring-code-format = true
skip-magic-trailing-comma = true
```
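
For illustration (not from the commit), roughly how the two settings change formatting; the `add` function below is hypothetical:

```python
def add(a: int, b: int) -> int:
    """Add two integers.

    With docstring-code-format enabled, ruff format also reformats the
    example code below as if it were regular source:

    .. code-block:: python

        add(1, 2)
    """
    return a + b


# With skip-magic-trailing-comma enabled, the formatter is free to collapse
# this call onto one line even though it was written with a trailing comma.
total = add(
    1,
    2,
)
print(total)
```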
3 months ago
Bagatur 0a4ee864e9
openai[patch]: image token counting (#23147)
Resolves #23000

---------

Co-authored-by: isaac hershenson <ihershenson@hmc.edu>
Co-authored-by: ccurme <chester.curme@gmail.com>
3 months ago
ccurme 42257b120f
partners: fix numpy dep (#22858)
Following https://github.com/langchain-ai/langchain/pull/22813, which
added python 3.12 to CI, here we update numpy accordingly in partner
packages.
3 months ago
ccurme 6e1df72a88
openai[patch]: Release 0.1.8 (#22291) 4 months ago
ccurme 9a010fb761
openai: read stream_options (#21548)
OpenAI recently added a `stream_options` parameter to its chat
completions API (see [release
notes](https://platform.openai.com/docs/changelog/added-chat-completions-stream-usage)).
When this parameter is set to `{"include_usage": True}`, an extra "empty"
message is added to the end of a stream containing token usage. Here we
propagate token usage to `AIMessage.usage_metadata`.

We enable this feature by default. Streams now include an extra chunk at
the end, **after** the chunk with
`response_metadata={'finish_reason': 'stop'}`.

New behavior:
```
[AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
 AIMessageChunk(content='Hello', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
 AIMessageChunk(content='!', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
 AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde'),
 AIMessageChunk(content='', id='run-4b20dbe0-3817-4f62-b89d-03ef76f25bde', usage_metadata={'input_tokens': 8, 'output_tokens': 9, 'total_tokens': 17})]
```

Old behavior (accessible by passing `stream_options={"include_usage":
False}` into (a)stream):
```
[AIMessageChunk(content='', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
 AIMessageChunk(content='Hello', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
 AIMessageChunk(content='!', id='run-1312b971-c5ea-4d92-9015-e6604535f339'),
 AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-1312b971-c5ea-4d92-9015-e6604535f339')]
```

From what I can tell this is not yet implemented in Azure, so we enable
only for ChatOpenAI.
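
A minimal sketch (not from the PR) of consuming the new usage chunk; it assumes a recent `langchain-openai` and `OPENAI_API_KEY` in the environment:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")

usage = None
for chunk in llm.stream("Say hello"):
    print(chunk.content, end="", flush=True)
    if chunk.usage_metadata:  # only the final, empty chunk carries usage
        usage = chunk.usage_metadata

print()
print(usage)  # e.g. {'input_tokens': 8, 'output_tokens': 9, 'total_tokens': 17}
```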
4 months ago
ccurme 152c8cac33
anthropic, openai: cut pre-releases (#22083) 4 months ago
ccurme 4470d3b4a0
partners: bump core in packages implementing ls_params (#21868)
These packages all import `LangSmithParams` which was released in
langchain-core==0.2.0.

N.B. we will need to release `openai` and then bump `langchain-openai`
in `together` and `upstage`.
4 months ago
Bagatur af284518bc
openai[patch]: Release 0.1.7, bump tiktoken 0.7.0 (#21723) 4 months ago
Erick Friis c77d2f2b06
multiple: core 0.2 nonbreaking dep, check_diff community->langchain dep (#21646)
0.2 is not a breaking release for core (but it is for langchain and
community)

To keep the core+langchain+community packages in sync at 0.2, we will
relax deps throughout the ecosystem to tolerate `langchain-core` 0.2
4 months ago
Bagatur 67a5cc34c6
openai[patch]: Release 0.1.6 (#21236) 5 months ago
Bagatur 6ac6158a07
openai[patch]: support tool_choice="required" (#21216)
Co-authored-by: ccurme <chester.curme@gmail.com>
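
A minimal sketch (not from the PR) of the new option; the tool and model name are illustrative:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_weather(city: str) -> str:
    """Look up the current weather for a city."""
    return f"Sunny in {city}"


# tool_choice="required" forces the model to call at least one bound tool.
llm = ChatOpenAI(model="gpt-4o").bind_tools([get_weather], tool_choice="required")
msg = llm.invoke("What's the weather in Paris?")
print(msg.tool_calls)
```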
5 months ago
Bagatur 6fa8626e2f
openai[patch]: fix azure open lc serialization, release 0.1.5 (#21159) 5 months ago
Bagatur bef50ded63
openai[patch]: fix special token default behavior (#21131)
By default handle special sequences as regular text
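
For context (not from the PR): this mirrors tiktoken, which by default refuses to encode special sequences found in the input; a rough sketch of the two behaviors:

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

# By default tiktoken raises on text containing special sequences.
try:
    enc.encode("hello <|endoftext|>")
except ValueError as err:
    print(err)

# With no sequences disallowed they are tokenized as regular text, which is
# the default token-counting behavior after this change.
print(enc.encode("hello <|endoftext|>", disallowed_special=()))
```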
5 months ago
ccurme 465fbaa30b
openai: release 0.1.4 (#20939) 5 months ago
Bagatur 799714c629
release anthropic, fireworks, openai, groq, mistral (#20333) 6 months ago
Erick Friis 9eb6f538f0
infra, multiple: rc release versions (#20252) 6 months ago
Bagatur a8eb0f5b1b
openai[patch]: pre-release 0.1.3-rc.1 (#20249) 6 months ago
Bagatur 0b2f0307d7
openai[patch]: Release 0.1.2 (#20241) 6 months ago
Erick Friis 855ba46f80
standard-tests: a standard unit and integration test set (#20182)
just chat models for now
6 months ago
Erick Friis e71daa7a03
openai[patch]: add test coverage to output (#19462) 6 months ago
Erick Friis ac57123f40
openai[patch]: release 0.1.1 (#19458) 6 months ago
Erick Friis a9cda536ad
openai[patch]: fix core min version (#19366) 6 months ago
Erick Friis f6c8700326
openai[patch]: release 0.1.0, message id and name support (#19363) 6 months ago
Bagatur 242af4b5a4
openai[patch], mistral[patch], fireworks[patch]: releases 0.0.8, 0.0.5, 0.0.2 (#18186) 7 months ago
Bagatur 1c1bb1152e
openai[patch]: refactor with_structured_output (#18052)
- make schema Optional with default val None, since in json_mode you
don't need it if not parsing to pydantic
- change return_type -> include_raw
- expand docstring examples
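
A minimal sketch (not from the PR) of the resulting API; the schema and model name are illustrative:

```python
from pydantic import BaseModel
from langchain_openai import ChatOpenAI


class Joke(BaseModel):
    setup: str
    punchline: str


llm = ChatOpenAI(model="gpt-4o")

# Parse responses into the pydantic schema.
structured = llm.with_structured_output(Joke)
print(structured.invoke("Tell me a joke"))

# include_raw=True (formerly return_type) also returns the raw AIMessage
# and any parsing error alongside the parsed output.
with_raw = llm.with_structured_output(Joke, include_raw=True)

# In json_mode the schema can be omitted; the raw JSON dict is returned.
json_llm = llm.with_structured_output(method="json_mode")
```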
7 months ago
Erick Friis a05fb19f42
openai[patch]: remove numpy dep (#18034) 7 months ago
Bagatur cc0290fdf3
openai[patch]: Release 0.0.7 (#17993) 7 months ago
Erick Friis a99c667c22
partners: version constraints (#17492)
Core should be ^0.1 by default.

Careful about 0.x.y and 0.0.z packages: with caret constraints, ^0.x.y allows
patch updates within 0.x, while ^0.0.z pins to that exact patch release.
7 months ago
Erick Friis 37678471c4
openai[patch]: relax tiktoken constraint, release 0.0.6 (#17472) 7 months ago
Erick Friis 3a2eb6e12b
infra: add print rule to ruff (#16221)
Added noqa for existing prints. These can be removed gradually; the rule will
prevent more from being introduced.
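
For illustration (not from the PR), an existing print silenced by the new rule (T201 is ruff's print check from the flake8-print set):

```python
print("keeping this debug output for now")  # noqa: T201
```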
8 months ago
Charlie Marsh 24c0bab57b
infra, multiple: Upgrade configuration for Ruff v0.2.0 (#16905)
## Summary

This PR upgrades LangChain's Ruff configuration in preparation for
Ruff's v0.2.0 release. (The changes are compatible with Ruff v0.1.5,
which LangChain uses today.) Specifically, we're now warning when
linter-only options are specified under `[tool.ruff]` instead of
`[tool.ruff.lint]`.

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
Co-authored-by: Bagatur <baskaryan@gmail.com>
8 months ago
Bagatur bcc71d1a57
openai[patch]: Release 0.0.5 (#16598) 8 months ago
Bagatur 61e876aad8
openai[patch]: Explicitly support embedding dimensions (#16596) 8 months ago
Bagatur 75ad0bba2d
openai[patch]: Release 0.0.4 (#16590) 8 months ago
Bagatur 91230ef5d1
openai[patch]: Release 0.0.3 (#16289) 8 months ago
Erick Friis 06fe2f4fb0
partners: add license field (#16117)
- bumps package post versions for packages without current unreleased
updates
- will bump package version in release prs associated with packages that
do have changes (mistral, vertex)
8 months ago
Erick Friis 95020637bc
openai[patch]: 0.0.2.post1, urls (#15961) 9 months ago
Erick Friis 1bc6b19ea7
openai[patch]: v0.0.2 (#15618) 9 months ago