0.2 is not a breaking release for `langchain-core` (but it is for `langchain` and `langchain-community`).
To keep the core, langchain, and community packages in sync at 0.2, we will relax dependency constraints throughout the ecosystem to tolerate `langchain-core` 0.2.
- [x] **PR title**: "langchain-ibm: Fix llm and embeddings 'verify'
attribute default value"
- [x] **PR message**:
- **Description:** fix default value of "verify" attribute
- **Dependencies:** `ibm_watsonx_ai`
- [x] **Add tests and docs**
- [x] **Lint and test**: `make format`, `make lint` and `make test` pass from the root of the modified package(s). Contribution guidelines: https://python.langchain.com/docs/contributing/
Co-authored-by: Erick Friis <erick@langchain.dev>
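A minimal sketch of where `verify` is supplied, assuming the standard `langchain-ibm` constructor arguments; the endpoint, credentials, and model IDs below are placeholders, and `verify` typically takes either a boolean or a path to a CA bundle.

```python
# Minimal sketch: passing `verify` explicitly to the LLM and embeddings classes.
# Endpoint, credentials, and model IDs are placeholders.
from langchain_ibm import WatsonxEmbeddings, WatsonxLLM

llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",   # placeholder model id
    url="https://us-south.ml.cloud.ibm.com",  # placeholder endpoint
    apikey="PLACEHOLDER_API_KEY",
    project_id="PLACEHOLDER_PROJECT_ID",
    verify=True,  # the attribute whose default this PR fixes
)

embeddings = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",  # placeholder model id
    url="https://us-south.ml.cloud.ibm.com",
    apikey="PLACEHOLDER_API_KEY",
    project_id="PLACEHOLDER_PROJECT_ID",
    verify=True,
)
```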
- [x] **PR title**: "langchain-ibm: Add support for ibm-watsonx-ai new
major version"
- [x] **PR message**:
- **Description:** Add support for ibm-watsonx-ai new major version
- **Dependencies:** `ibm_watsonx_ai`
- [x] **Add tests and docs**
- [x] **Lint and test**: `make format`, `make lint` and `make test` pass from the root of the modified package(s). Contribution guidelines: https://python.langchain.com/docs/contributing/
Co-authored-by: Erick Friis <erick@langchain.dev>
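For orientation, a sketch of the client entry points that the new `ibm-watsonx-ai` major version exposes and that the integration wraps; the URL, credentials, and model ID are placeholders, not values mandated by this change.

```python
# Sketch of the `ibm_watsonx_ai` (new major version) client API that the
# integration builds on; all values are placeholders.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # placeholder endpoint
    api_key="PLACEHOLDER_API_KEY",            # placeholder credential
)

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",   # placeholder model id
    credentials=credentials,
    project_id="PLACEHOLDER_PROJECT_ID",      # placeholder project
)

print(model.generate_text(prompt="What is IBM watsonx.ai?"))
```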
- **Description:** Add async tests and tokenize support (see the sketch below).
- **Dependencies:** [ibm-watsonx-ai](https://pypi.org/project/ibm-watsonx-ai/)
- **Tag maintainer:**
Linting and tests pass locally (`make format`, `make lint`, `make test`). ✅
Integration tests pass locally. ✅
---------
Co-authored-by: Erick Friis <erick@langchain.dev>
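A rough sketch of what the async and tokenize additions enable at the call level: `ainvoke` comes from the standard Runnable interface, and `get_num_tokens` is assumed to be backed by the model's own tokenizer once tokenize support is wired in. Credentials and model ID are placeholders.

```python
# Sketch: async generation plus token counting with WatsonxLLM.
# Credentials and model ID are placeholders.
import asyncio

from langchain_ibm import WatsonxLLM

llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",   # placeholder model id
    url="https://us-south.ml.cloud.ibm.com",  # placeholder endpoint
    apikey="PLACEHOLDER_API_KEY",
    project_id="PLACEHOLDER_PROJECT_ID",
)

async def main() -> None:
    # Async path exercised by the new tests (standard Runnable `ainvoke`).
    text = await llm.ainvoke("Briefly describe IBM watsonx.ai.")
    # Token counting; assumed to use the watsonx tokenizer once tokenize
    # support is in place, rather than a generic fallback.
    print(text, llm.get_num_tokens(text))

asyncio.run(main())
```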
- **Description:** Add the ability to pass a `ModelInference` or `Model` object to the `WatsonxLLM` class (see the sketch below).
- **Dependencies:** [ibm-watsonx-ai](https://pypi.org/project/ibm-watsonx-ai/)
- **Tag maintainer:**
Linting and tests pass locally (`make format`, `make lint`, `make test`). ✅
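A sketch of the new construction path, assuming the pre-built client is handed over via a `watsonx_model` parameter (that parameter name is an assumption here); credentials and model ID are placeholders.

```python
# Sketch: hand a pre-built ModelInference object to WatsonxLLM instead of raw
# credentials. The `watsonx_model` parameter name is assumed; values are placeholders.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference
from langchain_ibm import WatsonxLLM

model = ModelInference(
    model_id="ibm/granite-13b-instruct-v2",       # placeholder model id
    credentials=Credentials(
        url="https://us-south.ml.cloud.ibm.com",  # placeholder endpoint
        api_key="PLACEHOLDER_API_KEY",
    ),
    project_id="PLACEHOLDER_PROJECT_ID",
)

llm = WatsonxLLM(watsonx_model=model)  # reuse the configured client
print(llm.invoke("Hello, watsonx!"))
```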