Commit Graph

2895 Commits (52ccae3fb12a77b638b4e3467bef1b6ef50591ef)

Author SHA1 Message Date
Seungwoo Ryu 570b4f8e66
docs: Update openai_tools.ipynb (#16618)
typo
5 months ago
Callum 6a75ef74ca
docs: Fix typo in XML agent documentation (#16645)
This is a tiny PR that just replaces "moduels" with "modules" in the
documentation for XML agents.
5 months ago
baichuan-assistant 70ff54eace
community[minor]: Add Baichuan Text Embedding Model and Baichuan Inc introduction (#16568)
- **Description:** Adding Baichuan Text Embedding Model and Baichuan Inc
introduction.

Baichuan Text Embedding ranks #1 on the C-MTEB leaderboard:
https://huggingface.co/spaces/mteb/leaderboard

Co-authored-by: BaiChuanHelper <wintergyc@WinterGYCs-MacBook-Pro.local>
5 months ago
Ghani e30c6662df
Langchain-community : EdenAI chat integration. (#16377)
- **Description:** This PR adds [EdenAI](https://edenai.co/) as a chat
model (it is already available for LLMs & Embeddings). It supports all
`ChatModel` functionality: generate, async generate, stream, astream and
batch. A detailed notebook was added.

  - **Dependencies**: No dependencies are added as we call a rest API.

---------

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
5 months ago
Bagatur 61e876aad8
openai[patch]: Explicitly support embedding dimensions (#16596) 5 months ago
Bagatur 6c89507988
docs: add rag citations page (#16549) 5 months ago
Bagatur db80832e4f
docs: output parser nits (#16588) 5 months ago
Bagatur ef42d9d559
core[patch], community[patch], openai[patch]: consolidate openai tool… (#16485)
… converters

One way to convert anything to an OAI function:
`convert_to_openai_function`
One way to convert anything to an OAI tool: `convert_to_openai_tool`
Corresponding bind functions on OAI models: `bind_functions`, `bind_tools`
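A minimal usage sketch, assuming the consolidated converters live in `langchain_core.utils.function_calling` and that `bind_tools` is available on `ChatOpenAI`:

```python
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.utils.function_calling import (
    convert_to_openai_function,
    convert_to_openai_tool,
)
from langchain_openai import ChatOpenAI


class GetWeather(BaseModel):
    """Get the current weather for a city."""

    city: str = Field(..., description="City name")


# One converter for OAI functions, one for OAI tools.
oai_function = convert_to_openai_function(GetWeather)
oai_tool = convert_to_openai_tool(GetWeather)

# Or bind the schema directly on an OpenAI model.
llm = ChatOpenAI(model="gpt-3.5-turbo").bind_tools([GetWeather])
msg = llm.invoke("What's the weather in Paris?")
print(msg.additional_kwargs.get("tool_calls"))
```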
5 months ago
Brian Burgin 148347e858
community[minor]: Add LiteLLM Router Integration (#15588)
community:

  - **Description:**
- Add new ChatLiteLLMRouter class that allows a client to use a LiteLLM
Router as a LangChain chat model (see the sketch after this list).
- Note: The existing ChatLiteLLM integration did not cover the LiteLLM
Router class.
    - Add tests and Jupyter notebook.
  - **Issue:** None
  - **Dependencies:** Relies on existing ChatLiteLLM integration
  - **Twitter handle:** @bburgin_0
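A rough usage sketch of the new class; the `router` keyword and the `model_list` shape are assumptions based on LiteLLM's Router API:

```python
from langchain_community.chat_models import ChatLiteLLMRouter
from litellm import Router

# Configure a LiteLLM Router with one or more deployments.
router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo"},
        }
    ]
)

# Wrap the router in the new LangChain chat model.
chat = ChatLiteLLMRouter(router=router)
print(chat.invoke("Hello!").content)
```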

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Bob Lin 35e60728b7
docs: Fix broken urls (#16559) 5 months ago
Bob Lin 6023953ea7
docs: Fix github link (#16560) 5 months ago
Erick Friis adc008407e
exa: init pkg (#16553) 5 months ago
Rave Harpaz c4e9c9ca29
community[minor]: Add OCI Generative AI integration (#16548)
- **Description:** Adding Oracle Cloud Infrastructure Generative AI
integration. Oracle Cloud Infrastructure (OCI) Generative AI is a fully
managed service that provides a set of state-of-the-art, customizable
large language models (LLMs) that cover a wide range of use cases, and
which is available through a single API. Using the OCI Generative AI
service you can access ready-to-use pretrained models, or create and
host your own fine-tuned custom models based on your own data on
dedicated AI clusters.
https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm
  - **Issue:** None
  - **Dependencies:** OCI Python SDK

Linting and testing (`make format`, `make lint`, `make test`) passed locally.

We provide unit tests. However, we cannot provide integration tests due
to Oracle policies that prohibit public sharing of API keys.

---------

Co-authored-by: Arthur Cheng <arthur.cheng@oracle.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Leonid Ganeline f6a05e964b
docs: `Hugging Face` update (#16490)
- added missing integrations to the platform page
- updated integration examples: added links and fixed formatting
5 months ago
Harel Gal a91181fe6d
community[minor]: add support for Guardrails for Amazon Bedrock (#15099)
Added support for optionally supplying 'Guardrails for Amazon Bedrock'
on both types of model invocations (batch/regular and streaming) and for
all models supported by the Amazon Bedrock service.

@baskaryan  @hwchase17

```python
from typing import Any

import boto3

from langchain_community.llms import Bedrock
from langchain_core.callbacks import AsyncCallbackHandler


class BedrockAsyncCallbackHandler(AsyncCallbackHandler):
    """Async callback handler that can be used to handle callbacks from langchain."""

    async def on_llm_error(
        self,
        error: BaseException,
        **kwargs: Any,
    ) -> Any:
        reason = kwargs.get("reason")
        if reason == "GUARDRAIL_INTERVENED":
            # kwargs contains additional trace information sent by the
            # 'Guardrails for Bedrock' service.
            print(f"Guardrails: {kwargs}")


# boto3 Bedrock runtime client used for model invocations.
bedrock = boto3.client("bedrock-runtime")

llm = Bedrock(
    model_id="<model_id>",
    client=bedrock,
    model_kwargs={},
    guardrails={
        "id": "<guardrail_id>",
        "version": "<guardrail_version>",
        "trace": True,
    },
    callbacks=[BedrockAsyncCallbackHandler()],
)

# streaming
llm = Bedrock(
    model_id="<model_id>",
    client=bedrock,
    model_kwargs={},
    streaming=True,
    guardrails={"id": "<guardrail_id>", "version": "<guardrail_version>"},
)
```

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Martin Kolb 04651f0248
community[minor]: VectorStore integration for SAP HANA Cloud Vector Engine (#16514)
- **Description:**
This PR adds a VectorStore integration for SAP HANA Cloud Vector Engine,
which is an upcoming feature in the SAP HANA Cloud database
(https://blogs.sap.com/2023/11/02/sap-hana-clouds-vector-engine-announcement/).

  - **Issue:** N/A
- **Dependencies:** [SAP HANA Python
Client](https://pypi.org/project/hdbcli/)
  - **Twitter handle:** @sapopensource

Implementation of the integration:
`libs/community/langchain_community/vectorstores/hanavector.py`

Unit tests:
`libs/community/tests/unit_tests/vectorstores/test_hanavector.py`

Integration tests:
`libs/community/tests/integration_tests/vectorstores/test_hanavector.py`

Example notebook:
`docs/docs/integrations/vectorstores/hanavector.ipynb`

Access credentials for execution of the integration tests can be
provided to the maintainers.
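For reference, a brief usage sketch assuming the `HanaDB` class added here and an existing `hdbcli` connection (hostname and credentials are placeholders):

```python
from hdbcli import dbapi
from langchain_community.vectorstores.hanavector import HanaDB
from langchain_openai import OpenAIEmbeddings

# Connect to the SAP HANA Cloud instance.
connection = dbapi.connect(
    address="<hostname>", port=443, user="<user>", password="<password>"
)

db = HanaDB(
    connection=connection,
    embedding=OpenAIEmbeddings(),
    table_name="LANGCHAIN_DEMO",
)
db.add_texts(["SAP HANA Cloud ships a built-in vector engine."])
print(db.similarity_search("vector engine", k=1))
```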

---------

Co-authored-by: sascha <sascha.stoll@sap.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Bob Lin 54dd8e52a8
docs: Updated comments about `n_gpu_layers` in the Metal section (#16501)
Ref: https://github.com/langchain-ai/langchain/issues/16502
5 months ago
Anastasiia Manokhina ce595f0203
docs:Updated integration docs structure for chat/google_vertex_ai_palm (#16201)
Description: 

- checked that the doc chat/google_vertex_ai_palm is using new
functions: invoke, stream etc.
- added Gemini example
- fixed wrong output in Sanskrit example

Issue: https://github.com/langchain-ai/langchain/issues/15664
Dependencies: None
Twitter handle: None
6 months ago
Erick Friis 8d299645f9
docs: rm output (#16519) 6 months ago
Lance Martin 0b740ebd49
Update SQL agent toolkit docs (#16409) 6 months ago
Francisco Ingham 13cf4594f4
docs: added a few suggestions for sql docs (#16508) 6 months ago
Eugene Yurtsev 6004e9706f
Docs: Add streaming section (#16468)
Adds a streaming section to LangChain documentation, explaining
`stream`/`astream` API and `astream_events` API.
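A small sketch of the two surfaces the new section covers (model name is a placeholder; `astream_events` was in beta at the time):

```python
import asyncio

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")


async def main() -> None:
    # Token-by-token streaming of the final output.
    async for chunk in llm.astream("Tell me a joke"):
        print(chunk.content, end="", flush=True)

    # Event stream that also exposes intermediate steps.
    async for event in llm.astream_events("Tell me a joke", version="v1"):
        if event["event"] == "on_chat_model_stream":
            print(event["data"]["chunk"].content, end="", flush=True)


asyncio.run(main())
```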
6 months ago
Tipwheal 66aafc0573
Docs: typo in tool use quick start page (#16494)
Minor typo fix
6 months ago
BeatrixCohere 2b2285dac0
docs: Update cohere rerank and comparison docs (#16198)
- **Description:** Update the cohere rerank docs to use cohere
embeddings
  - **Issue:** n/a
  - **Dependencies:** n/a
  - **Twitter handle:** n/a
6 months ago
Raunak 476bf8b763
community[patch]: Load list of files using UnstructuredFileLoader (#16216)
- **Description:** Updated the `_get_elements()` function of the
`UnstructuredFileLoader` class to check whether `self.file_path` is a
single file or a list of files. If it is a list of files, it iterates
over the file paths, calls the partition function for each one, and
appends the results to the elements list. If `self.file_path` is not a
list, it calls the partition function as before (see the sketch below).
  
  - **Issue:** Fixed #15607,
  - **Dependencies:** NA
  - **Twitter handle:** NA
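A short sketch of the new call pattern (file names below are hypothetical):

```python
from langchain_community.document_loaders import UnstructuredFileLoader

# Pass a list of paths instead of a single path; each file is partitioned in turn.
loader = UnstructuredFileLoader(["report_q1.txt", "report_q2.txt"])
docs = loader.load()
print(len(docs))
```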

Co-authored-by: H161961 <Raunak.Raunak@Honeywell.com>
6 months ago
Xudong Sun 019b6ebe8d
community[minor]: Add iFlyTek Spark LLM chat model support (#13389)
- **Description:** This PR enables LangChain to access the iFlyTek's
Spark LLM via the chat_models wrapper.
  - **Dependencies:** websocket-client ^1.6.1
  - **Tag maintainer:** @baskaryan 

### SparkLLM chat model usage

Get SparkLLM's app_id, api_key and api_secret from [iFlyTek SparkLLM API
Console](https://console.xfyun.cn/services/bm3) (for more info, see
[iFlyTek SparkLLM Intro](https://xinghuo.xfyun.cn/sparkapi) ), then set
environment variables `IFLYTEK_SPARK_APP_ID`, `IFLYTEK_SPARK_API_KEY`
and `IFLYTEK_SPARK_API_SECRET` or pass parameters when using it like the
demo below:

```python
from langchain.chat_models.sparkllm import ChatSparkLLM

client = ChatSparkLLM(
    spark_app_id="<app_id>",
    spark_api_key="<api_key>",
    spark_api_secret="<api_secret>"
)
```
6 months ago
Eugene Yurtsev d898d2f07b
docs: Fix version in which astream_events was released (#16481)
Fix typo in version
6 months ago
bu2kx ff3163297b
community[minor]: Add KDBAI vector store (#12797)
Addition of KDBAI vector store (https://kdb.ai).

Dependencies: `kdbai_client` v0.1.2 Python package.

Sample notebook: `docs/docs/integrations/vectorstores/kdbai.ipynb`

Tag maintainer: @bu2kx
Twitter handle: @kxsystems
6 months ago
JongRok BAEK 4ec3fe4680
docs: Updated integration docs structure for chat/anthropic (#16268)
Description: 
- Added output and environment variables
- Updated the documentation for chat/anthropic, changing references from
`langchain.schema` to `langchain_core.prompts`.

Issue: https://github.com/langchain-ai/langchain/issues/15664
Dependencies: None
Twitter handle: None

Since this is my first open-source PR, please feel free to point out any
mistakes, and I'll be eager to make corrections.
6 months ago
Shivani Modi 4e160540ff
community[minor]: Adding Konko Completion endpoint (#15570)
This PR introduces updates to the Konko integration with LangChain.

1. **New Endpoint Addition**: Integration of a new endpoint to utilize
completion models hosted on Konko.

2. **Chat Model Updates for Backward Compatibility**: We have updated
the chat models to ensure backward compatibility with previous OpenAI
versions.

3. **Updated Documentation**: Comprehensive documentation has been
updated to reflect these new changes, providing clear guidance on
utilizing the new features and ensuring seamless integration.

Thank you to the LangChain team for their exceptional work and for
considering this PR. Please let me know if any additional information is
needed.

---------

Co-authored-by: Shivani Modi <shivanimodi@Shivanis-MacBook-Pro.local>
Co-authored-by: Shivani Modi <shivanimodi@Shivanis-MBP.lan>
6 months ago
Facundo Santiago 92e6a641fd
feat: adding paygo api support for Azure ML / Azure AI Studio (#14560)
- **Description:** Introducing support for LLMs and Chat models running
in Azure AI studio and Azure ML using the new deployment mode
pay-as-you-go (model as a service).
- **Issue:** NA
- **Dependencies:** None.
- **Tag maintainer:** @prakharg-msft @gdyre 
- **Twitter handle:** @santiagofacundo

Examples added:
*
[docs/docs/integrations/llms/azure_ml.ipynb](https://github.com/santiagxf/langchain/blob/santiagxf/azureml-endpoints-paygo-community/docs/docs/integrations/chat/azureml_endpoint.ipynb)
*
[docs/docs/integrations/chat/azureml_chat_endpoint.ipynb](https://github.com/santiagxf/langchain/blob/santiagxf/azureml-endpoints-paygo-community/docs/docs/integrations/chat/azureml_chat_endpoint.ipynb)

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
6 months ago
baichuan-assistant 20fcd49348
community: Fix Baichuan Chat. (#15207)
- **Description:** Baichuan Chat (with both Baichuan-Turbo and
Baichuan-Turbo-192K models) has updated their APIs. There are breaking
changes. For example, BAICHUAN_SECRET_KEY is removed in the latest API
but is still required in LangChain. Baichuan's LangChain integration
needs to be updated to the latest version.
  - **Issue:** #15206
  - **Dependencies:** None,
  - **Twitter handle:** None

@hwchase17.

Co-authored-by: BaiChuanHelper <wintergyc@WinterGYCs-MacBook-Pro.local>
6 months ago
gcheron cfc225ecb3
community: SQLStrStore/SQLDocStore provide an easy SQL alternative to `InMemoryStore` to persist data remotely in a SQL storage (#15909)
**Description:**

- Implement `SQLStrStore` and `SQLDocStore` classes that inherit from
`BaseStore` to allow persisting data remotely on a SQL server.
- SQL is widely used, and sometimes we do not want to install a caching
solution like Redis.
- Multiple issues/comments complain that there is no easy remote and
persistent solution that is not in-memory (users want to replace
InMemoryStore), e.g.,
https://github.com/langchain-ai/langchain/issues/14267,
https://github.com/langchain-ai/langchain/issues/15633,
https://github.com/langchain-ai/langchain/issues/14643,
https://stackoverflow.com/questions/77385587/persist-parentdocumentretriever-of-langchain
- This is particularly painful when wanting to use
`ParentDocumentRetriever`.
- This implementation is particularly useful when:
     * it's expensive to construct an InMemoryDocstore/dict
     * you want to retrieve documents from remote sources
     * you just want to reuse existing objects
- This implementation integrates well with PGVector: when using PGVector,
you already have a SQL instance running, and `SQLDocStore` is a
convenient way of using this instance to store documents associated with
vectors. An integration example with ParentDocumentRetriever and
PGVector is provided in docs/docs/integrations/stores/sql.ipynb or
[here](https://github.com/gcheron/langchain/blob/sql-store/docs/docs/integrations/stores/sql.ipynb).
- It persists `str` and `Document` objects but can be easily extended
(see the usage sketch below).

 **Issue:**

Provide an easy SQL alternative to `InMemoryStore`.
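An illustrative sketch only: the import path and constructor arguments below are assumptions, but the `mset`/`mget` calls follow the standard `BaseStore` interface:

```python
from langchain_community.storage import SQLDocStore  # assumed import path
from langchain_core.documents import Document

docstore = SQLDocStore(
    collection_name="parent_docs",  # assumed parameter names
    connection_string="postgresql+psycopg2://user:pass@localhost:5432/db",
)
docstore.mset([("doc-1", Document(page_content="hello"))])
print(docstore.mget(["doc-1"]))
```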

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
6 months ago
dudgeon 26b2ad6d5b
Fixed typo on quickstart.ipynb (#16482)
- **Description:** Quick typo fix: `inpect` >> `inspect`
  - **Issue:** N/A
  - **Dependencies:** any dependencies required for this change,
  - **Twitter handle:** @geoffdudgeon
6 months ago
Eugene Yurtsev 39d1cbfecf
Docs: Document astream_events API (#16300)
Document astream events API
6 months ago
Florian MOREL 4b7969efc5
community[minor]: New documents loader for visio files (with extension .vsdx) (#16171)
**Description** : New documents loader for visio files (with extension
.vsdx)

A [visio file](https://fr.wikipedia.org/wiki/Microsoft_Visio) (with
extension .vsdx) is associated with Microsoft Visio, a diagram creation
software. It stores information about the structure, layout, and
graphical elements of a diagram. This format facilitates the creation
and sharing of visualizations in areas such as business, engineering,
and computer science.

A Visio file can contain multiple pages. Some of them may serve as the
background for others, and this can occur across multiple layers. This
loader extracts the textual content from each page and its associated
pages, enabling the extraction of all visible text from each page,
similar to what an OCR algorithm would do.

**Dependencies** : xmltodict package
6 months ago
KhoPhi fb41b68ea1
docs: Update with LCEL examples to Ollama & ChatOllama Integration notebook (#16194)
- **Description:** Updated the Chat/Ollama docs notebook with LCEL chain
examples (a minimal example follows this list)

- **Issue:**  #15664 I'm a new contributor 😊

- **Dependencies:** No dependencies

- **Twitter handle:** 
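For reference, a minimal LCEL chain of the kind added to the notebook (assumes an Ollama server running locally with the `llama2` model pulled):

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | ChatOllama(model="llama2") | StrOutputParser()

print(chain.invoke({"topic": "space"}))

# Or stream token by token:
for chunk in chain.stream({"topic": "space"}):
    print(chunk, end="", flush=True)
```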

Comments:

- How do I truncate the output of the stream in the notebook when it
goes on and on, even for the most basic prompts?

Edit:

Looking forward to feedback @baskaryan

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
6 months ago
Michael Gorham 3b0226b2c6
docs: Update redis_chat_message_history.ipynb (#16344)
## Problem
Spent several hours trying to figure out how to pass
`RedisChatMessageHistory` as a `GetSessionHistoryCallable` with a
different REDIS hostname. This example kept connecting to
`redis://localhost:6379`, but I wanted to connect to a server not hosted
locally.

## Cause
The docs assume the user knows how to implement `BaseChatMessageHistory`
and `GetSessionHistoryCallable`.

## Solution
Update the documentation to show how to explicitly set the Redis hostname
using a lambda function, much like the MongoDB and SQLite examples.
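A sketch of the documented fix; the Redis URL is a placeholder and the chain itself is incidental:

```python
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

REDIS_URL = "redis://my-redis-host:6379/0"  # any reachable Redis server

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
    ]
)
chain = prompt | ChatOpenAI()

chain_with_history = RunnableWithMessageHistory(
    chain,
    # Construct the history with an explicit URL instead of the localhost default.
    lambda session_id: RedisChatMessageHistory(session_id, url=REDIS_URL),
    input_messages_key="input",
    history_messages_key="history",
)
chain_with_history.invoke(
    {"input": "Hi!"}, config={"configurable": {"session_id": "user-42"}}
)
```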
6 months ago
Ian c98994c3c9
docs: Improve notebook to show how to use tidb to store history messages (#16420)
After merging [PR
#16304](https://github.com/langchain-ai/langchain/pull/16304), I
realized that our notebook example for integrating TiDB with LangChain
was too basic. To make it more useful and user-friendly, I plan to
create a detailed example. This will show how to use TiDB for saving
history messages in LangChain, offering a clearer, more practical guide
for our users.
6 months ago
Eugene Yurtsev c88750d54b
Docs: Agent streaming notebooks (#15858)
Update information about streaming in the agents section. Show how to
use astream_events to get token by token streaming.
6 months ago
Eugene Yurtsev e5672bc944
docs: Re-write custom agent to show how to write a tools agent (#15907)
Shows how to write a tools agent rather than a functions agent.
6 months ago
Boris Feld 404abf139a
community: Add CometLLM tracing context var (#15765)
I also added the `LANGCHAIN_COMET_TRACING` environment variable to enable
the CometLLM tracing integration, similar to other tracing integrations.
This makes it easier for end users to enable than importing the callback
and passing it manually.

(This is the same content as
https://github.com/langchain-ai/langchain/pull/14650 but rebased and
squashed as something seems to confuse Github Action).
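A minimal sketch of what the flag enables (variable name from this PR; setting it to "true" is an assumption, any tracing-style toggle value should behave similarly):

```python
import os

# Turn on CometLLM tracing globally instead of passing the callback by hand.
os.environ["LANGCHAIN_COMET_TRACING"] = "true"

from langchain_openai import OpenAI

llm = OpenAI(temperature=0.2)
llm.invoke("Tell me a fun fact about llamas.")  # traced to CometLLM automatically
```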
6 months ago
Jennifer Melot d6275e47f2
docs: Updated integration docs structure for tools/arxiv (#16091) (#16250)
- **Description:** Updated docs for tools/arxiv to use `AgentExecutor`
and `invoke`
  - **Issue:** #15664
  - **Dependencies:** None
  - **Twitter handle:** None
6 months ago
ChengZi a950fa0487
docs: add milvus multitenancy doc (#16177)
- **Description:** add milvus multitenancy doc, it is an example for
this [pr](https://github.com/langchain-ai/langchain/pull/15740) .
  - **Issue:** No,
  - **Dependencies:** No,
  - **Twitter handle:** No

Signed-off-by: ChengZi <chen.zhang@zilliz.com>
6 months ago
parkererickson-tg b26a22f307
community[minor]: add TigerGraph support (#16280)
**Description:** Add support for querying TigerGraph databases through
the InquiryAI service.
**Issue**: N/A
**Dependencies:** N/A
**Twitter handle:** @TigerGraphDB
6 months ago
Christophe Bornet 8da34118bc
docs: Add documentation for Cassandra Document Loader (#16282) 6 months ago
Jonathan Algar 774e543e1f
docs: fix formatting issue in rockset.ipynb (#16328)
**Description:** randomly discovered while working on another PR
https://github.com/quarto-dev/quarto-cli/discussions/8131#discussioncomment-8027706

@anubhav94N ICYI
6 months ago
Ian b9f5104e6c
community[minor]: Store Message History to TiDB Database (#16304)
This pull request integrates the TiDB database into LangChain for
storing message history, marking one of several steps towards a
comprehensive integration of TiDB with LangChain.


A simple usage
```python
from datetime import datetime
from langchain_community.chat_message_histories import TiDBChatMessageHistory

history = TiDBChatMessageHistory(
    connection_string="mysql+pymysql://<host>:<PASSWORD>@<host>:4000/<db>?ssl_ca=/etc/ssl/cert.pem&ssl_verify_cert=true&ssl_verify_identity=true",
    session_id="code_gen",
    earliest_time=datetime.utcnow(),  # Optional to set earliest_time to load messages after this time point.
)

history.add_user_message("hi! How's feature going?")
history.add_ai_message("It's almot done")
```
6 months ago
Sarthak Chaure dd5b8107b1
Docs: Updated callbacks/index.mdx (#16404)
The callbacks getting-started demo code was updated, replacing the
`chain.run()` call (which is now deprecated) with the updated
`chain.invoke()` call.
Solves the following issue: #16379
Twitter/X : @Hazxhx
6 months ago
Omar-aly 873de14cd8
docs: update vectorstores/llm_rails integration doc (#16199)
Description:
- Updated the docs for the vectorstores integration module
llm_rails.ipynb

Issue:
- [Connected to Issue
#15664](https://github.com/langchain-ai/langchain/issues/15664)
 
Dependencies:
- N/A

Co-authored-by: omaraly23 <112936089+omaraly22@users.noreply.github.com>
6 months ago