Commit Graph

11138 Commits (3555882a0de152fd621c44d9f3fafba6386a0f1c)
 

Author SHA1 Message Date
Isaac Francisco d40bdd6257
docs: more indexing of document loaders (#25500)
Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
1 month ago
Bagatur 8a71f1b41b
core[minor]: add langsmith document loader (#25493)
needs tests
1 month ago
Bob Merkus 8e3e532e7d
docs: ollama doc update (toolcalling, install, notebook examples) (#25549)
The new `langchain-ollama` package seems pretty well implemented, but I
noticed the docs were still outdated, so I decided to fix them up a bit.

- Llama 3.1 was released on the 23rd of July:
https://ai.meta.com/blog/meta-llama-3-1/
- Ollama has supported tool calling since the 25th of July:
https://ollama.com/blog/tool-support
- The LangChain Ollama partner package was released on the 1st of August:
https://pypi.org/project/langchain-ollama/

**Problem**: The docs reference langchain-community instead of langchain-ollama.

**Solution**: Update the docs at
https://python.langchain.com/v0.2/docs/integrations/chat/ollama/


**Problem**: OllamaFunctions is deprecated, as noted on
[Integrations](https://python.langchain.com/v0.2/docs/integrations/chat/ollama_functions/):
This was an experimental wrapper that attempts to bolt-on tool calling
support to models that do not natively support it. The [primary Ollama
integration](https://python.langchain.com/v0.2/docs/integrations/chat/ollama/) now
supports tool calling, and should be used instead.

**Solution**: Delete the old notebook from the repo and update the existing
one with @tool decorator + Pydantic examples.


**Problem**: Llama3.1 has been released, but the notebooks still reference
the llama3-groq-tool-call fine-tune.

**Solution**: Update the docs + notebooks to Llama3.1 (which has improved
tool calling support)


**Problem**: The install instructions are incomplete; there is no
information on downloading a model and/or running the Ollama server.

**Solution**: Add simple instructions to start the Ollama service and
pull a model (for tool calling)
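
A minimal sketch (model name and tool are illustrative, not taken from this
PR) of tool calling with the `langchain-ollama` partner package, along the
lines the updated docs describe:

```python
# Assumes the Ollama server is running and `ollama pull llama3.1` was done.
from langchain_core.tools import tool
from langchain_ollama import ChatOllama


@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


llm = ChatOllama(model="llama3.1")
llm_with_tools = llm.bind_tools([add])
response = llm_with_tools.invoke("What is 2 + 3?")
print(response.tool_calls)
```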

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
1 month ago
Jabir 12e490ea56
Update azuresearch.py (#25577)
This will allow complex-type metadata to be returned. The current
implementation throws an error when dealing with nested metadata.


---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
1 month ago
Abraham Omorogbe 498a482e76
docs: Adding Azure Database for PostgreSQL docs (#25560)
This PR adds documentation showing support for the Azure Database for
PostgreSQL vector store and memory.

[Azure Database for PostgreSQL - Flexible
Server](https://learn.microsoft.com/en-us/azure/postgresql/flexible-server/service-overview)
[Azure Database for PostgreSQL pgvector
extension](https://learn.microsoft.com/en-us/azure/postgresql/flexible-server/how-to-use-pgvector)

**Description:** Added vector store and memory usage documentation for
Azure Database for PostgreSQL
 **Twitter handle:** [@_aiabe](https://x.com/_aiabe)

---------

Co-authored-by: Abeomor <{ID}+{username}@users.noreply.github.com>
1 month ago
Leonid Ganeline d324fd1821
docs: added Constitutional AI references (#25553)
Added reference to the source paper.
1 month ago
Bagatur 4bd005adb6
core[patch]: Allow bound models as token_counter in trim_messages (#25563) 1 month ago
Erick Friis e01c6789c4
core,community: add beta decorator to missed GraphVectorStore extensions (#25562) 1 month ago
Erick Friis dd2d094adc
infra: remove huggingface from ci tree (#25559) 1 month ago
Bagatur 6b98207eda
infra: test chat prompt ser/des (#25557) 1 month ago
ccurme c5bf114c0f
together, standard-tests: specify tool_choice in standard tests (#25548)
Here we allow standard tests to specify a value for `tool_choice` via a
`tool_choice_value` property, which defaults to None.

Chat models [available in
Together](https://docs.together.ai/docs/chat-models) have issues passing
standard tool calling tests:
- llama 3.1 models currently [appear to rely on user-side
parsing](https://docs.together.ai/docs/llama-3-function-calling) in
Together;
- Mixtral-8x7B and Mistral-7B (currently tested) consistently do not
call tools in some tests.

Specifying tool_choice also lets us remove an existing `xfail` and use a
smaller model in Groq tests.
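
A minimal sketch of how an integration might override the new property; the
test base class comes from the standard-tests package, while the return
value shown here is an assumption rather than what this PR set:

```python
# Sketch only: overriding tool_choice_value in a standard test class.
from langchain_standard_tests.integration_tests import ChatModelIntegrationTests
from langchain_together import ChatTogether


class TestChatTogetherStandard(ChatModelIntegrationTests):
    @property
    def chat_model_class(self):
        return ChatTogether

    @property
    def tool_choice_value(self):
        # Assumed value: force the model to call the provided tool.
        return "any"
```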
1 month ago
maang-h 015ab91b83
community[patch]: Add ToolMessage for ChatZhipuAI (#25547)
- **Description:** Add ToolMessage for `ChatZhipuAI` to solve the issue
#25490
1 month ago
ccurme 5a3aaae6dc
groq[patch]: update model used for llama tests (#25542)
`llama-3.1-8b-instant` often fails some of the tool calling standard
tests. Here we update to `llama-3.1-70b-versatile`.
1 month ago
Mohammad Mohtashim 75c3c81b8c
[Community]: Fix - Open AI Whisper `client.audio.transcriptions` returning Text Object which raises error (#25271)
- **Description:** The following
[line](fd546196ef/libs/community/langchain_community/document_loaders/parsers/audio.py (L117))
in `OpenAIWhisperParser` returns a plain text object, despite the official
documentation saying it should return a `Transcript` instance, which has the
text attribute. For the example given in the issue, and when I ran it myself,
I got the text directly. This small PR accounts for that.
 - **Issue:** #25218


I was able to replicate the error even without the GenericLoader, as shown
below; the issue was with `OpenAIWhisperParser`:

```python
# Import paths may vary slightly between langchain versions.
from langchain_community.document_loaders.parsers.audio import OpenAIWhisperParser
from langchain_core.documents.base import Blob

parser = OpenAIWhisperParser(
    api_key="sk-fxxxxxxxxx",
    response_format="srt",
    temperature=0,
)

list(parser.lazy_parse(Blob.from_path("path_to_file.m4a")))
```
1 month ago
Thin red line 未来产品经理 0f7b8adddf
fix issue: cannot use document_variable_name to override context in create_stuff_documents_chain (#25531)

- **Description:** Add `document_variable_name` to the `_validate_prompt`
function in `create_stuff_documents_chain`.
- **Issue:** According to the docstring of `create_stuff_documents_chain`,
the `document_variable_name` parameter can be used to override the "context"
key in the prompt. However, `_validate_prompt` still used `DOCUMENTS_KEY`
(whose value is always "context") to check whether the prompt is valid, so
even when a user overrode the key via `document_variable_name`, the code
still looked for "context" in the prompt and raised an error. This change
validates against `document_variable_name` instead; its default value is
"context", the same as `DOCUMENTS_KEY`, but it can now be overridden by
users.
- **Dependencies:** none
- **Twitter handle:** https://x.com/xjr199703
- **Add tests and docs:** none
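
A minimal sketch (prompt text and model are illustrative, not from this PR)
of the override this fix enables:

```python
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI  # any chat model works here

prompt = ChatPromptTemplate.from_template(
    "Summarize the following documents:\n\n{source_docs}"
)
llm = ChatOpenAI(model="gpt-4o-mini")
# With the fix, validation checks for "source_docs" instead of the
# hard-coded "context" key, so this no longer raises an error.
chain = create_stuff_documents_chain(
    llm, prompt, document_variable_name="source_docs"
)
```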



---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
1 month ago
ccurme 09c0823c3a
docs: update summarization guides (#25408) 1 month ago
maang-h 32f5147523
docs: Fix QianfanLLMEndpoint and Tongyi input text (#25529)
- **Description:** Fix `QianfanLLMEndpoint` and `Tongyi` input text.
1 month ago
ZhangShenao 4255a30f20
Improvement[Community] Improve api doc for `SingleFileFacebookMessengerChatLoader` (#25536)
Deleted redundant args in the API doc.
1 month ago
Bagatur 49dea06af1
docs: fix Agent deprecation msg (#25464) 1 month ago
Hassan El Mghari 937b3904eb
together[patch]: update base url (#25524)
Updated the Together base URL from `.ai` to `.xyz` since some customers
have reported problems with `.ai`.
1 month ago
gbaian10 bda3becbe7
docs: add prompt to install beautifulsoup4. (#25518)
fix: #25482

- **Description:**
Add a prompt to install beautifulsoup4 in places where `from
langchain_community.document_loaders import WebBaseLoader` is used.
- **Issue:** #25482
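
For reference, a small sketch of the usage the docs now annotate with the
extra dependency (the URL is illustrative):

```python
# Requires: pip install -qU beautifulsoup4
from langchain_community.document_loaders import WebBaseLoader

loader = WebBaseLoader("https://example.com")  # illustrative URL
docs = loader.load()
```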
1 month ago
gbaian10 f6e6a17878
docs: add prompt to install nltk (#25519)
fix: #25473 

- **Description:** add prompt to install nltk
- **Issue:** #25473
1 month ago
Chengzu Ou c1bd4e05bc
docs: fix Databricks Vector Search demo notebook (#25504)
**Description:** This PR fixes an issue in the demo notebook of
Databricks Vector Search in "Work with Delta Sync Index" section.

**Issue:** N/A

**Dependencies:** N/A

---------

Co-authored-by: Chengzu Ou <chengzu.ou@databrick.com>
Co-authored-by: Erick Friis <erick@langchain.dev>
1 month ago
Isaac Francisco a2e90a5a43
add embeddings integration tests (#25508) 1 month ago
Bagatur a06818a654
openai[patch]: update core dep (#25502) 1 month ago
Bagatur df98552b6f
core[patch]: Release 0.2.33 (#25498) 1 month ago
ccurme b83f1eb0d5
core, partners: implement standard tracing params for LLMs (#25410) 1 month ago
Bagatur 9f0c76bf89
openai[patch]: Release 0.1.22 (#25496) 1 month ago
ccurme 01ecd0acba
openai[patch]: fix json mode for Azure (#25488)
https://github.com/langchain-ai/langchain/issues/25479
https://github.com/langchain-ai/langchain/issues/25485
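
A minimal sketch (deployment name and API version are assumptions) of the
Azure JSON-mode path this fix touches; JSON mode is requested here by binding
the OpenAI `response_format` parameter:

```python
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_deployment="my-gpt-4o",        # assumed deployment name
    api_version="2024-05-01-preview",    # assumed API version
)
json_llm = llm.bind(response_format={"type": "json_object"})
json_llm.invoke("Return a JSON object with keys 'city' and 'country' for Paris.")
```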

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
1 month ago
Eugene Yurtsev 1fd1c1dca5
docs: use .invoke rather than __call__ in openai integration notebook (#25494)
Documentation should use `.invoke()`.
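
A small sketch (model name assumed) of the change in style:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # assumed model name
llm.invoke("Hello!")  # preferred
# llm("Hello!")       # deprecated __call__ style the docs moved away from
```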
1 month ago
Bagatur 253ceca76a
docs: fix mimetype parser docstring (#25463) 1 month ago
Eugene Yurtsev e18511bb22
core[minor], anthropic[patch]: Upgrade @root_validator usage to be consistent with pydantic 2 (#25457)
anthropic: Upgrade `@root_validator` usage to be consistent with
pydantic 2
core: support looking up multiple keys from env in from_env factory
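
A hedged sketch of the multi-key lookup mentioned for the `from_env` factory;
the field and environment-variable names are illustrative:

```python
import os

from langchain_core.utils import from_env
from pydantic import BaseModel, Field


class ClientSettings(BaseModel):
    # Falls back through the listed environment variables in order.
    api_key: str = Field(default_factory=from_env(["MY_API_KEY", "MY_TOKEN"], default=""))


os.environ["MY_TOKEN"] = "secret"
print(ClientSettings().api_key)  # -> "secret"
```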
1 month ago
Eugene Yurtsev 34da8be60b
pinecone[patch]: Upgrade @root_validators to be consistent with pydantic 2 (#25453)
Upgrade root validators for pydantic 2 migration
1 month ago
Eugene Yurtsev b297af5482
voyageai[patch]: Upgrade root validators for pydantic 2 (#25455)
Update @root_validators to be consistent with pydantic 2 semantics
1 month ago
Eugene Yurtsev 4cdaca67dc
ai21[patch]: Upgrade @root_validators for pydantic 2 migration (#25454)
Upgrade @root_validators usage to match pydantic 2 semantics
1 month ago
Eugene Yurtsev d72a08a60d
groq[patch]: Update root validators for pydantic 2 migration (#25402) 1 month ago
Leonid Ganeline 8eb63a609e
docs: `arxiv` page update (#25450)
Added arXiv papers that use `LangGraph` or `LangSmith`. Improved the
page formatting.
1 month ago
Isaac Francisco 5150ec3a04
[experimental]: minor fix to open assistants code (#24682) 1 month ago
Bagatur 2b4fbcb4b4
docs: format oai embeddings docstring (#25448) 1 month ago
Eugene Yurtsev eb3870e9d8
fireworks[patch]: Upgrade @root_validators to be pydantic 2 compliant (#25443)
Update @root_validators to be pydantic 2 compliant
1 month ago
William FH 75ae585deb
Merge support for group manager (#25360) 1 month ago
Eugene Yurtsev b7c070d437
docs[patch]: Update code that checks API keys (#25444)
Check whether the API key is already in the environment

Update:

```python
import getpass
import os

os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
os.environ["DATABRICKS_TOKEN"] = getpass.getpass("Enter your Databricks access token: ")
```

To:

```python
import getpass
import os

os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
if "DATABRICKS_TOKEN" not in os.environ:
    os.environ["DATABRICKS_TOKEN"] = getpass.getpass(
        "Enter your Databricks access token: "
    )
```

grit migration:

```
engine marzano(0.1)
language python

`os.environ[$Q] = getpass.getpass("$X")` as $CHECK where {
    $CHECK <: ! within if_statement(),
    $CHECK => `if $Q not in os.environ:\n    $CHECK`
}
```
1 month ago
Bagatur 60b65528c5
docs: fix api ref mod links in pkg page (#25447) 1 month ago
Eugene Yurtsev 2ef9d12372
mistralai[patch]: Update more @root_validators for pydantic 2 compatibility (#25446)
Update @root_validators in mistralai integration for pydantic 2 compatibility
1 month ago
Eugene Yurtsev 6910b0b3aa
docs[patch]: Fix integration notebook for Fireworks llm (#25442)
Fix integration notebook
1 month ago
Eugene Yurtsev 831708beb7
together[patch]: Update @root_validator for pydantic 2 compatibility (#25423)
This PR updates usage of @root_validator to be compatible with pydantic 2.
1 month ago
Eugene Yurtsev a114255b82
ai21[patch]: Update @root_validators for pydantic2 migration (#25401)
Update @root_validators for pydantic 2 migration.
1 month ago
Eugene Yurtsev 6f68c8d6ab
mistralai[patch]: Update root validator for compatibility with pydantic 2 (#25403) 1 month ago
ccurme 8afbab4cf6
langchain[patch]: deprecate various chains (#25310)
- [x] NatbotChain: move to community, deprecate langchain version.
Update to use `prompt | llm | output_parser` instead of LLMChain.
- [x] LLMMathChain: deprecate + add langgraph replacement example to API
ref
- [x] HypotheticalDocumentEmbedder (retriever): update to use `prompt |
llm | output_parser` instead of LLMChain
- [x] FlareChain: update to use `prompt | llm | output_parser` instead
of LLMChain
- [x] ConstitutionalChain: deprecate + add langgraph replacement example
to API ref
- [x] LLMChainExtractor (document compressor): update to use `prompt |
llm | output_parser` instead of LLMChain
- [x] LLMChainFilter (document compressor): update to use `prompt | llm
| output_parser` instead of LLMChain
- [x] RePhraseQueryRetriever (retriever): update to use `prompt | llm |
output_parser` instead of LLMChain
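
A minimal sketch of the `prompt | llm | output_parser` pattern used
throughout the checklist above (prompt text and model are illustrative):

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Rephrase this question: {question}")
llm = ChatOpenAI(model="gpt-4o-mini")  # any chat model works here
chain = prompt | llm | StrOutputParser()
chain.invoke({"question": "whats the weather like in sf"})
```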
1 month ago
Luke 66e30efa61
experimental: Fix divide by 0 error (#25439)
Within the semantic chunker, when calling `_threshold_from_clusters`
there is the possibility for a divide by 0 error if the
`number_of_chunks` is equal to the length of `distances`.

The fix simply adds a check for when these values match, preventing the
error and allowing chunking to continue.
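
An illustrative, heavily simplified sketch of the guard described above (the
function name and formula are placeholders, not the actual
`langchain_experimental` code):

```python
def threshold_from_clusters(distances: list[float], number_of_chunks: int) -> float:
    # Guard: interpolating between number_of_chunks and len(distances)
    # would divide by zero when the two are equal.
    if number_of_chunks == len(distances):
        return 0.0  # degenerate case, skip the interpolation
    return sum(distances) / (len(distances) - number_of_chunks)  # placeholder formula
```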
1 month ago