Commit Graph

388 Commits (82b5bdc7a1509243c6b4e3161414e9530a15dbcf)

Author SHA1 Message Date
Erick Friis 92969d49cb
multiple: remove external repo mds (#20896)
api docs build doesn't tolerate them
5 months ago
YISH ed26149a29
openai[patch]: Allow disabling safe_len_embeddings (OpenAIEmbeddings) (#19743)
An OpenAI API-compatible server may not support `safe_len_embedding`; use `disable_safe_len_embeddings=True` to disable it.
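
A minimal sketch of the intended usage against an OpenAI-compatible server, taking the flag name from the description above (the released parameter name may differ):

```python
from langchain_openai import OpenAIEmbeddings

# Flag name taken from this commit message; treat it as illustrative only.
embeddings = OpenAIEmbeddings(
    base_url="http://localhost:8000/v1",  # an OpenAI API-compatible server
    api_key="not-needed-for-local-server",
    disable_safe_len_embeddings=True,     # skip the tokenized safe-length embedding path
)

vectors = embeddings.embed_documents(["hello world"])
```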

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Sean 540f384197
partner: Upstage quick documentation update (#20869)
* Updated the provider docs page. The RAG example was meant to be moved to
the cookbook, but was merged by mistake.

* Fix bug in Groundedness Check

---------

Co-authored-by: JuHyung-Son <sonju0427@gmail.com>
Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
Erick Friis 5da9dd1195
mistral: comment batching param (#20868)
Addresses #20523
5 months ago
ccurme 481d3855dc
patch: remove usage of llm, chat model __call__ (#20788)
- `llm(prompt)` -> `llm.invoke(prompt)`
- `llm(prompt=prompt)` -> `llm.invoke(prompt)` (same with `messages=`)
- `llm(prompt, callbacks=callbacks)` -> `llm.invoke(prompt,
config={"callbacks": callbacks})`
- `llm(prompt, **kwargs)` -> `llm.invoke(prompt, **kwargs)`
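
A minimal before/after sketch of the migration, assuming `langchain_openai.OpenAI` as the example model:

```python
from langchain_openai import OpenAI

llm = OpenAI()  # any LLM or chat model works the same; assumes OPENAI_API_KEY is set

# Before (deprecated __call__ style):
# response = llm("Tell me a joke", callbacks=callbacks)

# After:
response = llm.invoke("Tell me a joke")
response = llm.invoke("Tell me a joke", config={"callbacks": []})  # callbacks now go through `config`
```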
5 months ago
Erick Friis 1aef8116de
upstage: release 0.1.1 (#20864) 5 months ago
junkeon c8fd51e8c8
upstage: Add Upstage partner package LA and GC (#20651)
---------

Co-authored-by: Sean <chosh0615@gmail.com>
Co-authored-by: Erick Friis <erick@langchain.dev>
Co-authored-by: Sean Cho <sean@upstage.ai>
5 months ago
back2nix a1614b88ac
groq[patch]: groq proxy support (#20758)
# Proxy Fix for Groq Class 🐛 🚀

## Description
This PR fixes a bug in the proxy settings of the `Groq` class, allowing
users to reach the Groq API through a proxy.

## Changes Made
-  Fixed support for specifying proxy settings in the `Groq` class.
-  Resolved the bug causing issues with proxy settings.
-  Did not include unit tests or documentation updates.
-  Did not run `make format`, `make lint`, or `make test`; I don't program
in Python and couldn't get `ruff` to run.
-  Ensured that the changes are backwards compatible.
-  No additional dependencies were added to `pyproject.toml`.

### Error Before Fix
```python
Traceback (most recent call last):
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/main.py", line 9, in <module>
    chat = ChatGroq(
           ^^^^^^^^^
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/venv310/lib/python3.11/site-packages/langchain_core/load/serializable.py", line 120, in __init__
    super().__init__(**kwargs)
  File "/home/bg/Documents/code/github.com/back2nix/test/groq/venv310/lib/python3.11/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for ChatGroq
__root__
  Invalid `http_client` argument; Expected an instance of `httpx.AsyncClient` but got <class 'httpx.Client'> (type=type_error)
```

### Example usage after fix
```python
import os

import httpx
from langchain_core.prompts import ChatPromptTemplate
from langchain_groq import ChatGroq

chat = ChatGroq(
    temperature=0,
    groq_api_key=os.environ.get("GROQ_API_KEY"),
    model_name="mixtral-8x7b-32768",
    http_client=httpx.Client(
        proxies="socks5://127.0.0.1:1080",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
    http_async_client=httpx.AsyncClient(
        proxies="socks5://127.0.0.1:1080",
        # AsyncClient needs an async transport, not httpx.HTTPTransport
        transport=httpx.AsyncHTTPTransport(local_address="0.0.0.0"),
    ),
)

system = "You are a helpful assistant."
human = "{text}"
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])

chain = prompt | chat
out = chain.invoke({"text": "Explain the importance of low latency LLMs"})

print(out)
```

---------

Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
5 months ago
Erick Friis 8c95ac3145
docs, multiple: de-beta with_structured_output (#20850) 5 months ago
ccurme 3bcfbcc871
groq: handle null queue_time (#20839) 5 months ago
ccurme 6debadaa70
groq: bump core (#20838) 5 months ago
Erick Friis 7984206c95
groq: release 0.1.3 (#20836)
Fixes #20811
5 months ago
ccurme 06b04b80b8
groq: fix warning filter for integration test (#20806) 5 months ago
ccurme 5a3c65a756
standard tests: add xfails (#20659) 5 months ago
ccurme 6622829c67
mistral: catch GatedRepoError, release 0.1.3 (#20802)
https://github.com/langchain-ai/langchain/issues/20618

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
ccurme 7a922f3e48
core, openai: support custom token encoders (#20762) 5 months ago
Bagatur eb18f4e155
infra: rm sep repo partner dirs (#20756)
so you can `poetry run pip install -e libs/partners/*/` to your heart's
content
5 months ago
ccurme c010ec8b71
patch: deprecate (a)get_relevant_documents (#20477)
- `.get_relevant_documents(query)` -> `.invoke(query)`
- `.get_relevant_documents(query=query)` -> `.invoke(query)`
- `.get_relevant_documents(query, callbacks=callbacks)` ->
`.invoke(query, config={"callbacks": callbacks})`
- `.get_relevant_documents(query, **kwargs)` -> `.invoke(query,
**kwargs)`
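
A minimal before/after sketch of the retriever migration, using a toy in-memory retriever as a stand-in for any `BaseRetriever`:

```python
from typing import List

from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever


class ToyRetriever(BaseRetriever):
    """Tiny stand-in retriever used only to illustrate the new call style."""

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> List[Document]:
        return [Document(page_content=f"dummy result for: {query}")]


retriever = ToyRetriever()

# Before (now deprecated):
# docs = retriever.get_relevant_documents("What is LangChain?", callbacks=callbacks)

# After:
docs = retriever.invoke("What is LangChain?")
docs = retriever.invoke("What is LangChain?", config={"callbacks": []})
```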

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
Mateusz Szewczyk 75ffe51bbe
ibm: Add support for Embedding Models (#20647)
---------

Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
ccurme 6d530481c1
openai: fix allowed block types (#20636) 5 months ago
Erick Friis 5c216ad08f
upstage[patch]: un-xfail tool calling test, release 0.1.0 (#20635) 5 months ago
Eugene Yurtsev 718c9cbe3a
mistral[patch]: Support both model and model_name (#20557) 5 months ago
Eugene Yurtsev 8c29b7bf35
mistralai[patch]: Use public attribute for eventsource.response (#20580)
Minor change, use the public attribute instead of the protected one.
5 months ago
Erick Friis e7e94b37f1
upstage: fix core dep (#20576) 5 months ago
Erick Friis f09bd0b75b
upstage: init package (#20574)
Co-authored-by: Sean Cho <sean@upstage.ai>
Co-authored-by: JuHyung-Son <sonju0427@gmail.com>
5 months ago
Bagatur 54e9271504
anthropic[patch]: fix msg mutation (#20572) 5 months ago
Bagatur 984e7e36c2
anthropic[patch]: Release 0.1.10 (#20568) 5 months ago
ccurme 2238490069
mistral, openai: allow anthropic-style messages in message histories (#20565) 5 months ago
Eugene Yurtsev 7a7851aa06
anthropic[patch]: Handle empty text block (#20566)
Handle empty text block
5 months ago
ccurme 4a17951900
mistral: read tool calls from AIMessage (#20554)
Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
5 months ago
Eugene Yurtsev f257909699
mistralai[patch]: Surface http errors (#20555)
Do not swallow errors when streaming with httpx.

Update the affected code if this PR to httpx-sse gets merged:
https://github.com/florimondmanca/httpx-sse/pull/25/files
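
Not the package's actual change, just a minimal sketch of the general pattern with plain httpx: raise on an error status instead of silently iterating the stream.

```python
import httpx


def stream_completion(url: str, payload: dict):
    """Yield raw lines from a streaming endpoint, surfacing HTTP errors instead of swallowing them."""
    with httpx.Client() as client:
        with client.stream("POST", url, json=payload) as response:
            # Raise httpx.HTTPStatusError for 4xx/5xx instead of iterating an error body.
            response.raise_for_status()
            for line in response.iter_lines():
                yield line
```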
5 months ago
Erick Friis e7fe5f7d3f
anthropic[patch]: serialization in partner package (#18828) 5 months ago
Bagatur f74d5d642e
anthropic[patch]: bump to core 0.1.43 (#20537) 5 months ago
Bagatur 96d8769eae
anthropic[patch]: release 0.1.9, use tool calls if content is empty (#20535) 5 months ago
ccurme 22da9f5f3f
update scheduled tests (#20526)
repurpose scheduled tests to test over provider packages
5 months ago
Fayfox 9fd36efdb5
anthropic[patch]: env ANTHROPIC_API_URL not work (#20507)
The environment variable ANTHROPIC_API_URL will not take effect if
anthropic_api_url has a default value.
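
A minimal sketch of the behavior this fixes, assuming `ChatAnthropic` from `langchain_anthropic` (and ANTHROPIC_API_KEY set in the environment):

```python
import os

from langchain_anthropic import ChatAnthropic

# With the fix, the environment variable is honored even though the
# anthropic_api_url field has a default value.
os.environ["ANTHROPIC_API_URL"] = "https://my-anthropic-proxy.example.com"

llm = ChatAnthropic(model="claude-3-sonnet-20240229")
# Before the fix this would silently fall back to the default Anthropic endpoint.
```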

---------

Co-authored-by: Eugene Yurtsev <eugene@langchain.dev>
5 months ago
Bagatur f7667c614b
docs: update tool use case (#20404) 5 months ago
ccurme 4b6b0a87b6
groq[patch]: Make stream robust to ToolMessage (#20417)
```python
from langchain.agents import AgentExecutor, create_tool_calling_agent, tool
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_groq import ChatGroq


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        MessagesPlaceholder("agent_scratchpad"),
    ]
)

model = ChatGroq(model_name="mixtral-8x7b-32768", temperature=0)

@tool
def magic_function(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2

tools = [magic_function]


agent = create_tool_calling_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

agent_executor.invoke({"input": "what is the value of magic_function(3)?"})
```
```
> Entering new AgentExecutor chain...

Invoking: `magic_function` with `{'input': 3}`


5The value of magic\_function(3) is 5.

> Finished chain.
{'input': 'what is the value of magic_function(3)?',
 'output': 'The value of magic\\_function(3) is 5.'}
```
5 months ago
aditya thomas 4f75b230ed
partner[ai21]: masking of the api key for ai21 models (#20257)
**Description:** Masking of the API key for AI21 models
**Issue:** Fixes #12165 for AI21
**Dependencies:** None

Note: This fix came in originally through #12418 but was possibly missed
in the refactor to the AI21 partner package
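
Not tied to the AI21 package's exact field names, just a minimal sketch of the masking pattern: the key is stored as a pydantic `SecretStr`, so it does not leak into reprs or logs.

```python
from pydantic import BaseModel, SecretStr


class FakeAI21Config(BaseModel):
    """Hypothetical stand-in showing how a masked API key behaves."""

    api_key: SecretStr


config = FakeAI21Config(api_key="super-secret-key")
print(config)                             # api_key=SecretStr('**********')
print(config.api_key.get_secret_value())  # the real key, only when asked explicitly
```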


---------

Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
Erick Friis e6806a08d4
multiple: standard chat model tests (#20359) 5 months ago
Erick Friis ec0273fc92
chroma: release 0.1.0 (#20355) 5 months ago
Erick Friis da707d0755
chroma: remove relevance score int test (#20346)
deprecating feature in #20302
5 months ago
Bagatur 799714c629
release anthropic, fireworks, openai, groq, mistral (#20333) 5 months ago
ccurme 795c728f71
mistral[patch]: add IDs to tool calls (#20299)
Mistral gives us one ID per response, no individual IDs for tool calls.

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent, tool
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_mistralai import ChatMistralAI


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        MessagesPlaceholder("agent_scratchpad"),
    ]
)
model = ChatMistralAI(model="mistral-large-latest", temperature=0)

@tool
def magic_function(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2

tools = [magic_function]

agent = create_tool_calling_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

agent_executor.invoke({"input": "what is the value of magic_function(3)?"})
```

---------

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
5 months ago
Bagatur c706689413
openai[patch]: use tool_calls in request (#20272) 5 months ago
Erick Friis 0fa551c278
chroma: bump rc, keep optional (#20298) 5 months ago
Erick Friis 16f8fff14f
chroma: add required fastapi dep to restrict to <1 (#20297) 5 months ago
Erick Friis 991fd82532
chroma: add optional fastapi dep to restrict to <1 (#20295) 5 months ago
killind-dev f8a54d1d73
chroma: Add chroma partner package (#19292)
**Description:** Adds chroma to the partners package. Tests & code
mirror those in the community package.
**Dependencies:** None
**Twitter handle:** @akiradev0x

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
5 months ago
Yuki Oshima 12190ad728
openai[patch]: Fix langchain-openai unknown parameter error with gpt-4-turbo (#20271)
**Description:** 

I fixed the unknown-parameter error in langchain-openai when using gpt-4-turbo.

It seems that the behavior of the Chat Completions API implicitly
changed when using the latest gpt-4-turbo model, differing from previous
models. It now appears to reject parameters that are not listed in the
[API
Reference](https://platform.openai.com/docs/api-reference/chat/create).
I tracked down the resulting errors and fixed them.

**Issue:** https://github.com/langchain-ai/langchain/issues/20264

**Dependencies:** none

**Twitter handle:** https://twitter.com/oshima_123
5 months ago