Commit Graph

36 Commits

Author SHA1 Message Date
Eugene Yurtsev
f257909699
mistralai[patch]: Surface http errors (#20555)
Do not swallow errors when streaming with httpx.

Update the affected code if this PR to httpx-sse gets merged:
https://github.com/florimondmanca/httpx-sse/pull/25/files
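
The gist of the change can be sketched with plain httpx (a minimal illustration, not the actual patch; the endpoint and payload handling are placeholders): check the status before iterating the stream and raise instead of returning silently.

```python
import httpx


def stream_events(payload: dict) -> None:
    # Placeholder endpoint; the real client streams from the Mistral API.
    with httpx.Client() as client:
        with client.stream("POST", "https://api.example.com/v1/chat", json=payload) as response:
            if response.status_code != 200:
                response.read()  # load the body so the error detail is available
                response.raise_for_status()  # surface the HTTP error instead of swallowing it
            for line in response.iter_lines():
                print(line)
```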
2024-04-17 10:47:56 -04:00
Bagatur
799714c629
release anthropic, fireworks, openai, groq, mistral (#20333) 2024-04-11 09:19:52 -07:00
ccurme
795c728f71
mistral[patch]: add IDs to tool calls (#20299)
Mistral gives us one ID per response and no individual IDs for tool calls.

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent, tool
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_mistralai import ChatMistralAI


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        MessagesPlaceholder("agent_scratchpad"),
    ]
)
model = ChatMistralAI(model="mistral-large-latest", temperature=0)

@tool
def magic_function(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2

tools = [magic_function]

agent = create_tool_calling_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

agent_executor.invoke({"input": "what is the value of magic_function(3)?"})
```
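
Since the API returns only a single ID per response, individual tool-call IDs have to be synthesized client-side. A minimal sketch of that idea (a hypothetical helper, not the code from this PR):

```python
import uuid
from typing import Any, Dict, List


def assign_tool_call_ids(raw_tool_calls: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Attach a unique id to each parsed tool call when the API only
    provides one id for the whole response (illustrative sketch)."""
    return [
        {"name": call["name"], "args": call["args"], "id": uuid.uuid4().hex}
        for call in raw_tool_calls
    ]
```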

---------

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
2024-04-11 11:09:30 -04:00
Erick Friis
9eb6f538f0
infra, multiple: rc release versions (#20252) 2024-04-09 17:54:58 -07:00
Bagatur
0d0458d1a7
mistralai[patch]: Pre-release 0.1.2-rc.1 (#20251) 2024-04-10 00:25:38 +00:00
Bagatur
9514bc4d67
core[minor], ...: add tool calls message (#18947)
core[minor], langchain[patch], openai[minor], anthropic[minor], fireworks[minor], groq[minor], mistralai[minor]

```python
class ToolCall(TypedDict):
    name: str
    args: Dict[str, Any]
    id: Optional[str]

class InvalidToolCall(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    error: Optional[str]

class ToolCallChunk(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    index: Optional[int]


class AIMessage(BaseMessage):
    ...
    tool_calls: List[ToolCall] = []
    invalid_tool_calls: List[InvalidToolCall] = []
    ...


class AIMessageChunk(AIMessage, BaseMessageChunk):
    ...
    tool_call_chunks: Optional[List[ToolCallChunk]] = None
    ...
```
Important considerations:
- Parsing logic occurs within different providers;
- ~Changing output type is a breaking change for anyone doing explicit type checking;~
- ~Langsmith rendering will need to be updated: https://github.com/langchain-ai/langchainplus/pull/3561~
- ~Langserve will need to be updated~
- Adding chunks:
  - ~AIMessage + ToolCallsMessage = ToolCallsMessage if either has non-null .tool_calls.~
  - Tool call chunks are appended, merging when they have equal values of `index` (see the sketch after this list).
  - additional_kwargs accumulate the normal way.
- During streaming:
  - ~Messages can change types (e.g., from AIMessageChunk to AIToolCallsMessageChunk)~
  - Output parsers parse additional_kwargs (during .invoke they read off tool calls).
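
A minimal sketch of the chunk-merging rule from the list above (illustrative only; langchain_core's actual merge utilities may differ):

```python
from typing import List, Optional, TypedDict


class ToolCallChunk(TypedDict):
    name: Optional[str]
    args: Optional[str]
    id: Optional[str]
    index: Optional[int]


def merge_tool_call_chunks(
    left: List[ToolCallChunk], right: List[ToolCallChunk]
) -> List[ToolCallChunk]:
    """Append chunks from `right`, concatenating partial `args` strings when
    two chunks share the same non-null `index` (sketch of the merge rule)."""
    merged = [dict(chunk) for chunk in left]
    for chunk in right:
        match = next(
            (c for c in merged if c["index"] is not None and c["index"] == chunk["index"]),
            None,
        )
        if match is None:
            merged.append(dict(chunk))
        else:
            match["name"] = match["name"] or chunk["name"]
            match["args"] = (match["args"] or "") + (chunk["args"] or "")
            match["id"] = match["id"] or chunk["id"]
    return merged
```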

Packages outside of `partners/`:
- https://github.com/langchain-ai/langchain-cohere/pull/7
- https://github.com/langchain-ai/langchain-google/pull/123/files

---------

Co-authored-by: Chester Curme <chester.curme@gmail.com>
2024-04-09 18:41:42 -05:00
Bagatur
74d04a4e80
mistralai[patch]: Release 0.1.1 (#20239) 2024-04-09 21:53:01 +00:00
Erick Friis
855ba46f80
standard-tests: a standard unit and integration test set (#20182)
just chat models for now
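
As a hedged sketch, a partner package might opt in by subclassing the shared test base; the module path, class name, and property hooks below are assumptions, not necessarily the exact API introduced by this PR.

```python
# Hypothetical sketch of plugging a partner package into the shared test
# suite; base-class and property names here are assumptions.
from typing import Type

from langchain_core.language_models import BaseChatModel
from langchain_standard_tests.unit_tests import ChatModelUnitTests

from langchain_mistralai import ChatMistralAI


class TestMistralAIStandard(ChatModelUnitTests):
    @property
    def chat_model_class(self) -> Type[BaseChatModel]:
        return ChatMistralAI

    @property
    def chat_model_params(self) -> dict:
        return {"model": "mistral-large-latest"}
```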
2024-04-09 12:43:00 -07:00
Bagatur
3490d70238
mistralai[patch]: standardize model params (#20163)
Related to #20085
2024-04-08 11:48:38 -05:00
Bagatur
2f5606a318
mistralai[patch]: correct integration_test (#19774) 2024-03-29 21:47:35 +00:00
Pierre Véron
ace7b66261
mistralai[patch]: add missing _combine_llm_outputs implementation in ChatMistralAI (#18603)
# Description
Implementing `_combine_llm_outputs` in `ChatMistralAI` to override the
default implementation in `BaseChatModel`, which returns `{}`. The
implementation is inspired by the one in `ChatOpenAI` from the
`langchain-openai` package.
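
A rough sketch of the idea, summing token usage across batched generations (illustrative only; in practice this is a method on ChatMistralAI and the aggregated keys may differ):

```python
from typing import List, Optional


def combine_llm_outputs(llm_outputs: List[Optional[dict]]) -> dict:
    """Sum token usage across batched generations instead of returning {}."""
    overall_token_usage: dict = {}
    for output in llm_outputs:
        if not output:
            continue
        for key, value in output.get("token_usage", {}).items():
            overall_token_usage[key] = overall_token_usage.get(key, 0) + value
    return {"token_usage": overall_token_usage}
```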
# Issue
None
# Dependencies
None
# Twitter handle
None

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
2024-03-29 14:43:20 -07:00
Erick Friis
441a8012b3
mistralai[patch]: release 0.1.0 (#19540) 2024-03-25 17:29:40 -07:00
Erick Friis
b617085af0
mistralai[patch]: streaming tool calls (#19469) 2024-03-23 19:24:53 +00:00
aditya thomas
bc028294d0
docs: delete mistralai embeddings doc from incorrect location (#19432)
**Description:** Delete the MistralAIEmbeddings usage document from the folder
partners/mistralai/docs
**Issue:** The document already exists in the folder docs/docs
**Dependencies:** None
2024-03-22 14:02:59 -07:00
Erick Friis
11e37943ed
mistralai[patch]: fix core version (#19454) 2024-03-22 20:48:13 +00:00
Erick Friis
3b093160c4
mistralai[patch]: release 0.1.0rc1 (#19453) 2024-03-22 20:34:36 +00:00
ccurme
c4599444ee
mistralai: update tool calling (#19451)
```python
from langchain.agents import tool
from langchain_mistralai import ChatMistralAI


llm = ChatMistralAI(model="mistral-large-latest", temperature=0)

@tool
def get_word_length(word: str) -> int:
    """Returns the length of a word."""
    return len(word)


tools = [get_word_length]
llm_with_tools = llm.bind_tools(tools)

llm_with_tools.invoke("how long is the word chrysanthemum")
```
currently raises
```
AttributeError: 'dict' object has no attribute 'model_dump'
```

Same with `.with_structured_output`
```python
from langchain_mistralai import ChatMistralAI
from langchain_core.pydantic_v1 import BaseModel

class AnswerWithJustification(BaseModel):
    """An answer to the user question along with justification for the answer."""
    answer: str
    justification: str

llm = ChatMistralAI(model="mistral-large-latest", temperature=0)
structured_llm = llm.with_structured_output(AnswerWithJustification)

structured_llm.invoke("What weighs more a pound of bricks or a pound of feathers")
```

This change appears to fix both cases.
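
The root cause is that tool specs can arrive as plain dicts rather than pydantic models; a hedged sketch of the defensive conversion (not the exact patch, and `to_dict` is a hypothetical helper):

```python
from typing import Any


def to_dict(obj: Any) -> dict:
    """Hypothetical helper: pass plain dicts through and only call
    model_dump() on objects that actually provide it."""
    if isinstance(obj, dict):
        return obj
    return obj.model_dump()
```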
2024-03-22 16:03:48 -04:00
Erick Friis
53ac1ebbbc
mistralai[minor]: 0.1.0rc0, remove mistral sdk (#19420) 2024-03-22 01:24:58 +00:00
Bagatur
242af4b5a4
openai[patch], mistral[patch], fireworks[patch]: releases 0.0.8, 0.0.5, 0.0.2 (#18186) 2024-02-27 04:22:24 -08:00
Bagatur
b3f4de38ae
mistral[minor]: Function calling and with_structured_output (#18150)
![Screenshot 2024-02-26 at 2 07 06 PM](https://github.com/langchain-ai/langchain/assets/22008038/20cacb47-3b24-45b5-871b-dd169f1acd37)
2024-02-26 16:22:30 -08:00
Erick Friis
a99c667c22
partners: version constraints (#17492)
Core should be `^0.1` by default.

Be careful about `0.x.y` and `0.0.z` packages.
2024-02-14 08:57:46 -08:00
Erick Friis
3a2eb6e12b
infra: add print rule to ruff (#16221)
Added `noqa` for existing prints. These can be removed gradually; the rule
will prevent more from being introduced.
2024-02-09 16:13:30 -08:00
Charlie Marsh
24c0bab57b
infra, multiple: Upgrade configuration for Ruff v0.2.0 (#16905)
## Summary

This PR upgrades LangChain's Ruff configuration in preparation for
Ruff's v0.2.0 release. (The changes are compatible with Ruff v0.1.5,
which LangChain uses today.) Specifically, we're now warning when
linter-only options are specified under `[tool.ruff]` instead of
`[tool.ruff.lint]`.

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
Co-authored-by: Bagatur <baskaryan@gmail.com>
2024-02-09 14:28:02 -08:00
Erick Friis
3e58df43c2
mistralai[patch]: release 0.0.4 (#17139) 2024-02-06 16:05:20 -08:00
Erick Friis
f881a3330c
mistralai[patch]: 16k token batching logic embed (#17136) 2024-02-06 15:59:08 -08:00
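The batching idea can be sketched as grouping inputs so each embeddings request stays under a token budget (illustrative only; the tokenizer and exact limit handling in the package may differ):

```python
from typing import Iterable, List

MAX_TOKENS_PER_BATCH = 16_000  # assumed per-request budget


def batch_texts(texts: Iterable[str], tokens_per_text: List[int]) -> List[List[str]]:
    """Group texts into batches whose summed token counts stay under the
    budget (illustrative sketch of the batching logic)."""
    batches: List[List[str]] = []
    current: List[str] = []
    current_tokens = 0
    for text, n_tokens in zip(texts, tokens_per_text):
        if current and current_tokens + n_tokens > MAX_TOKENS_PER_BATCH:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(text)
        current_tokens += n_tokens
    if current:
        batches.append(current)
    return batches
```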
ccurme
0826d87ecd
langchain_mistralai[patch]: Invoke callback prior to yielding token (#16986)
- **Description:** Invoke callback prior to yielding token in stream and
astream methods for ChatMistralAI.
- **Issue:** https://github.com/langchain-ai/langchain/issues/16913
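
The fix boils down to an ordering change inside the streaming methods; a simplified sketch (the real methods work with chat generation chunks and a callback run manager, not bare strings):

```python
from typing import Callable, Iterable, Iterator, Optional


def stream_with_callback(
    tokens: Iterable[str],
    on_new_token: Optional[Callable[[str], None]] = None,
) -> Iterator[str]:
    """Sketch of the ordering fix: invoke the callback *before* yielding,
    so callbacks observe each token even if the consumer stops early."""
    for token in tokens:
        if on_new_token is not None:
            on_new_token(token)  # callback fires first...
        yield token  # ...then the token is handed to the caller
```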
2024-02-03 16:30:50 -08:00
Erick Friis
65b231d40b
mistralai[patch]: async integration tests (#16214) 2024-01-18 09:45:44 -08:00
Erick Friis
06fe2f4fb0
partners: add license field (#16117)
- Bumps post versions for packages without current unreleased updates.
- Package versions for packages that do have changes (mistral, vertex) will
be bumped in their associated release PRs.
2024-01-17 08:37:13 -08:00
Erick Friis
ce10fe0c2f
mistralai[patch]: release 0.0.3 (#16116)
embeddings
2024-01-17 08:36:05 -08:00
David
c323742f4f
mistralai[minor]: Add embeddings (#15282)
- **Description:** Adds MistralAIEmbeddings class for embeddings, using
the new official API.
- **Dependencies:** mistralai
- **Tag maintainer**: @efriis, @hwchase17
- **Twitter handle:** @LMS_David_RS

- Create `integrations/text_embedding/mistralai.ipynb`: an example notebook for the MistralAIEmbeddings class
- Modify `embeddings/__init__.py`: import the class
- Create `embeddings/mistralai.py`: the embedding class
- Create `integration_tests/embeddings/test_mistralai.py`: the test file
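
Typical usage of the new class looks roughly like this (sketch; assumes MISTRAL_API_KEY is set in the environment):

```python
from langchain_mistralai import MistralAIEmbeddings

# The model name below is the documented default and is shown only as an
# illustration; authentication is assumed via the MISTRAL_API_KEY env var.
embeddings = MistralAIEmbeddings(model="mistral-embed")

query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["LangChain integrates MistralAI."])
```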

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
2024-01-16 17:48:37 -08:00
Erick Friis
69533c8628
multiple[patch]: .post releases and pyproject metadata (#15962) 2024-01-12 10:09:02 -08:00
Erick Friis
ebb6ad4f7a
mistralai[patch]: release 0.0.2 (#15912) 2024-01-11 13:42:04 -08:00
Erick Friis
0c95f3a981
mistralai[patch]: warn on stop token, fix on_llm_new_token (#15787)
Fixes #15269

Addressed with a warning, since the MistralAI API doesn't support stop tokens yet.
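
The guard can be pictured roughly as follows (illustrative sketch, not the code from the PR):

```python
import warnings
from typing import List, Optional


def check_stop(stop: Optional[List[str]]) -> None:
    """Sketch of the guard for `stop`: warn and ignore it, since the
    MistralAI API did not support stop sequences at the time."""
    if stop:
        warnings.warn(
            "Parameter `stop` is not supported by the Mistral API and will be ignored."
        )
```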

---------

Co-authored-by: Niels Garve <info@nielsgarve.com>
2024-01-09 16:27:20 -08:00
Erick Friis
323941a90a
mistralai[patch]: persist async client (#15786) 2024-01-09 16:21:39 -08:00
thehunmonkgroup
dc20766513
docs: readme for langchain-mistralai (#14917)
- **Description:** Add README doc for MistralAI partner package.
  - **Tag maintainer:** @baskaryan
2023-12-20 00:22:43 -05:00
Bagatur
a5be9f9475
mistralai: Add langchain-mistralai partner package (#14783)
Co-authored-by: Chad Phillips <chad@apartmentlines.com>
2023-12-19 10:34:19 -05:00