# langchain-mistralai

This package contains the LangChain integrations for MistralAI through their `mistralai` SDK.

## Installation

```bash
pip install -U langchain-mistralai
```

## Chat Models

This package contains the `ChatMistralAI` class, which is the recommended way to interface with MistralAI models.

To use, install the package and set your Mistral API key as an environment variable:

```bash
export MISTRAL_API_KEY=your-api-key
```

Then initialize the model and invoke it:

```python
from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI

chat = ChatMistralAI(model="mistral-small")
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
```
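`invoke` returns an `AIMessage`. On recent versions of `langchain-core` (PR #21944), the message also carries token usage in an optional `usage_metadata` attribute with `input_tokens`, `output_tokens`, and `total_tokens` counts. A minimal sketch:

```python
response = chat.invoke(messages)
print(response.content)

# usage_metadata is an Optional[UsageMetadata] TypedDict; it may be None on
# older versions or when the provider does not report usage.
if response.usage_metadata is not None:
    print(response.usage_metadata["input_tokens"])   # prompt tokens
    print(response.usage_metadata["output_tokens"])  # completion tokens
    print(response.usage_metadata["total_tokens"])   # total
```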

`ChatMistralAI` also supports async and streaming functionality:

```python
# For async...
await chat.ainvoke(messages)

# For streaming...
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
```
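The two can also be combined; a minimal sketch using the standard `astream` method that all LangChain chat models expose (run it inside an async context, as with `ainvoke` above):

```python
# Async streaming: astream yields message chunks as they arrive.
async for chunk in chat.astream(messages):
    print(chunk.content, end="", flush=True)
```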

## Embeddings

With `MistralAIEmbeddings`, you can directly use the default model `'mistral-embed'`, or set a different one if available:

```python
from langchain_mistralai import MistralAIEmbeddings

embedding = MistralAIEmbeddings()

# Choose model
embedding.model = "mistral-embed"

# Simple query
res_query = embedding.embed_query("The test information")

# If you want to use multiple documents
res_document = embedding.embed_documents(["test1", "another test"])
```
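`embed_query` returns a single vector and `embed_documents` returns one vector per input, so the results can be compared directly. A minimal sketch ranking the documents against the query with an illustrative cosine-similarity helper (`numpy` is assumed here and is not a dependency of this package):

```python
import numpy as np

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Illustrative helper, not part of langchain-mistralai."""
    a_arr, b_arr = np.asarray(a), np.asarray(b)
    return float(a_arr @ b_arr / (np.linalg.norm(a_arr) * np.linalg.norm(b_arr)))

# Rank the embedded documents by similarity to the query embedding.
scores = [cosine_similarity(res_query, doc) for doc in res_document]
print(scores)
```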