langchain/libs/partners/mistralai
thehunmonkgroup dc20766513
docs: readme for langchain-mistralai (#14917)
- **Description:** Add README doc for MistralAI partner package.
  - **Tag maintainer:** @baskaryan
2023-12-20 00:22:43 -05:00

# langchain-mistralai

This package contains the LangChain integrations for MistralAI through their mistralai SDK.

## Installation

```sh
pip install -U langchain-mistralai
```

## Chat Models

This package contains the ChatMistralAI class, which is the recommended way to interface with MistralAI models.

To use it, install the package and set your Mistral API key in your environment:

```sh
export MISTRAL_API_KEY=your-api-key
```

Then initialize the chat model:

```python
from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI

chat = ChatMistralAI(model="mistral-small")
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
```

ChatMistralAI also supports async and streaming functionality:

```python
# For async: `await` must run inside an async function or event loop
await chat.ainvoke(messages)

# For streaming: `stream` yields message chunks as they arrive
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)