mirror of https://github.com/hwchase17/langchain
docs: `providers` updates 1 (#20256)
- Providers pages: added missed integrations; fixed format
- `mistralai` converted from notebook to `.mdx` format
parent 15cb1133e7
commit 4c48732f94
@@ -1,43 +1,44 @@
 # Anthropic
 
 All functionality related to Anthropic models.
 
-[Anthropic](https://www.anthropic.com/) is an AI safety and research company, and is the creator of Claude.
-
-This page covers all integrations between Anthropic models and LangChain.
+>[Anthropic](https://www.anthropic.com/) is an AI safety and research company, and is the creator of `Claude`.
+
+This page covers all integrations between `Anthropic` models and `LangChain`.
 
-## Installation
+## Installation and Setup
 
-To use Anthropic models, you will need to install the `langchain-anthropic` package.
-You can do this with the following command:
+To use `Anthropic` models, you need to install a python package:
 
-```
-pip install langchain-anthropic
+```bash
+pip install -U langchain-anthropic
 ```
 
-## Environment Setup
-
-To use Anthropic models, you will need to set the `ANTHROPIC_API_KEY` environment variable.
+You need to set the `ANTHROPIC_API_KEY` environment variable.
+
 You can get an Anthropic API key [here](https://console.anthropic.com/settings/keys)
 
-## `ChatAnthropic`
+## LLMs
 
-`ChatAnthropic` is a subclass of LangChain's `ChatModel`.
-You can import this wrapper with the following code:
+### [Legacy] AnthropicLLM
 
-```
-from langchain_anthropic import ChatAnthropic
-model = ChatAnthropic(model='claude-3-opus-20240229')
-```
+**NOTE**: `AnthropicLLM` only supports legacy `Claude 2` models.
+To use the newest `Claude 3` models, please use `ChatAnthropic` instead.
 
-Read more in the [ChatAnthropic documentation](/docs/integrations/chat/anthropic).
+See a [usage example](/docs/integrations/llms/anthropic).
 
-## [Legacy] `AnthropicLLM`
+```python
+from langchain_anthropic import AnthropicLLM
 
-`AnthropicLLM` is a subclass of LangChain's `LLM`. It is a wrapper around Anthropic's
-text-based completion endpoints.
+model = AnthropicLLM(model='claude-2.1')
+```
+
+## Chat Models
+
+### ChatAnthropic
+
+See a [usage example](/docs/integrations/chat/anthropic).
 
 ```python
-from langchain_anthropic import AnthropicLLM
-
-model = AnthropicLLM(model='claude-2.1')
+from langchain_anthropic import ChatAnthropic
+
+model = ChatAnthropic(model='claude-3-opus-20240229')
 ```
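The hunk above only shows the import and constructor lines. A minimal guarded sketch of actually calling the chat model follows; it assumes `langchain-anthropic` is installed and `ANTHROPIC_API_KEY` is set, and degrades to a message when either is missing rather than raising:

```python
import os

# Model name taken from the docs page above.
MODEL = "claude-3-opus-20240229"

try:
    from langchain_anthropic import ChatAnthropic
except ImportError:
    ChatAnthropic = None  # langchain-anthropic not installed

if ChatAnthropic is not None and os.environ.get("ANTHROPIC_API_KEY"):
    # Live call: requires a valid API key in the environment.
    chat = ChatAnthropic(model=MODEL)
    reply = chat.invoke("Say hello in one word.")
    print(reply.content)
else:
    print(f"skipping live call to {MODEL}: "
          "install langchain-anthropic and set ANTHROPIC_API_KEY")
```

`invoke` returns a message object whose `content` holds the model's text; the guard keeps the snippet safe to paste into an environment without credentials.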
@@ -1,78 +0,0 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# MistralAI\n",
-    "\n",
-    "Mistral AI is a platform that offers hosting for their powerful open source models.\n",
-    "\n",
-    "You can access them via their [API](https://docs.mistral.ai/api/).\n",
-    "\n",
-    "A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.\n",
-    "\n",
-    "You will also need the `langchain-mistralai` package:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%pip install -qU langchain-core langchain-mistralai"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 2,
-   "metadata": {
-    "id": "y8ku6X96sebl"
-   },
-   "outputs": [],
-   "source": [
-    "from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "See the docs for their\n",
-    "\n",
-    "- [Chat Model](/docs/integrations/chat/mistralai)\n",
-    "- [Embeddings Model](/docs/integrations/text_embedding/mistralai)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": []
-  }
- ],
- "metadata": {
-  "colab": {
-   "provenance": []
-  },
-  "kernelspec": {
-   "display_name": "Python 3 (ipykernel)",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.10.11"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 1
-}
@@ -0,0 +1,34 @@
+# MistralAI
+
+>[Mistral AI](https://docs.mistral.ai/api/) is a platform that offers hosting for their powerful open source models.
+
+
+## Installation and Setup
+
+A valid [API key](https://console.mistral.ai/users/api-keys/) is needed to communicate with the API.
+
+You will also need the `langchain-mistralai` package:
+
+```bash
+pip install langchain-mistralai
+```
+
+## Chat models
+
+### ChatMistralAI
+
+See a [usage example](/docs/integrations/chat/mistralai).
+
+```python
+from langchain_mistralai.chat_models import ChatMistralAI
+```
+
+## Embedding models
+
+### MistralAIEmbeddings
+
+See a [usage example](/docs/integrations/text_embedding/mistralai).
+
+```python
+from langchain_mistralai import MistralAIEmbeddings
+```
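The new `.mdx` page shows only the import lines. A guarded sketch of exercising both MistralAI integrations follows; it assumes `langchain-mistralai` is installed and a `MISTRAL_API_KEY` environment variable is set (the key variable name and the `mistral-large-latest` model name are assumptions, not stated on the page above):

```python
import os

try:
    from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings
    have_mistral = True
except ImportError:
    have_mistral = False  # langchain-mistralai not installed

if have_mistral and os.environ.get("MISTRAL_API_KEY"):
    # Live calls: both require a valid API key.
    chat = ChatMistralAI(model="mistral-large-latest")  # model name is an assumption
    print(chat.invoke("Say hello in one word.").content)

    embeddings = MistralAIEmbeddings()
    vector = embeddings.embed_query("hello")
    print(len(vector))  # dimensionality of the embedding vector
else:
    print("skipping live calls: install langchain-mistralai and set MISTRAL_API_KEY")
```

`embed_query` returns a plain list of floats, so the embedding can be fed directly into any vector store integration.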