# langchain-anthropic
This package contains the LangChain integration for Anthropic's generative models.
## Installation
`pip install -U langchain-anthropic`
## Chat Models
Anthropic recommends using their chat models over text completions.
You can see their recommended models [here](https://docs.anthropic.com/claude/docs/models-overview#model-recommendations).
To use, you should have an Anthropic API key configured. Initialize the model as:
```python
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import AIMessage, HumanMessage
model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)
```
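
By default the client reads the key from the `ANTHROPIC_API_KEY` environment variable, so one way to configure it is to export it in your shell before running the script:

```shell
export ANTHROPIC_API_KEY="sk-ant-..."  # replace with your own key
```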
### Define the input message
`message = HumanMessage(content="What is the capital of France?")`
### Generate a response using the model
`response = model.invoke([message])`
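
Putting the steps above together, a minimal end-to-end script looks roughly like this (it assumes `ANTHROPIC_API_KEY` is set and makes a real API call; `invoke` returns an `AIMessage`, so the generated text lives on its `content` attribute):

```python
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage

model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)

message = HumanMessage(content="What is the capital of France?")
response = model.invoke([message])  # returns an AIMessage

print(response.content)  # the generated text
```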
For a more detailed walkthrough see [here](https://python.langchain.com/docs/integrations/chat/anthropic).
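
Like other LangChain chat models, `ChatAnthropic` also supports the standard streaming interface. A minimal sketch, assuming a valid API key is configured:

```python
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage

model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)

# stream() yields message chunks as tokens arrive instead of waiting
# for the full completion
for chunk in model.stream([HumanMessage(content="What is the capital of France?")]):
    print(chunk.content, end="", flush=True)
```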
## LLMs (Legacy)
You can use the Claude 2 models for text completions.
```python
from langchain_anthropic import AnthropicLLM
model = AnthropicLLM(model="claude-2.1", temperature=0, max_tokens=1024)
response = model.invoke("The best restaurant in San Francisco is: ")
```
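
Since `AnthropicLLM` is a standard LangChain Runnable, it can also be composed with a prompt template. A sketch of that pattern (again assuming a configured API key):

```python
from langchain_anthropic import AnthropicLLM
from langchain_core.prompts import PromptTemplate

model = AnthropicLLM(model="claude-2.1", temperature=0, max_tokens=1024)

# Piping a prompt into the model builds a chain that fills in the
# template and then runs the completion
prompt = PromptTemplate.from_template("The best restaurant in {city} is: ")
chain = prompt | model

print(chain.invoke({"city": "San Francisco"}))  # completion text as a plain string
```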