# langchain-anthropic

This package contains the LangChain integration for Anthropic's generative models.

## Installation

```bash
pip install -U langchain-anthropic
```

## Chat Models

| API Model Name     | Model Family   |
| ------------------ | -------------- |
| claude-instant-1.2 | Claude Instant |
| claude-2.1         | Claude         |
| claude-2.0         | Claude         |

To use, you should have an Anthropic API key configured. Initialize the model as:

```python
from langchain_anthropic import ChatAnthropicMessages
from langchain_core.messages import AIMessage, HumanMessage

model = ChatAnthropicMessages(model="claude-2.1", temperature=0, max_tokens=1024)
```
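As a minimal sketch of the API key step, assuming the key is supplied through the `ANTHROPIC_API_KEY` environment variable (it can also be passed explicitly via the model's `anthropic_api_key` parameter):

```python
import os

# Assumption: the integration reads the key from ANTHROPIC_API_KEY if it is
# not passed explicitly. Set it before constructing the model.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder value

model = ChatAnthropicMessages(model="claude-2.1", temperature=0, max_tokens=1024)
```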

### Define the input message

```python
message = HumanMessage(content="What is the capital of France?")
```

### Generate a response using the model

```python
response = model.invoke([message])
```
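The call returns an `AIMessage`. As a rough sketch of working with the result, you can read the generated text from its `content` attribute, or stream chunks instead of waiting for the full reply:

```python
# response is an AIMessage; its content attribute holds the generated text.
print(response.content)

# Streaming variant (sketch): iterate over chunks as they arrive.
for chunk in model.stream([message]):
    print(chunk.content, end="", flush=True)
```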

For a more detailed walkthrough, see here.