# langchain-anthropic

This package contains the LangChain integration for Anthropic's generative models.

## Installation

```bash
pip install -U langchain-anthropic
```
## Chat Models

Anthropic recommends using their chat models over text completions.

You can see their recommended models here.

To use the chat models, you need an Anthropic API key configured (the integration reads it from the `ANTHROPIC_API_KEY` environment variable). Initialize the model as:

```python
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import AIMessage, HumanMessage

model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)

# Define the input message
message = HumanMessage(content="What is the capital of France?")

# Generate a response using the model
response = model.invoke([message])
```

For a more detailed walkthrough see here.

## LLMs (Legacy)

You can use the Claude 2 models for text completions.

```python
from langchain_anthropic import AnthropicLLM

model = AnthropicLLM(model="claude-2.1", temperature=0, max_tokens=1024)
response = model.invoke("The best restaurant in San Francisco is: ")
```