langchain/libs/partners/anthropic
Jorge Villegas f6a98032e4
docs: langchain-anthropic README updates (#17684)
# PR Message

- **Description:** This PR adds a README file for the Anthropic API in
the `libs/partners` folder of this repository. The README includes:
  - A brief description of the Anthropic package
  - Installation & API instructions
  - Usage examples
  
- **Issue:**
[17545](https://github.com/langchain-ai/langchain/issues/17545)
  
- **Dependencies:** None

Additional notes:
This change only affects the docs package and does not introduce any new
dependencies.

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
2024-02-22 16:22:30 -08:00

# langchain-anthropic

This package contains the LangChain integration for Anthropic's generative models.

## Installation

```bash
pip install -U langchain-anthropic
```
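The package expects an Anthropic API key to be available. The underlying Anthropic client typically reads it from the `ANTHROPIC_API_KEY` environment variable (an assumption based on the Anthropic SDK's defaults; the value below is a placeholder):

```bash
export ANTHROPIC_API_KEY="your-api-key"
```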

## Chat Models

| API Model Name     | Model Family   |
| ------------------ | -------------- |
| claude-instant-1.2 | Claude Instant |
| claude-2.1         | Claude         |
| claude-2.0         | Claude         |

To use, you should have an Anthropic API key configured. Initialize the model as:

```python
from langchain_anthropic import ChatAnthropicMessages
from langchain_core.messages import AIMessage, HumanMessage

model = ChatAnthropicMessages(model="claude-2.1", temperature=0, max_tokens=1024)

# Define the input message
message = HumanMessage(content="What is the capital of France?")

# Generate a response using the model
response = model.invoke([message])
```

For a more detailed walkthrough, see the LangChain documentation.