Mirror of https://github.com/hwchase17/langchain, synced 2024-11-13 19:10:52 +00:00
community: Add Writer integration (#27646)
**Description:** Add support for Writer chat models
**Issue:** N/A
**Dependencies:** Add `writer-sdk` to optional dependencies.
**Twitter handle:** Please tag `@samjulien` and `@Get_Writer`

**Tests and docs**
- [x] Unit test
- [x] Example notebook in `docs/docs/integrations` directory.

**Lint and test**
- [x] Run `make format`
- [x] Run `make lint`
- [x] Run `make test`

---------

Co-authored-by: Johannes <tolstoy.work@gmail.com>
Co-authored-by: Erick Friis <erick@langchain.dev>
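For orientation, a minimal usage sketch of the new integration (it mirrors the example notebook below and assumes a valid `WRITER_API_KEY` in the environment):

```python
# Minimal sketch of the new Writer chat model integration; assumes WRITER_API_KEY is set.
from langchain_community.chat_models.writer import ChatWriter

llm = ChatWriter(model="palmyra-x-004")  # palmyra-x-004 is the default model
print(llm.invoke("Say hello from LangChain.").content)
```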
This commit is contained in:
parent
595dc592c9
commit
0a472e2a2d
343
docs/docs/integrations/chat/writer.ipynb
Normal file
@@ -0,0 +1,343 @@
{
 "cells": [
  {
   "metadata": {},
   "cell_type": "raw",
   "source": [
    "---\n",
    "sidebar_label: Writer\n",
    "---"
   ],
   "id": "85e07aae70a15572"
  },
  {
   "cell_type": "markdown",
   "id": "cb4dd00a-8893-4a45-96f7-9a9fc341cd61",
   "metadata": {},
   "source": [
    "# ChatWriter\n",
    "\n",
    "This notebook provides a quick overview for getting started with Writer [chat models](/docs/concepts/#chat-models).\n",
    "\n",
    "Writer has several chat models. You can find information about their latest models and their costs, context windows, and supported input types in the [Writer docs](https://dev.writer.com/home/models)."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e49f1e0d",
   "metadata": {},
   "source": [
    "## Overview\n",
    "\n",
    "### Integration details\n",
    "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/openai) | Package downloads | Package latest |\n",
    "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
    "| ChatWriter | langchain-community | ❌ | ❌ | ❌ | ❌ | ❌ |\n",
    "\n",
    "### Model features\n",
    "| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | Image input | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
    "| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
    "| ✅ | ❌ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |\n",
    "\n",
    "## Setup\n",
    "\n",
    "To access Writer models you'll need to create a Writer account, get an API key, and install the `writer-sdk` and `langchain-community` packages.\n",
    "\n",
    "### Credentials\n",
    "\n",
    "Head to [Writer AI Studio](https://app.writer.com/aistudio/signup?utm_campaign=devrel) to sign up for a Writer account and generate an API key. Once you've done this, set the WRITER_API_KEY environment variable:"
   ]
  },
  {
   "cell_type": "code",
   "id": "e817fe2e-4f1d-4533-b19e-2400b1cf6ce8",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-10-24T13:51:54.323678Z",
     "start_time": "2024-10-24T13:51:42.127404Z"
    }
   },
   "source": [
    "import getpass\n",
    "import os\n",
    "\n",
    "if not os.environ.get(\"WRITER_API_KEY\"):\n",
    "    os.environ[\"WRITER_API_KEY\"] = getpass.getpass(\"Enter your Writer API key: \")"
   ],
   "outputs": [],
   "execution_count": 1
  },
  {
   "cell_type": "markdown",
   "id": "c59722a9-6dbb-45f7-ae59-5be50ca5733d",
   "metadata": {},
   "source": [
    "### Installation\n",
    "\n",
    "The LangChain Writer integration lives in the `langchain-community` package:"
   ]
  },
  {
   "cell_type": "code",
   "id": "2113471c-75d7-45df-b784-d78da4ef7aba",
   "metadata": {
    "ExecuteTime": {
     "end_time": "2024-10-24T13:52:49.262240Z",
     "start_time": "2024-10-24T13:52:47.564879Z"
    }
   },
   "source": [
    "%pip install -qU langchain-community writer-sdk"
   ],
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Note: you may need to restart the kernel to use updated packages.\n"
     ]
    }
   ],
   "execution_count": 4
  },
  {
   "cell_type": "markdown",
   "id": "1098bc9d-ce83-462b-8c19-f85bf3a159dc",
   "metadata": {},
   "source": [
    "## Instantiation\n",
    "\n",
    "Now we can instantiate our model object and generate chat completions:"
   ]
  },
  {
   "cell_type": "code",
   "id": "522686de",
   "metadata": {
    "tags": [],
    "ExecuteTime": {
     "end_time": "2024-10-24T13:52:38.822950Z",
     "start_time": "2024-10-24T13:52:38.674441Z"
    }
   },
   "source": [
    "from langchain_community.chat_models.writer import ChatWriter\n",
    "\n",
    "llm = ChatWriter(\n",
    "    model=\"palmyra-x-004\",\n",
    "    temperature=0.7,\n",
    "    max_tokens=1000,\n",
    "    # api_key=\"...\",  # if you prefer to pass the API key in directly instead of using env vars\n",
    "    # base_url=\"...\",\n",
    "    # other params...\n",
    ")"
   ],
   "outputs": [],
   "execution_count": null
  },
  {
   "cell_type": "markdown",
   "id": "6511982a-734a-4193-a47d-254f8dcaff5e",
   "metadata": {},
   "source": [
    "## Invocation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "ce16ad78-8e6f-48cd-954e-98be75eb5836",
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "messages = [\n",
    "    (\n",
    "        \"system\",\n",
    "        \"You are a helpful assistant that writes poems about the Python programming language.\",\n",
    "    ),\n",
    "    (\"human\", \"Write a poem about Python.\"),\n",
    "]\n",
    "ai_msg = llm.invoke(messages)\n",
    "ai_msg"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2cd224b8-4499-41fb-a604-d53a7ff17b2e",
   "metadata": {},
   "outputs": [],
   "source": [
    "print(ai_msg.content)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "778f912a-66ea-4a5d-b3de-6c7db4baba26",
   "metadata": {},
   "source": [
    "## Chaining\n",
    "\n",
    "We can [chain](/docs/how_to/sequence/) our model with a prompt template like so:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fbb043e6",
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "from langchain_core.prompts import ChatPromptTemplate\n",
    "\n",
    "prompt = ChatPromptTemplate.from_messages(\n",
    "    [\n",
    "        (\n",
    "            \"system\",\n",
    "            \"You are a helpful assistant that writes poems about the {input_language} programming language.\",\n",
    "        ),\n",
    "        (\"human\", \"{input}\"),\n",
    "    ]\n",
    ")\n",
    "\n",
    "chain = prompt | llm\n",
    "chain.invoke(\n",
    "    {\n",
    "        \"input_language\": \"Java\",\n",
    "        \"input\": \"Write a poem about Java.\",\n",
    "    }\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0b1b52a5-b58d-40c9-bcdd-88eb8fb351e2",
   "metadata": {},
   "source": [
    "## Tool calling\n",
    "\n",
    "Writer supports [tool calling](https://dev.writer.com/api-guides/tool-calling), which lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool.\n",
    "\n",
    "### ChatWriter.bind_tools()\n",
    "\n",
    "With `ChatWriter.bind_tools`, we can easily pass in Pydantic classes, dict schemas, LangChain tools, or even functions as tools to the model. Under the hood these are converted to tool schemas, which look like:\n",
    "```\n",
    "{\n",
    "    \"name\": \"...\",\n",
    "    \"description\": \"...\",\n",
    "    \"parameters\": {...}  # JSONSchema\n",
    "}\n",
    "```\n",
    "and are passed in with every model invocation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "b7ea7690-ec7a-4337-b392-e87d1f39a6ec",
   "metadata": {},
   "outputs": [],
   "source": [
    "from pydantic import BaseModel, Field\n",
    "\n",
    "\n",
    "class GetWeather(BaseModel):\n",
    "    \"\"\"Get the current weather in a given location\"\"\"\n",
    "\n",
    "    location: str = Field(..., description=\"The city and state, e.g. San Francisco, CA\")\n",
    "\n",
    "\n",
    "llm_with_tools = llm.bind_tools([GetWeather])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1d1ab955-6a68-42f8-bb5d-86eb1111478a",
   "metadata": {},
   "outputs": [],
   "source": [
    "ai_msg = llm_with_tools.invoke(\n",
    "    \"what is the weather like in New York City\",\n",
    ")\n",
    "ai_msg"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "768d1ae4-4b1a-48eb-a329-c8d5051067a3",
   "metadata": {},
   "source": [
    "### AIMessage.tool_calls\n",
    "Notice that the AIMessage has a `tool_calls` attribute. This contains the tool calls in a standardized ToolCall format that is model-provider agnostic."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "166cb7ce-831d-4a7c-9721-abc107f11084",
   "metadata": {},
   "outputs": [],
   "source": [
    "ai_msg.tool_calls"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e082c9ac-c7c7-4aff-a8ec-8e220262a59c",
   "metadata": {},
   "source": [
    "For more on binding tools and tool call outputs, head to the [tool calling](/docs/how_to/function_calling) docs."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a796d728-971b-408b-88d5-440015bbb941",
   "metadata": {},
   "source": [
    "## API reference\n",
    "\n",
    "For detailed documentation of all Writer features, head to our [API reference](https://dev.writer.com/api-guides/api-reference/completion-api/chat-completion)."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": ".venv",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
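The model features table in this notebook advertises token-level streaming and native async support, but the cells above only demonstrate synchronous `invoke`. A short sketch of both, assuming the same `ChatWriter` setup as the Instantiation cell and a valid `WRITER_API_KEY`:

```python
# Streaming and async usage with ChatWriter; a sketch, not part of the shipped notebook.
import asyncio

from langchain_community.chat_models.writer import ChatWriter

llm = ChatWriter(model="palmyra-x-004", temperature=0.7, max_tokens=1000)

# Token-level streaming: chunks are yielded as the model generates them.
for chunk in llm.stream("Write a haiku about Python."):
    print(chunk.content, end="", flush=True)
print()


# Native async: the same request awaited on the async client.
async def main() -> None:
    ai_msg = await llm.ainvoke("Write a haiku about Python.")
    print(ai_msg.content)


asyncio.run(main())
```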
@@ -96,3 +96,4 @@ nanopq==0.2.1
 mlflow[genai]>=2.14.0
 databricks-sdk>=0.30.0
 websocket>=0.2.1,<1
+writer-sdk>=1.2.0
317
libs/community/langchain_community/chat_models/writer.py
Normal file
@@ -0,0 +1,317 @@
"""Writer chat wrapper."""

from __future__ import annotations

import logging
from typing import (
    Any,
    AsyncIterator,
    Callable,
    Dict,
    Iterator,
    List,
    Literal,
    Mapping,
    Optional,
    Sequence,
    Tuple,
    Type,
    Union,
)

from langchain_core.callbacks import (
    AsyncCallbackManagerForLLMRun,
    CallbackManagerForLLMRun,
)
from langchain_core.language_models import LanguageModelInput
from langchain_core.language_models.chat_models import (
    BaseChatModel,
    agenerate_from_stream,
    generate_from_stream,
)
from langchain_core.messages import (
    AIMessage,
    AIMessageChunk,
    BaseMessage,
    ChatMessage,
    HumanMessage,
    SystemMessage,
    ToolMessage,
)
from langchain_core.outputs import ChatGeneration, ChatGenerationChunk, ChatResult
from langchain_core.runnables import Runnable
from langchain_core.utils.function_calling import convert_to_openai_tool
from pydantic import BaseModel, ConfigDict, Field, SecretStr

logger = logging.getLogger(__name__)


def _convert_message_to_dict(message: BaseMessage) -> dict:
    """Convert a LangChain message to a Writer message dict."""
    message_dict = {"role": "", "content": message.content}

    if isinstance(message, ChatMessage):
        message_dict["role"] = message.role
    elif isinstance(message, HumanMessage):
        message_dict["role"] = "user"
    elif isinstance(message, AIMessage):
        message_dict["role"] = "assistant"
        if message.tool_calls:
            message_dict["tool_calls"] = [
                {
                    "id": tool["id"],
                    "type": "function",
                    "function": {"name": tool["name"], "arguments": tool["args"]},
                }
                for tool in message.tool_calls
            ]
    elif isinstance(message, SystemMessage):
        message_dict["role"] = "system"
    elif isinstance(message, ToolMessage):
        message_dict["role"] = "tool"
        message_dict["tool_call_id"] = message.tool_call_id
    else:
        raise ValueError(f"Got unknown message type: {type(message)}")

    if message.name:
        message_dict["name"] = message.name

    return message_dict


def _convert_dict_to_message(response_dict: Dict[str, Any]) -> BaseMessage:
    """Convert a Writer message dict to a LangChain message."""
    role = response_dict["role"]
    content = response_dict.get("content", "")

    if role == "user":
        return HumanMessage(content=content)
    elif role == "assistant":
        additional_kwargs = {}
        if tool_calls := response_dict.get("tool_calls"):
            additional_kwargs["tool_calls"] = tool_calls
        return AIMessageChunk(content=content, additional_kwargs=additional_kwargs)
    elif role == "system":
        return SystemMessage(content=content)
    elif role == "tool":
        return ToolMessage(
            content=content,
            tool_call_id=response_dict["tool_call_id"],
            name=response_dict.get("name"),
        )
    else:
        return ChatMessage(content=content, role=role)


class ChatWriter(BaseChatModel):
    """Writer chat model.

    To use, you should have the ``writer-sdk`` Python package installed, and the
    environment variable ``WRITER_API_KEY`` set with your API key.

    Example:
        .. code-block:: python

            from langchain_community.chat_models import ChatWriter

            chat = ChatWriter(model="palmyra-x-004")
    """

    client: Any = Field(default=None, exclude=True)  #: :meta private:
    async_client: Any = Field(default=None, exclude=True)  #: :meta private:
    model_name: str = Field(default="palmyra-x-004", alias="model")
    """Model name to use."""
    temperature: float = 0.7
    """What sampling temperature to use."""
    model_kwargs: Dict[str, Any] = Field(default_factory=dict)
    """Holds any model parameters valid for `create` call not explicitly specified."""
    writer_api_key: Optional[SecretStr] = Field(default=None, alias="api_key")
    """Writer API key."""
    writer_api_base: Optional[str] = Field(default=None, alias="base_url")
    """Base URL for API requests."""
    streaming: bool = False
    """Whether to stream the results or not."""
    n: int = 1
    """Number of chat completions to generate for each prompt."""
    max_tokens: Optional[int] = None
    """Maximum number of tokens to generate."""

    model_config = ConfigDict(populate_by_name=True)

    @property
    def _llm_type(self) -> str:
        """Return type of chat model."""
        return "writer-chat"

    @property
    def _identifying_params(self) -> Dict[str, Any]:
        """Get the identifying parameters."""
        return {
            "model_name": self.model_name,
            "temperature": self.temperature,
            "streaming": self.streaming,
            **self.model_kwargs,
        }

    def _create_chat_result(self, response: Mapping[str, Any]) -> ChatResult:
        generations = []
        for choice in response["choices"]:
            message = _convert_dict_to_message(choice["message"])
            gen = ChatGeneration(
                message=message,
                generation_info=dict(finish_reason=choice.get("finish_reason")),
            )
            generations.append(gen)

        token_usage = response.get("usage", {})
        llm_output = {
            "token_usage": token_usage,
            "model_name": self.model_name,
            "system_fingerprint": response.get("system_fingerprint", ""),
        }

        return ChatResult(generations=generations, llm_output=llm_output)

    def _convert_messages_to_dicts(
        self, messages: List[BaseMessage], stop: Optional[List[str]] = None
    ) -> Tuple[List[Dict[str, Any]], Dict[str, Any]]:
        params = {
            "model": self.model_name,
            "temperature": self.temperature,
            "n": self.n,
            "stream": self.streaming,
            **self.model_kwargs,
        }
        if stop:
            params["stop"] = stop
        if self.max_tokens is not None:
            params["max_tokens"] = self.max_tokens

        message_dicts = [_convert_message_to_dict(m) for m in messages]
        return message_dicts, params

    def _stream(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> Iterator[ChatGenerationChunk]:
        message_dicts, params = self._convert_messages_to_dicts(messages, stop)
        params = {**params, **kwargs, "stream": True}

        response = self.client.chat.chat(messages=message_dicts, **params)

        for chunk in response:
            delta = chunk["choices"][0].get("delta")
            if not delta or not delta.get("content"):
                continue
            chunk = _convert_dict_to_message(
                {"role": "assistant", "content": delta["content"]}
            )
            chunk = ChatGenerationChunk(message=chunk)

            if run_manager:
                run_manager.on_llm_new_token(chunk.text)

            yield chunk

    async def _astream(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[AsyncCallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> AsyncIterator[ChatGenerationChunk]:
        message_dicts, params = self._convert_messages_to_dicts(messages, stop)
        params = {**params, **kwargs, "stream": True}

        response = await self.async_client.chat.chat(messages=message_dicts, **params)

        async for chunk in response:
            delta = chunk["choices"][0].get("delta")
            if not delta or not delta.get("content"):
                continue
            chunk = _convert_dict_to_message(
                {"role": "assistant", "content": delta["content"]}
            )
            chunk = ChatGenerationChunk(message=chunk)

            if run_manager:
                await run_manager.on_llm_new_token(chunk.text)

            yield chunk

    def _generate(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        if self.streaming:
            return generate_from_stream(
                self._stream(messages, stop, run_manager, **kwargs)
            )

        message_dicts, params = self._convert_messages_to_dicts(messages, stop)
        params = {**params, **kwargs}
        response = self.client.chat.chat(messages=message_dicts, **params)
        return self._create_chat_result(response)

    async def _agenerate(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[AsyncCallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        if self.streaming:
            return await agenerate_from_stream(
                self._astream(messages, stop, run_manager, **kwargs)
            )

        message_dicts, params = self._convert_messages_to_dicts(messages, stop)
        params = {**params, **kwargs}
        response = await self.async_client.chat.chat(messages=message_dicts, **params)
        return self._create_chat_result(response)

    @property
    def _default_params(self) -> Dict[str, Any]:
        """Get the default parameters for calling Writer API."""
        return {
            "model": self.model_name,
            "temperature": self.temperature,
            "stream": self.streaming,
            "n": self.n,
            "max_tokens": self.max_tokens,
            **self.model_kwargs,
        }

    def bind_tools(
        self,
        tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]],
        *,
        tool_choice: Optional[Union[str, Literal["auto", "none"]]] = None,
        **kwargs: Any,
    ) -> Runnable[LanguageModelInput, BaseMessage]:
        """Bind tools to the chat model.

        Args:
            tools: Tools to bind to the model
            tool_choice: Which tool to require ('auto', 'none', or specific tool name)
            **kwargs: Additional parameters to pass to the chat model

        Returns:
            A runnable that will use the tools
        """
        formatted_tools = [convert_to_openai_tool(tool) for tool in tools]

        if tool_choice:
            kwargs["tool_choice"] = (
                tool_choice
                if tool_choice in ("auto", "none")
                else {"type": "function", "function": {"name": tool_choice}}
            )

        return super().bind(tools=formatted_tools, **kwargs)
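To make the request shape concrete, here is a small sanity check of the message mapping implemented above. It only exercises the conversion helpers shipped in this module (no API call is made), and the key value is a placeholder:

```python
# Sketch: inspect how LangChain messages are mapped to Writer chat-API dicts.
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

from langchain_community.chat_models.writer import ChatWriter

llm = ChatWriter(api_key="placeholder-key")  # placeholder key; nothing is sent anywhere

message_dicts, params = llm._convert_messages_to_dicts(
    [
        SystemMessage(content="You are terse."),
        HumanMessage(content="Hi"),
        AIMessage(content="Hello!"),
    ],
    stop=["\n\n"],
)
print(message_dicts)
# [{'role': 'system', ...}, {'role': 'user', ...}, {'role': 'assistant', ...}]
print(params)
# {'model': 'palmyra-x-004', 'temperature': 0.7, 'n': 1, 'stream': False, 'stop': ['\n\n']}
```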
303
libs/community/tests/unit_tests/chat_models/test_writer.py
Normal file
@@ -0,0 +1,303 @@
"""Unit tests for Writer chat model integration."""

import json
from typing import Any, Dict, List
from unittest.mock import AsyncMock, MagicMock, patch

import pytest
from langchain_core.callbacks.manager import CallbackManager
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage, ToolMessage
from pydantic import SecretStr

from langchain_community.chat_models.writer import ChatWriter, _convert_dict_to_message
from tests.unit_tests.callbacks.fake_callback_handler import FakeCallbackHandler


class TestChatWriter:
    def test_writer_model_param(self) -> None:
        """Test different ways to initialize the chat model."""
        test_cases: List[dict] = [
            {"model_name": "palmyra-x-004", "writer_api_key": "test-key"},
            {"model": "palmyra-x-004", "writer_api_key": "test-key"},
            {"model_name": "palmyra-x-004", "writer_api_key": "test-key"},
            {
                "model": "palmyra-x-004",
                "writer_api_key": "test-key",
                "temperature": 0.5,
            },
        ]

        for case in test_cases:
            chat = ChatWriter(**case)
            assert chat.model_name == "palmyra-x-004"
            assert chat.writer_api_key
            assert chat.writer_api_key.get_secret_value() == "test-key"
            assert chat.temperature == (0.5 if "temperature" in case else 0.7)

    def test_convert_dict_to_message_human(self) -> None:
        """Test converting a human message dict to a LangChain message."""
        message = {"role": "user", "content": "Hello"}
        result = _convert_dict_to_message(message)
        assert isinstance(result, HumanMessage)
        assert result.content == "Hello"

    def test_convert_dict_to_message_ai(self) -> None:
        """Test converting an AI message dict to a LangChain message."""
        message = {"role": "assistant", "content": "Hello"}
        result = _convert_dict_to_message(message)
        assert isinstance(result, AIMessage)
        assert result.content == "Hello"

    def test_convert_dict_to_message_system(self) -> None:
        """Test converting a system message dict to a LangChain message."""
        message = {"role": "system", "content": "You are a helpful assistant"}
        result = _convert_dict_to_message(message)
        assert isinstance(result, SystemMessage)
        assert result.content == "You are a helpful assistant"

    def test_convert_dict_to_message_tool_call(self) -> None:
        """Test converting a tool call message dict to a LangChain message."""
        content = json.dumps({"result": 42})
        message = {
            "role": "tool",
            "name": "get_number",
            "content": content,
            "tool_call_id": "call_abc123",
        }
        result = _convert_dict_to_message(message)
        assert isinstance(result, ToolMessage)
        assert result.name == "get_number"
        assert result.content == content

    def test_convert_dict_to_message_with_tool_calls(self) -> None:
        """Test converting an AIMessage with tool calls."""
        message = {
            "role": "assistant",
            "content": "",
            "tool_calls": [
                {
                    "id": "call_abc123",
                    "type": "function",
                    "function": {
                        "name": "get_weather",
                        "arguments": '{"location": "London"}',
                    },
                }
            ],
        }
        result = _convert_dict_to_message(message)
        assert isinstance(result, AIMessage)
        assert result.tool_calls
        assert len(result.tool_calls) == 1
        assert result.tool_calls[0]["name"] == "get_weather"
        assert result.tool_calls[0]["args"]["location"] == "London"

    @pytest.fixture(autouse=True)
    def mock_completion(self) -> Dict[str, Any]:
        """Fixture providing a mock API response."""
        return {
            "id": "chat-12345",
            "object": "chat.completion",
            "created": 1699000000,
            "model": "palmyra-x-004",
            "choices": [
                {
                    "index": 0,
                    "message": {
                        "role": "assistant",
                        "content": "Hello! How can I help you?",
                    },
                    "finish_reason": "stop",
                }
            ],
            "usage": {"prompt_tokens": 10, "completion_tokens": 8, "total_tokens": 18},
        }

    @pytest.fixture(autouse=True)
    def mock_response(self) -> Dict[str, Any]:
        response = {
            "id": "chat-12345",
            "choices": [
                {
                    "message": {
                        "role": "assistant",
                        "content": "",
                        "tool_calls": [
                            {
                                "id": "call_abc123",
                                "type": "function",
                                "function": {
                                    "name": "GetWeather",
                                    "arguments": '{"location": "London"}',
                                },
                            }
                        ],
                    },
                    "finish_reason": "tool_calls",
                }
            ],
        }
        return response

    @pytest.fixture(autouse=True)
    def mock_streaming_chunks(self) -> List[Dict[str, Any]]:
        """Fixture providing mock streaming response chunks."""
        return [
            {
                "id": "chat-12345",
                "object": "chat.completion.chunk",
                "created": 1699000000,
                "model": "palmyra-x-004",
                "choices": [
                    {
                        "index": 0,
                        "delta": {
                            "role": "assistant",
                            "content": "Hello",
                        },
                        "finish_reason": None,
                    }
                ],
            },
            {
                "id": "chat-12345",
                "object": "chat.completion.chunk",
                "created": 1699000000,
                "model": "palmyra-x-004",
                "choices": [
                    {
                        "index": 0,
                        "delta": {
                            "content": "!",
                        },
                        "finish_reason": "stop",
                    }
                ],
            },
        ]

    def test_sync_completion(self, mock_completion: Dict[str, Any]) -> None:
        """Test basic chat completion with mocked response."""
        chat = ChatWriter(api_key=SecretStr("test-key"))
        mock_client = MagicMock()
        mock_client.chat.chat.return_value = mock_completion

        with patch.object(chat, "client", mock_client):
            message = HumanMessage(content="Hi there!")
            response = chat.invoke([message])
            assert isinstance(response, AIMessage)
            assert response.content == "Hello! How can I help you?"

    async def test_async_completion(self, mock_completion: Dict[str, Any]) -> None:
        """Test async chat completion with mocked response."""
        chat = ChatWriter(api_key=SecretStr("test-key"))
        mock_client = AsyncMock()
        mock_client.chat.chat.return_value = mock_completion

        with patch.object(chat, "async_client", mock_client):
            message = HumanMessage(content="Hi there!")
            response = await chat.ainvoke([message])
            assert isinstance(response, AIMessage)
            assert response.content == "Hello! How can I help you?"

    def test_sync_streaming(self, mock_streaming_chunks: List[Dict[str, Any]]) -> None:
        """Test sync streaming with callback handler."""
        callback_handler = FakeCallbackHandler()
        callback_manager = CallbackManager([callback_handler])

        chat = ChatWriter(
            streaming=True,
            callback_manager=callback_manager,
            max_tokens=10,
            api_key=SecretStr("test-key"),
        )

        mock_client = MagicMock()
        mock_response = MagicMock()
        mock_response.__iter__.return_value = mock_streaming_chunks
        mock_client.chat.chat.return_value = mock_response

        with patch.object(chat, "client", mock_client):
            message = HumanMessage(content="Hi")
            response = chat.invoke([message])

            assert isinstance(response, AIMessage)
            assert callback_handler.llm_streams > 0
            assert response.content == "Hello!"

    async def test_async_streaming(
        self, mock_streaming_chunks: List[Dict[str, Any]]
    ) -> None:
        """Test async streaming with callback handler."""
        callback_handler = FakeCallbackHandler()
        callback_manager = CallbackManager([callback_handler])

        chat = ChatWriter(
            streaming=True,
            callback_manager=callback_manager,
            max_tokens=10,
            api_key=SecretStr("test-key"),
        )

        mock_client = AsyncMock()
        mock_response = AsyncMock()
        mock_response.__aiter__.return_value = mock_streaming_chunks
        mock_client.chat.chat.return_value = mock_response

        with patch.object(chat, "async_client", mock_client):
            message = HumanMessage(content="Hi")
            response = await chat.ainvoke([message])

            assert isinstance(response, AIMessage)
            assert callback_handler.llm_streams > 0
            assert response.content == "Hello!"

    def test_sync_tool_calling(self, mock_response: Dict[str, Any]) -> None:
        """Test synchronous tool calling functionality."""
        from pydantic import BaseModel, Field

        class GetWeather(BaseModel):
            """Get the weather in a location."""

            location: str = Field(..., description="The location to get weather for")

        mock_client = MagicMock()
        mock_client.chat.chat.return_value = mock_response

        chat = ChatWriter(api_key=SecretStr("test-key"), client=mock_client)

        chat_with_tools = chat.bind_tools(
            tools=[GetWeather],
            tool_choice="GetWeather",
        )

        response = chat_with_tools.invoke("What's the weather in London?")
        assert isinstance(response, AIMessage)
        assert response.tool_calls
        assert response.tool_calls[0]["name"] == "GetWeather"
        assert response.tool_calls[0]["args"]["location"] == "London"

    async def test_async_tool_calling(self, mock_response: Dict[str, Any]) -> None:
        """Test asynchronous tool calling functionality."""
        from pydantic import BaseModel, Field

        class GetWeather(BaseModel):
            """Get the weather in a location."""

            location: str = Field(..., description="The location to get weather for")

        mock_client = AsyncMock()
        mock_client.chat.chat.return_value = mock_response

        chat = ChatWriter(api_key=SecretStr("test-key"), async_client=mock_client)

        chat_with_tools = chat.bind_tools(
            tools=[GetWeather],
            tool_choice="GetWeather",
        )

        response = await chat_with_tools.ainvoke("What's the weather in London?")
        assert isinstance(response, AIMessage)
        assert response.tool_calls
        assert response.tool_calls[0]["name"] == "GetWeather"
        assert response.tool_calls[0]["args"]["location"] == "London"
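To run just this new suite (rather than the full `make test` target listed in the commit message), one option is to point pytest at the module directly; a sketch assuming the working directory is `libs/community`:

```python
# Sketch: run only the new Writer unit tests (assumes cwd is libs/community).
import sys

import pytest

sys.exit(pytest.main(["-q", "tests/unit_tests/chat_models/test_writer.py"]))
```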