{
"cells": [
{
"cell_type": "raw",
"metadata": {},
"source": [
"---\n",
"sidebar_label: ZHIPU AI\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# ZHIPU AI\n",
"\n",
"This notebook shows how to use [ZHIPU AI API](https://open.bigmodel.cn/dev/api) in LangChain with the langchain.chat_models.ChatZhipuAI.\n",
"\n",
">[*GLM-4*](https://open.bigmodel.cn/) is a multi-lingual large language model aligned with human intent, featuring capabilities in Q&A, multi-turn dialogue, and code generation. The overall performance of the new generation base model GLM-4 has been significantly improved compared to the previous generation, supporting longer contexts; Stronger multimodality; Support faster inference speed, more concurrency, greatly reducing inference costs; Meanwhile, GLM-4 enhances the capabilities of intelligent agents.\n",
"\n",
"## Getting started\n",
"### Installation\n",
"First, ensure the zhipuai package is installed in your Python environment. Run the following command:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#!pip install --upgrade httpx httpx-sse PyJWT"
]
},
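{
"cell_type": "markdown",
"metadata": {},
"source": [
"If the imports below fail, you may also need the `langchain-community` package, which provides the `ChatZhipuAI` integration (skip this step if it is already installed in your environment):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#!pip install --upgrade langchain-community"
]
},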
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Importing the Required Modules\n",
"After installation, import the necessary modules to your Python script:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models import ChatZhipuAI\n",
"from langchain_core.messages import AIMessage, HumanMessage, SystemMessage"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Setting Up Your API Key\n",
"Sign in to [ZHIPU AI](https://open.bigmodel.cn/login?redirect=%2Fusercenter%2Fapikeys) for the an API Key to access our models."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"os.environ[\"ZHIPUAI_API_KEY\"] = \"zhipuai_api_key\""
]
},
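{
"cell_type": "markdown",
"metadata": {},
"source": [
"Alternatively, you can avoid hard-coding the key by reading it interactively with `getpass` (a minimal sketch, assuming an interactive session):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"import os\n",
"\n",
"# Prompt for the key only if it is not already set in the environment\n",
"if \"ZHIPUAI_API_KEY\" not in os.environ:\n",
"    os.environ[\"ZHIPUAI_API_KEY\"] = getpass.getpass(\"Enter your ZHIPU AI API key: \")"
]
},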
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Initialize the ZHIPU AI Chat Model\n",
"Here's how to initialize the chat model:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"chat = ChatZhipuAI(\n",
" model=\"glm-4\",\n",
" temperature=0.5,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Basic Usage\n",
"Invoke the model with system and human messages like this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"messages = [\n",
" AIMessage(content=\"Hi.\"),\n",
" SystemMessage(content=\"Your role is a poet.\"),\n",
" HumanMessage(content=\"Write a short poem about AI in four lines.\"),\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = chat(messages)\n",
"print(response.content) # Displays the AI-generated poem"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Advanced Features\n",
"### Streaming Support\n",
"For continuous interaction, use the streaming feature:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.callbacks.manager import CallbackManager\n",
"from langchain_core.callbacks.streaming_stdout import StreamingStdOutCallbackHandler"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"streaming_chat = ChatZhipuAI(\n",
" model=\"glm-4\",\n",
" temperature=0.5,\n",
" streaming=True,\n",
" callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"streaming_chat(messages)"
]
},
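{
"cell_type": "markdown",
"metadata": {},
"source": [
"Recent LangChain versions also expose a `.stream()` method on chat models, which yields message chunks without a callback manager (a minimal sketch; availability depends on your installed versions):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Stream the response chunk by chunk and print it as it arrives\n",
"for chunk in chat.stream(messages):\n",
"    print(chunk.content, end=\"\", flush=True)"
]
},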
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Asynchronous Calls\n",
"For non-blocking calls, use the asynchronous approach:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"async_chat = ChatZhipuAI(\n",
" model=\"glm-4\",\n",
" temperature=0.5,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"response = await async_chat.agenerate([messages])\n",
"print(response)"
]
},
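{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you only need a single response rather than the full generation metadata, `ainvoke` is a lighter-weight alternative (a minimal sketch using the same `async_chat` instance):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# ainvoke returns a single AIMessage instead of an LLMResult\n",
"response = await async_chat.ainvoke(messages)\n",
"print(response.content)"
]
},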
{
"cell_type": "markdown",
"source": [
"### Using With Functions Call\n",
"\n",
"GLM-4 Model can be used with the function call as welluse the following code to run a simple LangChain json_chat_agent."
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"outputs": [],
"source": [
"os.environ[\"TAVILY_API_KEY\"] = \"tavily_api_key\""
],
"metadata": {
"collapsed": false
},
"execution_count": null
},
{
"cell_type": "code",
"outputs": [],
"source": [
"from langchain import hub\n",
"from langchain.agents import AgentExecutor, create_json_chat_agent\n",
"from langchain_community.tools.tavily_search import TavilySearchResults\n",
"\n",
"tools = [TavilySearchResults(max_results=1)]\n",
"prompt = hub.pull(\"hwchase17/react-chat-json\")\n",
"llm = ChatZhipuAI(temperature=0.01, model=\"glm-4\")\n",
"\n",
"agent = create_json_chat_agent(llm, tools, prompt)\n",
"agent_executor = AgentExecutor(\n",
" agent=agent, tools=tools, verbose=True, handle_parsing_errors=True\n",
")"
],
"metadata": {
"collapsed": false
},
"execution_count": null
},
{
"cell_type": "code",
"outputs": [],
"source": [
"agent_executor.invoke({\"input\": \"what is LangChain?\"})"
],
"metadata": {
"collapsed": false
},
"execution_count": null
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
}
},
"nbformat": 4,
"nbformat_minor": 4
}