langchain/docs/extras/modules/model_io/models/llms/integrations/chatglm.ipynb

{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatGLM\n",
"\n",
"[ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B) is an open bilingual language model based on General Language Model (GLM) framework, with 6.2 billion parameters. With the quantization technique, users can deploy locally on consumer-grade graphics cards (only 6GB of GPU memory is required at the INT4 quantization level). \n",
"\n",
"[ChatGLM2-6B](https://github.com/THUDM/ChatGLM2-6B) is the second-generation version of the open-source bilingual (Chinese-English) chat model ChatGLM-6B. It retains the smooth conversation flow and low deployment threshold of the first-generation model, while introducing the new features like better performance, longer context and more efficient inference.\n",
"\n",
"This example goes over how to use LangChain to interact with ChatGLM2-6B Inference for text completion.\n",
"ChatGLM-6B and ChatGLM2-6B has the same api specs, so this example should work with both."
]
},
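{
"cell_type": "markdown",
"metadata": {},
"source": [
"The wrapper below assumes a ChatGLM API server is already running. One common way to start one (an assumption here, not covered by this notebook) is to run the `api.py` script shipped with the ChatGLM-6B / ChatGLM2-6B repositories, which serves on `http://127.0.0.1:8000` by default.\n",
"\n",
"As a rough sketch of that API (the field names follow the payload the wrapper prints later in this notebook; adjust them if your server differs), a raw request could look like this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"# Sketch of a raw request against a locally running ChatGLM API server.\n",
"resp = requests.post(\n",
"    \"http://127.0.0.1:8000\",\n",
"    json={\n",
"        \"prompt\": \"Hello\",\n",
"        \"history\": [],\n",
"        \"max_length\": 8000,\n",
"        \"top_p\": 0.9,\n",
"        \"temperature\": 0.1,\n",
"    },\n",
")\n",
"print(resp.json())"
]
},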
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import ChatGLM\n",
"from langchain import PromptTemplate, LLMChain\n",
"\n",
"# import os"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [],
"source": [
"template = \"\"\"{question}\"\"\"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [],
"source": [
"# default endpoint_url for a local deployed ChatGLM api server\n",
"endpoint_url = \"http://127.0.0.1:8000\"\n",
"\n",
"# direct access endpoint in a proxied environment\n",
"# os.environ['NO_PROXY'] = '127.0.0.1'\n",
"\n",
"llm = ChatGLM(\n",
" endpoint_url=endpoint_url,\n",
" max_token=80000,\n",
" history=[[\"我将从美国到中国来旅游,出行前希望了解中国的城市\", \"欢迎问我任何问题。\"]],\n",
" top_p=0.9,\n",
" model_kwargs={\"sample_model_args\": False},\n",
")"
]
},
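{
"cell_type": "markdown",
"metadata": {},
"source": [
"`ChatGLM` behaves like any other LangChain LLM, so it can also be called directly on a prompt string (a quick sketch; it requires the API server above to be reachable). The rest of this notebook wraps it in an `LLMChain` instead:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Direct call on the LLM, without a chain; uncomment to try it against a running server.\n",
"# llm(\"What cities should I visit in China?\")"
]
},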
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [],
"source": [
"llm_chain = LLMChain(prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ChatGLM payload: {'prompt': '北京和上海两座城市有什么不同?', 'temperature': 0.1, 'history': [['我将从美国到中国来旅游,出行前希望了解中国的城市', '欢迎问我任何问题。']], 'max_length': 80000, 'top_p': 0.9, 'sample_model_args': False}\n"
]
},
{
"data": {
"text/plain": [
"'北京和上海是中国的两个首都,它们在许多方面都有所不同。\\n\\n北京是中国的政治和文化中心拥有悠久的历史和灿烂的文化。它是中国最重要的古都之一也是中国历史上最后一个封建王朝的都城。北京有许多著名的古迹和景点例如紫禁城、天安门广场和长城等。\\n\\n上海是中国最现代化的城市之一也是中国商业和金融中心。上海拥有许多国际知名的企业和金融机构同时也有许多著名的景点和美食。上海的外滩是一个历史悠久的商业区拥有许多欧式建筑和餐馆。\\n\\n除此之外北京和上海在交通和人口方面也有很大差异。北京是中国的首都人口众多交通拥堵问题较为严重。而上海是中国的商业和金融中心人口密度较低交通相对较为便利。\\n\\n总的来说北京和上海是两个拥有独特魅力和特点的城市可以根据自己的兴趣和时间来选择前往其中一座城市旅游。'"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"question = \"北京和上海两座城市有什么不同?\"\n",
"\n",
"llm_chain.run(question)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "langchain-dev",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
}