{
"cells": [
{
"cell_type": "markdown",
"id": "b4462a94",
"metadata": {},
"source": [
"# Manifest\n",
"\n",
"This notebook goes over how to use Manifest and LangChain."
]
},
{
"cell_type": "markdown",
"id": "59fcaebc",
"metadata": {},
"source": [
"For more detailed information on `manifest`, and how to use it with local hugginface models like in this example, see https://github.com/HazyResearch/manifest\n",
"\n",
"Another example of [using Manifest with Langchain](https://github.com/HazyResearch/manifest/blob/main/examples/langchain_chatgpt.ipynb)."
]
},
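{
"cell_type": "markdown",
"id": "a4b21d7e",
"metadata": {},
"source": [
"The examples below assume one or more Manifest servers are already running locally and serving Hugging Face models (the connection strings point at `http://127.0.0.1:5000`, `:5001`, and `:5002`). See the [Manifest repository](https://github.com/HazyResearch/manifest) for how to launch a local model server."
]
},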
{
"cell_type": "code",
"execution_count": null,
"id": "1205d1e4-e6da-4d67-a0c7-b7e8fd1e98d5",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"!pip install manifest-ml"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "04a0170a",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from manifest import Manifest\n",
"from langchain.llms.manifest import ManifestWrapper"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "de250a6a",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"manifest = Manifest(\n",
" client_name = \"huggingface\",\n",
" client_connection = \"http://127.0.0.1:5000\"\n",
")\n",
"print(manifest.client.get_model_params())"
]
},
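{
"cell_type": "markdown",
"id": "de250a7b",
"metadata": {},
"source": [
"As a quick, optional sanity check, the Manifest client can be queried directly before it is wrapped for LangChain. The sketch below uses Manifest's `run` method (its basic completion call, per the Manifest repository) with an arbitrary prompt."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "de250a7c",
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check (sketch): send a prompt straight to the Manifest client.\n",
"# `run` is Manifest's basic completion call; the prompt is arbitrary.\n",
"manifest.run(\"Why is the sky blue?\")"
]
},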
{
"cell_type": "code",
"execution_count": 5,
"id": "67b719d6",
"metadata": {},
"outputs": [],
"source": [
"llm = ManifestWrapper(client=manifest, llm_kwargs={\"temperature\": 0.001, \"max_tokens\": 256})"
]
},
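{
"cell_type": "markdown",
"id": "67b719e7",
"metadata": {},
"source": [
"`ManifestWrapper` behaves like any other LangChain LLM, so (assuming the local server above is running) it can be called directly with a prompt string:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "67b719e8",
"metadata": {},
"outputs": [],
"source": [
"# Quick check (sketch): call the wrapper like any other LangChain LLM.\n",
"llm(\"What color is a flamingo?\")"
]
},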
{
"cell_type": "code",
"execution_count": 6,
"id": "5af505a8",
"metadata": {},
"outputs": [],
"source": [
"# Map reduce example\n",
"from langchain import PromptTemplate\n",
"from langchain.text_splitter import CharacterTextSplitter\n",
"from langchain.chains.mapreduce import MapReduceChain\n",
"\n",
"\n",
"_prompt = \"\"\"Write a concise summary of the following:\n",
"\n",
"\n",
"{text}\n",
"\n",
"\n",
"CONCISE SUMMARY:\"\"\"\n",
"prompt = PromptTemplate(template=_prompt, input_variables=[\"text\"])\n",
"\n",
"text_splitter = CharacterTextSplitter()\n",
"\n",
"mp_chain = MapReduceChain.from_params(llm, prompt, text_splitter)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "485b3ec3",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'President Obama delivered his annual State of the Union address on Tuesday night, laying out his priorities for the coming year. Obama said the government will provide free flu vaccines to all Americans, ending the government shutdown and allowing businesses to reopen. The president also said that the government will continue to send vaccines to 112 countries, more than any other nation. \"We have lost so much to COVID-19,\" Trump said. \"Time with one another. And worst of all, so much loss of life.\" He said the CDC is working on a vaccine for kids under 5, and that the government will be ready with plenty of vaccines when they are available. Obama says the new guidelines are a \"great step forward\" and that the virus is no longer a threat. He says the government is launching a \"Test to Treat\" initiative that will allow people to get tested at a pharmacy and get antiviral pills on the spot at no cost. Obama says the new guidelines are a \"great step forward\" and that the virus is no longer a threat. He says the government will continue to send vaccines to 112 countries, more than any other nation. \"We are coming for your'"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"with open('../../../state_of_the_union.txt') as f:\n",
" state_of_the_union = f.read()\n",
"mp_chain.run(state_of_the_union)"
]
},
{
"cell_type": "markdown",
"id": "6e9d45a8",
"metadata": {},
"source": [
"## Compare HF Models"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "33407ab3",
"metadata": {},
"outputs": [],
"source": [
"from langchain.model_laboratory import ModelLaboratory\n",
"\n",
"manifest1 = ManifestWrapper(\n",
" client=Manifest(\n",
" client_name=\"huggingface\",\n",
" client_connection=\"http://127.0.0.1:5000\"\n",
" ),\n",
" llm_kwargs={\"temperature\": 0.01}\n",
")\n",
"manifest2 = ManifestWrapper(\n",
" client=Manifest(\n",
" client_name=\"huggingface\",\n",
" client_connection=\"http://127.0.0.1:5001\"\n",
" ),\n",
" llm_kwargs={\"temperature\": 0.01}\n",
")\n",
"manifest3 = ManifestWrapper(\n",
" client=Manifest(\n",
" client_name=\"huggingface\",\n",
" client_connection=\"http://127.0.0.1:5002\"\n",
" ),\n",
" llm_kwargs={\"temperature\": 0.01}\n",
")\n",
"llms = [manifest1, manifest2, manifest3]\n",
"model_lab = ModelLaboratory(llms)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "448935c3",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[1mInput:\u001b[0m\n",
"What color is a flamingo?\n",
"\n",
"\u001b[1mManifestWrapper\u001b[0m\n",
"Params: {'model_name': 'bigscience/T0_3B', 'model_path': 'bigscience/T0_3B', 'temperature': 0.01}\n",
"\u001b[104mpink\u001b[0m\n",
"\n",
"\u001b[1mManifestWrapper\u001b[0m\n",
"Params: {'model_name': 'EleutherAI/gpt-neo-125M', 'model_path': 'EleutherAI/gpt-neo-125M', 'temperature': 0.01}\n",
"\u001b[103mA flamingo is a small, round\u001b[0m\n",
"\n",
"\u001b[1mManifestWrapper\u001b[0m\n",
"Params: {'model_name': 'google/flan-t5-xl', 'model_path': 'google/flan-t5-xl', 'temperature': 0.01}\n",
"\u001b[101mpink\u001b[0m\n",
"\n"
]
}
],
"source": [
"model_lab.compare(\"What color is a flamingo?\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"vscode": {
"interpreter": {
"hash": "51b9b5b89a4976ad21c8b4273a6c78d700e2954ce7d7452948b7774eb33bbce4"
}
}
},
"nbformat": 4,
"nbformat_minor": 5
}