{ "cells": [ { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "# Context\n", "\n", "![Context - Product Analytics for AI Chatbots](https://go.getcontext.ai/langchain.png)\n", "\n", "[Context](https://getcontext.ai/) provides product analytics for AI chatbots.\n", "\n", "Context helps you understand how users are interacting with your AI chat products.\n", "Gain critical insights, optimise poor experiences, and minimise brand risks.\n" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "In this guide we will show you how to integrate with Context." ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "## Installation and Setup" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "vscode": { "languageId": "shellscript" } }, "outputs": [], "source": [ "$ pip install context-python --upgrade" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "### Getting API Credentials\n", "\n", "To get your Context API token:\n", "\n", "1. Go to the settings page within your Context account (https://go.getcontext.ai/settings).\n", "2. Generate a new API Token.\n", "3. Store this token somewhere secure." ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "### Setup Context\n", "\n", "To use the `ContextCallbackHandler`, import the handler from Langchain and instantiate it with your Context API token.\n", "\n", "Ensure you have installed the `context-python` package before using the handler." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "from langchain.callbacks import ContextCallbackHandler\n", "\n", "token = os.environ[\"CONTEXT_API_TOKEN\"]\n", "\n", "context_callback = ContextCallbackHandler(token)" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "## Usage\n", "### Using the Context callback within a Chat Model\n", "\n", "The Context callback handler can be used to directly record transcripts between users and AI assistants.\n", "\n", "#### Example" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "from langchain.chat_models import ChatOpenAI\n", "from langchain.schema import (\n", " SystemMessage,\n", " HumanMessage,\n", ")\n", "from langchain.callbacks import ContextCallbackHandler\n", "\n", "token = os.environ[\"CONTEXT_API_TOKEN\"]\n", "\n", "chat = ChatOpenAI(\n", " headers={\"user_id\": \"123\"}, temperature=0, callbacks=[ContextCallbackHandler(token)]\n", ")\n", "\n", "messages = [\n", " SystemMessage(\n", " content=\"You are a helpful assistant that translates English to French.\"\n", " ),\n", " HumanMessage(content=\"I love programming.\"),\n", "]\n", "\n", "print(chat(messages))" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "### Using the Context callback within Chains\n", "\n", "The Context callback handler can also be used to record the inputs and outputs of chains. 
Note that intermediate steps of the chain are not recorded: only the starting inputs and final outputs are captured.\n", "\n", "__Note:__ Ensure that you pass the same `ContextCallbackHandler` instance to the chat model and the chain.\n", "\n", "Wrong:\n", "> ```python\n", "> chat = ChatOpenAI(temperature=0.9, callbacks=[ContextCallbackHandler(token)])\n", "> chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[ContextCallbackHandler(token)])\n", "> ```\n", "\n", "Correct:\n", "> ```python\n", "> handler = ContextCallbackHandler(token)\n", "> chat = ChatOpenAI(temperature=0.9, callbacks=[handler])\n", "> chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[handler])\n", "> ```\n", "\n", "#### Example" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "from langchain.chat_models import ChatOpenAI\n", "from langchain.chains import LLMChain\n", "from langchain.prompts import PromptTemplate\n", "from langchain.prompts.chat import (\n", "    ChatPromptTemplate,\n", "    HumanMessagePromptTemplate,\n", ")\n", "from langchain.callbacks import ContextCallbackHandler\n", "\n", "token = os.environ[\"CONTEXT_API_TOKEN\"]\n", "\n", "human_message_prompt = HumanMessagePromptTemplate(\n", "    prompt=PromptTemplate(\n", "        template=\"What is a good name for a company that makes {product}?\",\n", "        input_variables=[\"product\"],\n", "    )\n", ")\n", "chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])\n", "callback = ContextCallbackHandler(token)\n", "chat = ChatOpenAI(temperature=0.9, callbacks=[callback])\n", "chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[callback])\n", "print(chain.run(\"colorful socks\"))" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.3" }, "vscode": { "interpreter": { "hash": "a53ebf4a859167383b364e7e7521d0add3c2dbbdecce4edf676e8c4634ff3fbb" } } }, "nbformat": 4, "nbformat_minor": 4 }