openai-cookbook/apps/enterprise-knowledge-retrieval/enterprise_knowledge_retrieval.ipynb

{
"cells": [
{
"cell_type": "markdown",
"id": "685d4507",
"metadata": {},
"source": [
"# Enterprise Knowledge Retrieval\n",
"\n",
"This notebook contains an end-to-end workflow to set up an Enterprise Knowledge Retrieval solution from scratch.\n",
"\n",
"### Problem Statement\n",
"\n",
"LLMs have great conversational ability, but their knowledge is general and often out of date. Relevant knowledge often exists, but is kept in disparate datastores that are hard to surface with current search solutions.\n",
"\n",
"\n",
"### Objective\n",
"\n",
"We want to deliver an outstanding user experience where the user is presented with the right knowledge, when they need it, in a clear and conversational way. To accomplish this we need an LLM-powered solution that knows our organizational context and data and can retrieve the right knowledge when the user needs it. \n"
]
},
{
"cell_type": "markdown",
"id": "8eab9aae",
"metadata": {},
"source": [
"## Solution\n",
"\n",
"![title](img/enterprise_knowledge_retrieval.png)\n",
"\n",
"We'll build a knowledge retrieval solution that will embed a corpus of knowledge (in our case a database of Wikipedia manuals) and use it to answer user questions.\n",
"\n",
"### Learning Path\n",
"\n",
"#### Walkthrough\n",
"\n",
"You can follow this solution walkthrough either via the video recorded here or the text walkthrough below. We'll build out the solution in the following stages:\n",
"- **Setup:** Initialise variables and connect to a vector database.\n",
"- **Storage:** Configure the database, prepare our data and store embeddings and metadata for retrieval.\n",
"- **Search:** Extract relevant documents back out with a basic search function and use an LLM to summarise results into a concise reply.\n",
"- **Answer:** Add a more sophisticated agent which will process the user's query and maintain a memory for follow-up questions.\n",
"- **Evaluate:** Take a sample of evaluated question/answer pairs generated using our service and plot them to scope out remedial action."
]
},
{
"cell_type": "markdown",
"id": "ae9b1412",
"metadata": {},
"source": [
"## Walkthrough"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "4e85be52",
"metadata": {},
"outputs": [],
"source": [
"%load_ext autoreload\n",
"%autoreload 2"
]
},
{
"cell_type": "markdown",
"id": "ab1a0a6a",
"metadata": {},
"source": [
"## Setup\n",
"\n",
"Import libraries and set up a connection to a Redis vector database for our knowledge base.\n",
"\n",
"You can substitute Redis for any other vectorstore or database - there is a [selection](https://python.langchain.com/en/latest/modules/indexes/vectorstores.html) supported natively by Langchain, while you'll need to develop other connectors yourself."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "c79535f1",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: redis in /opt/homebrew/lib/python3.11/site-packages (4.5.5)\r\n"
]
}
],
"source": [
"!pip install redis"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "cd8e3d30",
"metadata": {},
"outputs": [],
"source": [
"from ast import literal_eval\n",
"import concurrent\n",
"import openai\n",
"import os\n",
"import numpy as np\n",
"from numpy import array, average\n",
"import pandas as pd\n",
"from tenacity import retry, wait_random_exponential, stop_after_attempt\n",
"import tiktoken\n",
"from tqdm import tqdm\n",
"from typing import List, Iterator\n",
"import wget\n",
"\n",
"# Redis imports\n",
"from redis import Redis as r\n",
"from redis.commands.search.query import Query\n",
"from redis.commands.search.field import (\n",
" TextField,\n",
" VectorField,\n",
" NumericField\n",
")\n",
"from redis.commands.search.indexDefinition import (\n",
" IndexDefinition,\n",
" IndexType\n",
")\n",
"\n",
"# Langchain imports\n",
"from langchain.embeddings import OpenAIEmbeddings\n",
"from langchain.chains import RetrievalQA\n",
"\n",
"CHAT_MODEL = \"gpt-3.5-turbo\""
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "53641bc5",
"metadata": {},
"outputs": [],
"source": [
"pd.set_option('display.max_colwidth', 0)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "6fbde85b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\r",
" 86% [............................................................... ] 3850240 / 4470649\r",
" 86% [............................................................... ] 3858432 / 4470649\r",
" 86% [................................................................ ] 3866624 / 4470649\r",
" 86% [................................................................ ] 3874816 / 4470649\r",
" 86% [................................................................ ] 3883008 / 4470649\r",
" 87% [................................................................ ] 3891200 / 4470649\r",
" 87% [................................................................ ] 3899392 / 4470649\r",
" 87% [................................................................ ] 3907584 / 4470649\r",
" 87% [................................................................ ] 3915776 / 4470649\r",
" 87% [................................................................ ] 3923968 / 4470649\r",
" 87% [................................................................. ] 3932160 / 4470649\r",
" 88% [................................................................. ] 3940352 / 4470649\r",
" 88% [................................................................. ] 3948544 / 4470649\r",
" 88% [................................................................. ] 3956736 / 4470649\r",
" 88% [................................................................. ] 3964928 / 4470649\r",
" 88% [................................................................. ] 3973120 / 4470649\r",
" 89% [................................................................. ] 3981312 / 4470649\r",
" 89% [.................................................................. ] 3989504 / 4470649\r",
" 89% [.................................................................. ] 3997696 / 4470649\r",
" 89% [.................................................................. ] 4005888 / 4470649\r",
" 89% [.................................................................. ] 4014080 / 4470649\r",
" 89% [.................................................................. ] 4022272 / 4470649\r",
" 90% [.................................................................. ] 4030464 / 4470649\r",
" 90% [.................................................................. ] 4038656 / 4470649\r",
" 90% [.................................................................. ] 4046848 / 4470649\r",
" 90% [................................................................... ] 4055040 / 4470649\r",
" 90% [................................................................... ] 4063232 / 4470649\r",
" 91% [................................................................... ] 4071424 / 4470649\r",
" 91% [................................................................... ] 4079616 / 4470649\r",
" 91% [................................................................... ] 4087808 / 4470649\r",
" 91% [................................................................... ] 4096000 / 4470649\r",
" 91% [................................................................... ] 4104192 / 4470649"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\r",
" 91% [.................................................................... ] 4112384 / 4470649\r",
" 92% [.................................................................... ] 4120576 / 4470649\r",
" 92% [.................................................................... ] 4128768 / 4470649\r",
" 92% [.................................................................... ] 4136960 / 4470649\r",
" 92% [.................................................................... ] 4145152 / 4470649\r",
" 92% [.................................................................... ] 4153344 / 4470649\r",
" 93% [.................................................................... ] 4161536 / 4470649\r",
" 93% [..................................................................... ] 4169728 / 4470649\r",
" 93% [..................................................................... ] 4177920 / 4470649\r",
" 93% [..................................................................... ] 4186112 / 4470649\r",
" 93% [..................................................................... ] 4194304 / 4470649\r",
" 94% [..................................................................... ] 4202496 / 4470649\r",
" 94% [..................................................................... ] 4210688 / 4470649\r",
" 94% [..................................................................... ] 4218880 / 4470649\r",
" 94% [..................................................................... ] 4227072 / 4470649\r",
" 94% [...................................................................... ] 4235264 / 4470649\r",
" 94% [...................................................................... ] 4243456 / 4470649\r",
" 95% [...................................................................... ] 4251648 / 4470649\r",
" 95% [...................................................................... ] 4259840 / 4470649\r",
" 95% [...................................................................... ] 4268032 / 4470649\r",
" 95% [...................................................................... ] 4276224 / 4470649\r",
" 95% [...................................................................... ] 4284416 / 4470649\r",
" 96% [....................................................................... ] 4292608 / 4470649\r",
" 96% [....................................................................... ] 4300800 / 4470649\r",
" 96% [....................................................................... ] 4308992 / 4470649\r",
" 96% [....................................................................... ] 4317184 / 4470649\r",
" 96% [....................................................................... ] 4325376 / 4470649\r",
" 96% [....................................................................... ] 4333568 / 4470649\r",
" 97% [....................................................................... ] 4341760 / 4470649\r",
" 97% [........................................................................ ] 4349952 / 4470649\r",
" 97% [........................................................................ ] 4358144 / 4470649\r",
" 97% [........................................................................ ] 4366336 / 4470649\r",
" 97% [........................................................................ ] 4374528 / 4470649\r",
" 98% [........................................................................ ] 4382720 / 4470649\r",
" 98% [........................................................................ ] 4390912 / 4470649\r",
" 98% [........................................................................ ] 4399104 / 4470649\r",
" 98% [........................................................................ ] 4407296 / 4470649\r",
" 98% [......................................................................... ] 4415488 / 4470649\r",
" 98% [......................................................................... ] 4423680 / 4470649\r",
" 99% [......................................................................... ] 4431872 / 4470649\r",
" 99% [......................................................................... ] 4440064 / 4470649\r",
" 99% [......................................................................... ] 4448256 / 4470649\r",
" 99% [......................................................................... ] 4456448 / 4470649\r",
" 99% [......................................................................... ] 4464640 / 4470649\r",
"100% [..........................................................................] 4470649 / 4470649"
]
},
{
"data": {
"text/plain": [
"'wikipedia_articles_2000.csv'"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"embeddings_url = 'https://cdn.openai.com/API/examples/data/wikipedia_articles_2000.csv'\n",
"\n",
"# The file is ~4.5 MB so the download should only take a moment\n",
"wget.download(embeddings_url)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "5b873693",
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>Unnamed: 0</th>\n",
" <th>id</th>\n",
" <th>url</th>\n",
" <th>title</th>\n",
" <th>text</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>878</td>\n",
" <td>3661</td>\n",
" <td>https://simple.wikipedia.org/wiki/Photon</td>\n",
" <td>Photon</td>\n",
"      <td>Photons (from Greek φως, meaning light), in many atomic models in physics, are particles which transmit light. In other words, light is carried over space by photons. Photon is an elementary particle that is its own antiparticle. In quantum mechanics each photon has a characteristic quantum of energy that depends on frequency: A photon associated with light at a higher frequency will have more energy (and be associated with light at a shorter wavelength).\\n\\nPhotons have a rest mass of 0 (zero). However, Einstein's theory of relativity says that they do have a certain amount of momentum. Before the photon got its name, Einstein revived the proposal that light is separate pieces of energy (particles). These particles came to be known as photons. \\n\\nA photon is usually given the symbol γ (gamma),\\n\\nProperties \\n\\nPhotons are fundamental particles. Although they can be created and destroyed, their lifetime is infinite.\\n\\nIn a vacuum, all photons move at the speed of light, c, which is equal to 299,792,458 meters (approximately 300,000 kilometers) per second.\\n\\nA photon has a given frequency, which determines its color. Radio technology makes great use of frequency. Beyond the visible range, frequency is less discussed, for example it is little used in distinguishing between X-Ray photons and infrared. Frequency is equivalent to the quantum energy of the photon, as related by the Planck constant equation,\\n\\n,\\n\\nwhere is the photon's energy, is the Plank constant, and is the frequency of the light associated with the photon. This frequency, , is typically measured in cycles per second, or equivalently, in Hz. The quantum energy of different photons is often used in cameras, and other machines that use visible and higher than visible radiation. This because these photons are energetic enough to ionize atoms. \\n\\nAnother property of a photon is its wavelength. The frequency , wavelength , and speed of light are related by the equation,\\n\\n,\\n\\nwhere (lambda) is the wavelength, or length of the wave (typically measured in meters.)\\n\\nAnother important property of a photon is its polarity. If you saw a giant photon coming straight at you, it could appear as a swath whipping vertically, horizontally, or somewhere in between. Polarized sunglasses stop photons swinging up and down from passing. This is how they reduce glare as light bouncing off of surfaces tend to fly that way. Liquid crystal displays also use polarity to control which light passes through. Some animals can see light polarization. \\n\\nFinally, a photon has a property called spin. Spin is related to light's circular polarization.\\n\\nPhoton interactions with matter\\nLight is often created or absorbed when an electron gains or loses energy. This energy can be in the form of heat, kinetic energy, or other form. For example, an incandescent light bulb uses heat. The increase of energy can push an electron up one level in a shell called a \"valence\". This makes it unstable, and like everything, it wants to be in the lowest energy state. (If being in the lowest energy state is confusing, pick up a pencil and drop it. Once on the ground, the pencil will be in a lower energy state). When the electron drops back down to a lower energy state, it needs to release the energy that hit it, and it must obey the conservation of energy (energy can neither be created nor destroyed). Electrons release this energy as photons, and at higher intensities, this photon can be seen as visible light.\\n\\nPhotons and the electromagnetic force\\nIn particle physics, photons are responsible for electromagnetic force. Electromagnetism is an idea that combines electricity with magnetism. One common way that we experience electromagnetism in our daily lives is light, which is caused by electromagnetism. Electromagnetism is also responsible for charge, which is the reason that you can not push your hand through a table. Since photons are the force-carrying particle of electromagnetism, they are also gauge bosons. Some mattercal</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>2425</td>\n",
" <td>7796</td>\n",
" <td>https://simple.wikipedia.org/wiki/Thomas%20Dolby</td>\n",
" <td>Thomas Dolby</td>\n",
" <td>Thomas Dolby (born Thomas Morgan Robertson; 14 October 1958) is a British musican and computer designer. He is probably most famous for his 1982 hit, \"She Blinded me with Science\".\\n\\nHe married actress Kathleen Beller in 1988. The couple have three children together.\\n\\nDiscography\\n\\nSingles\\n\\nA Track did not chart in North America until 1983, after the success of \"She Blinded Me With Science\".\\n\\nAlbums\\n\\nStudio albums\\n\\nEPs\\n\\nReferences\\n\\nEnglish musicians\\nLiving people\\n1958 births\\nNew wave musicians\\nWarner Bros. Records artists</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>18059</td>\n",
" <td>67912</td>\n",
" <td>https://simple.wikipedia.org/wiki/Embroidery</td>\n",
" <td>Embroidery</td>\n",
" <td>Embroidery is the art of decorating fabric or other materials with designs stitched in strands of thread or yarn using a needle. Embroidery may also incorporate other materials such as metal strips, pearls, beads, quills, and sequins. Sewing machines can be used to create machine embroidery.\\n\\nQualifications \\nCity and Guilds qualification in Embroidery allows embroiderers to become recognized for their skill. This qualification also gives them the credibility to teach. For example, the notable textiles artist, Kathleen Laurel Sage, began her teaching career by getting the City and Guilds Embroidery 1 and 2 qualifications. She has now gone on to write a book on the subject.\\n\\nReferences\\n\\nOther websites\\n The Crimson Thread of Kinship at the National Museum of Australia\\n\\nNeedlework</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>12045</td>\n",
" <td>44309</td>\n",
" <td>https://simple.wikipedia.org/wiki/Consecutive%20integer</td>\n",
" <td>Consecutive integer</td>\n",
" <td>Consecutive numbers are numbers that follow each other in order. They have a difference of 1 between every two numbers. In a set of consecutive numbers, the mean and the median are equal. \\n\\nIf n is a number, then the next numbers will be n+1 and n+2. \\n\\nExamples \\n\\nConsecutive numbers that follow each other in order:\\n\\n 1, 2, 3, 4, 5\\n -3, 2, 1, 0, 1, 2, 3, 4\\n 6, 7, 8, 9, 10, 11, 12, 13\\n\\nConsecutive even numbers \\nConsecutive even numbers are even numbers that follow each other. They have a difference of 2 between every two numbers.\\n\\nIf n is an even integer, then n, n+2, n+4 and n+6 will be consecutive even numbers.\\n\\nFor example - 2,4,6,8,10,12,14,18 etc.\\n\\nConsecutive odd numbers\\nConsecutive odd numbers are odd numbers that follow each other. Like consecutive odd numbers, they have a difference of 2 between every two numbers.\\n\\nIf n is an odd integer, then n, n+2, n+4 and n+6 will be consecutive odd numbers.\\n\\nExamples\\n\\n3, 5, 7, 9, 11, 13, etc.\\n\\n23, 21, 19, 17, 15, -13, -11\\n\\nIntegers</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>11477</td>\n",
" <td>41741</td>\n",
" <td>https://simple.wikipedia.org/wiki/German%20Empire</td>\n",
" <td>German Empire</td>\n",
" <td>The German Empire (\"Deutsches Reich\" or \"Deutsches Kaiserreich\" in the German language) is the name for a group of German countries from January 18, 1871 to November 9, 1918. This is from the Unification of Germany when Wilhelm I of Prussia was made German Kaiser to when the third Emperor Wilhelm II was removed from power at the end of the First World War. In the 1920s, German nationalists started to call it the \"Second Reich\".\\n\\nThe name of Germany was \"Deutsches Reich\" until 1945. \"Reich\" can mean many things, empire, kingdom, state, \"richness\" or \"wealth\". Most members of the Empire were previously members of the North German Confederation. \\n\\nAt different times, there were three groups of smaller countries, each group was later called a \"Reich\" by some Germans. The first was the Holy Roman Empire. The second was the German Empire. The third was the Third Reich.\\n\\nThe words \"Second Reich\" were used for the German Empire by Arthur Moeller van den Bruck, a nationalist writer in the 1920s. He was trying to make a link with the earlier Holy Roman Empire which had once been very strong. Germany had lost First World War and was suffering big problems. van den Bruck wanted to start a \"Third Reich\" to unite the country. These words were later used by the Nazis to make themselves appear stronger.\\n\\nStates in the Empire\\n\\nRelated pages\\n Germany\\n Holy Roman Empire\\n Nazi Germany, or \"Drittes Reich\"\\n\\n1870s establishments in Germany\\n \\nStates and territories disestablished in the 20th century\\nStates and territories established in the 19th century\\n1871 establishments in Europe\\n1918 disestablishments in Germany</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" Unnamed: 0 id url \\\n",
"0 878 3661 https://simple.wikipedia.org/wiki/Photon \n",
"1 2425 7796 https://simple.wikipedia.org/wiki/Thomas%20Dolby \n",
"2 18059 67912 https://simple.wikipedia.org/wiki/Embroidery \n",
"3 12045 44309 https://simple.wikipedia.org/wiki/Consecutive%20integer \n",
"4 11477 41741 https://simple.wikipedia.org/wiki/German%20Empire \n",
"\n",
" title \\\n",
"0 Photon \n",
"1 Thomas Dolby \n",
"2 Embroidery \n",
"3 Consecutive integer \n",
"4 German Empire \n",
"\n",
"                                                                                                                                                                text  \n",
"0  Photons (from Greek φως, meaning light), in many atomic models in physics, are particles which transmit light. In other words, light is carried over space by photons. Photon is an elementary particle that is its own antiparticle. In quantum mechanics each photon has a characteristic quantum of energy that depends on frequency: A photon associated with light at a higher frequency will have more energy (and be associated with light at a shorter wavelength).\\n\\nPhotons have a rest mass of 0 (zero). However, Einstein's theory of relativity says that they do have a certain amount of momentum. Before the photon got its name, Einstein revived the proposal that light is separate pieces of energy (particles). These particles came to be known as photons. \\n\\nA photon is usually given the symbol γ (gamma),\\n\\nProperties \\n\\nPhotons are fundamental particles. Although they can be created and destroyed, their lifetime is infinite.\\n\\nIn a vacuum, all photons move at the speed of light, c, which is equal to 299,792,458 meters (approximately 300,000 kilometers) per second.\\n\\nA photon has a given frequency, which determines its color. Radio technology makes great use of frequency. Beyond the visible range, frequency is less discussed, for example it is little used in distinguishing between X-Ray photons and infrared. Frequency is equivalent to the quantum energy of the photon, as related by the Planck constant equation,\\n\\n,\\n\\nwhere is the photon's energy, is the Plank constant, and is the frequency of the light associated with the photon. This frequency, , is typically measured in cycles per second, or equivalently, in Hz. The quantum energy of different photons is often used in cameras, and other machines that use visible and higher than visible radiation. This because these photons are energetic enough to ionize atoms. \\n\\nAnother property of a photon is its wavelength. The frequency , wavelength , and speed of light are related by the equation,\\n\\n,\\n\\nwhere (lambda) is the wavelength, or length of the wave (typically measured in meters.)\\n\\nAnother important property of a photon is its polarity. If you saw a giant photon coming straight at you, it could appear as a swath whipping vertically, horizontally, or somewhere in between. Polarized sunglasses stop photons swinging up and down from passing. This is how they reduce glare as light bouncing off of surfaces tend to fly that way. Liquid crystal displays also use polarity to control which light passes through. Some animals can see light polarization. \\n\\nFinally, a photon has a property called spin. Spin is related to light's circular polarization.\\n\\nPhoton interactions with matter\\nLight is often created or absorbed when an electron gains or loses energy. This energy can be in the form of heat, kinetic energy, or other form. For example, an incandescent light bulb uses heat. The increase of energy can push an electron up one level in a shell called a \"valence\". This makes it unstable, and like everything, it wants to be in the lowest energy state. (If being in the lowest energy state is confusing, pick up a pencil and drop it. Once on the ground, the pencil will be in a lower energy state). When the electron drops back down to a lower energy state, it needs to release the energy that hit it, and it must obey the conservation of energy (energy can neither be created nor destroyed). Electrons release this energy as photons, and at higher intensities, this photon can be seen as visible light.\\n\\nPhotons and the electromagnetic force\\nIn particle physics, photons are responsible for electromagnetic force. Electromagnetism is an idea that combines electricity with magnetism. One common way that we experience electromagnetism in our daily lives is light, which is caused by electromagnetism. Electromagnetism is also responsible for charge, which is the reason that you can not push your hand through a table. Since photons are the force-carrying particle of electromagnetism, they are also gauge bosons. Some mattercalled dar\n",
"1  Thomas Dolby (born Thomas Morgan Robertson; 14 October 1958) is a British musican and computer designer. He is probably most famous for his 1982 hit, \"She Blinded me with Science\".\\n\\nHe married actress Kathleen Beller in 1988. The couple have three children together.\\n\\nDiscography\\n\\nSingles\\n\\nA Track did not chart in North America until 1983, after the success of \"She Blinded Me With Science\".\\n\\nAlbums\\n\\nStudio albums\\n\\nEPs\\n\\nReferences\\n\\nEnglish musicians\\nLiving people\\n1958 births\\nNew wave musicians\\nWarner Bros. Records artists\n",
"2  Embroidery is the art of decorating fabric or other materials with designs stitched in strands of thread or yarn using a needle. Embroidery may also incorporate other materials such as metal strips, pearls, beads, quills, and sequins. Sewing machines can be used to create machine embroidery.\\n\\nQualifications \\nCity and Guilds qualification in Embroidery allows embroiderers to become recognized for their skill. This qualification also gives them the credibility to teach. For example, the notable textiles artist, Kathleen Laurel Sage, began her teaching career by getting the City and Guilds Embroidery 1 and 2 qualifications. She has now gone on to write a book on the subject.\\n\\nReferences\\n\\nOther websites\\n The Crimson Thread of Kinship at the National Museum of Australia\\n\\nNeedlework\n",
"3  Consecutive numbers are numbers that follow each other in order. They have a difference of 1 between every two numbers. In a set of consecutive numbers, the mean and the median are equal. \\n\\nIf n is a number, then the next numbers will be n+1 and n+2. \\n\\nExamples \\n\\nConsecutive numbers that follow each other in order:\\n\\n 1, 2, 3, 4, 5\\n -3, 2, 1, 0, 1, 2, 3, 4\\n 6, 7, 8, 9, 10, 11, 12, 13\\n\\nConsecutive even numbers \\nConsecutive even numbers are even numbers that follow each other. They have a difference of 2 between every two numbers.\\n\\nIf n is an even integer, then n, n+2, n+4 and n+6 will be consecutive even numbers.\\n\\nFor example - 2,4,6,8,10,12,14,18 etc.\\n\\nConsecutive odd numbers\\nConsecutive odd numbers are odd numbers that follow each other. Like consecutive odd numbers, they have a difference of 2 between every two numbers.\\n\\nIf n is an odd integer, then n, n+2, n+4 and n+6 will be consecutive odd numbers.\\n\\nExamples\\n\\n3, 5, 7, 9, 11, 13, etc.\\n\\n23, 21, 19, 17, 15, -13, -11\\n\\nIntegers\n",
"4  The German Empire (\"Deutsches Reich\" or \"Deutsches Kaiserreich\" in the German language) is the name for a group of German countries from January 18, 1871 to November 9, 1918. This is from the Unification of Germany when Wilhelm I of Prussia was made German Kaiser to when the third Emperor Wilhelm II was removed from power at the end of the First World War. In the 1920s, German nationalists started to call it the \"Second Reich\".\\n\\nThe name of Germany was \"Deutsches Reich\" until 1945. \"Reich\" can mean many things, empire, kingdom, state, \"richness\" or \"wealth\". Most members of the Empire were previously members of the North German Confederation. \\n\\nAt different times, there were three groups of smaller countries, each group was later called a \"Reich\" by some Germans. The first was the Holy Roman Empire. The second was the German Empire. The third was the Third Reich.\\n\\nThe words \"Second Reich\" were used for the German Empire by Arthur Moeller van den Bruck, a nationalist writer in the 1920s. He was trying to make a link with the earlier Holy Roman Empire which had once been very strong. Germany had lost First World War and was suffering big problems. van den Bruck wanted to start a \"Third Reich\" to unite the country. These words were later used by the Nazis to make themselves appear stronger.\\n\\nStates in the Empire\\n\\nRelated pages\\n Germany\\n Holy Roman Empire\\n Nazi Germany, or \"Drittes Reich\"\\n\\n1870s establishments in Germany\\n \\nStates and territories disestablished in the 20th century\\nStates and territories established in the 19th century\\n1871 establishments in Europe\\n1918 disestablishments in Germany  "
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"article_df = pd.read_csv('./wikipedia_articles_2000.csv')\n",
"article_df.head()"
]
},
{
"cell_type": "markdown",
"id": "ee8240c5",
"metadata": {},
"source": [
"## Storage\n",
"\n",
"We'll initialise our vector database first. Which database you choose and how you store data in it is a key decision point, and we've collated a few principles to aid your decision here:\n",
"\n",
"#### How much data to store\n",
"Consider how much metadata you want to include in the index. Metadata can be used to filter your queries or to bring back more information upon retrieval for your application to use, but larger indices will be slower, so there is a trade-off.\n",
"\n",
"There are two common design patterns here:\n",
"- **All-in-one:** Store your metadata with the vector embeddings so you perform semantic search and retrieval on the same database. This is easier to set up and run, but can run into scaling issues when your index grows.\n",
"- **Vectors only:** Store just the embeddings and any IDs/references needed to locate the metadata that goes with the vector in a different database or location. In this pattern the vector database is only used to locate the most relevant IDs, then those are looked up from a different database. This can be more scalable if your vector database is going to be extremely large, or if you have large volumes of metadata with each vector.\n",
"\n",
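"As a toy illustration of the vectors-only pattern, the vector index holds just IDs and embeddings, and results are hydrated from a separate metadata store. The stores and document IDs below are hypothetical, purely for illustration:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"# Hypothetical stand-ins: a vector index of id -> embedding, and a separate metadata store\n",
"vector_index = {'doc1': np.array([1.0, 0.0]), 'doc2': np.array([0.0, 1.0])}\n",
"metadata_store = {'doc1': {'title': 'Photon'}, 'doc2': {'title': 'Embroidery'}}  # e.g. a relational DB\n",
"\n",
"def retrieve(query_vector, top_k=1):\n",
"    # rank IDs by cosine similarity, then look the winners up in the metadata store\n",
"    sim = lambda v: float(v @ query_vector / (np.linalg.norm(v) * np.linalg.norm(query_vector)))\n",
"    ids = sorted(vector_index, key=lambda i: sim(vector_index[i]), reverse=True)[:top_k]\n",
"    return [metadata_store[i] for i in ids]\n",
"\n",
"retrieve(np.array([0.9, 0.1]))  # -> [{'title': 'Photon'}]\n",
"```\n",
"\n",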
"#### Which vector database to use\n",
"\n",
"The vector database market is wide and varied, so we won't recommend one over the other. For a few options you can review [this cookbook](./vector_databases/Using_vector_databases_for_embeddings_search.ipynb) and the sub-folders, which have examples supplied by many of the vector database providers in the market. \n",
"\n",
"We're going to use Redis as our database for both document contents and the vector embeddings. You will need the full Redis Stack to enable use of RediSearch, which is the module that allows semantic search - more detail is in the [docs for Redis Stack](https://redis.io/docs/stack/get-started/install/docker/).\n",
"\n",
"To set this up locally, you will need to:\n",
"- Install an appropriate version of [Docker](https://docs.docker.com/desktop/) for your OS\n",
"- Ensure Docker is running, e.g. by running ```docker run hello-world```\n",
"- Run the following command: ```docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest```.\n",
"\n",
"The code used here draws heavily on [this repo](https://github.com/RedisAI/vecsim-demo).\n",
"\n",
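"Beyond the required attributes used in this walkthrough, RediSearch's HNSW vector field also accepts optional tuning attributes. The values shown below are RediSearch's documented defaults, included only for illustration:\n",
"\n",
"```python\n",
"# Optional HNSW tuning attributes (illustrative defaults):\n",
"# \"M\": 16                  - max edges per node; higher improves recall but uses more memory\n",
"# \"EF_CONSTRUCTION\": 200   - candidate list size at build time; higher builds a better index, more slowly\n",
"# \"EF_RUNTIME\": 10         - candidate list size at query time; higher improves recall but slows queries\n",
"```\n",
"\n",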
"After setting up the Docker instance of Redis Stack, you can follow the instructions below to initiate a Redis connection and create a Hierarchical Navigable Small World (HNSW) index for semantic search."
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "fecba6de",
"metadata": {},
"outputs": [],
"source": [
"# Setup Redis\n",
"\n",
"\n",
"REDIS_HOST = 'localhost'\n",
"REDIS_PORT = '6379'\n",
"REDIS_DB = '0'\n",
"\n",
"redis_client = r(host=REDIS_HOST, port=REDIS_PORT, db=REDIS_DB, decode_responses=False)  # 'r' is the Redis client class imported during setup\n",
"\n",
"\n",
"# Constants\n",
"VECTOR_DIM = 1536 # length of the vectors\n",
"PREFIX = \"wiki\" # prefix for the document keys\n",
"DISTANCE_METRIC = \"COSINE\" # distance metric for the vectors (e.g. COSINE, IP, L2)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "4cb5247d",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Create search index\n",
"\n",
"# Index\n",
"INDEX_NAME = \"wiki-index\" # name of the search index\n",
"VECTOR_FIELD_NAME = 'content_vector'\n",
"\n",
"# Define RediSearch fields for each of the columns in the dataset\n",
"# This is where you should add any additional metadata you want to capture\n",
"id = TextField(\"id\")\n",
"url = TextField(\"url\")\n",
"title = TextField(\"title\")\n",
"text_chunk = TextField(\"content\")\n",
"file_chunk_index = NumericField(\"file_chunk_index\")\n",
"\n",
"# define RediSearch vector fields to use HNSW index\n",
"\n",
"text_embedding = VectorField(VECTOR_FIELD_NAME,\n",
" \"HNSW\", {\n",
" \"TYPE\": \"FLOAT32\",\n",
" \"DIM\": VECTOR_DIM,\n",
" \"DISTANCE_METRIC\": DISTANCE_METRIC\n",
" }\n",
")\n",
"# Add all our field objects to a list to be created as an index\n",
"fields = [url,title,text_chunk,file_chunk_index,text_embedding]\n",
"\n",
"redis_client.ping()"
]
},
{
"cell_type": "markdown",
"id": "33c07f00",
"metadata": {},
"source": [
"Optional step to drop the index if it already exists\n",
"\n",
"```redis_client.ft(INDEX_NAME).dropindex()```\n",
"\n",
"If you want to clear the whole DB use:\n",
"\n",
"```redis_client.flushall()```"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "08f30b56",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Unknown Index name\n",
"Not there yet. Creating\n"
]
}
],
"source": [
"# Check if index exists\n",
"try:\n",
" redis_client.ft(INDEX_NAME).info()\n",
" print(\"Index already exists\")\n",
"except Exception as e:\n",
" print(e)\n",
" # Create RediSearch Index\n",
" print('Not there yet. Creating')\n",
" redis_client.ft(INDEX_NAME).create_index(\n",
" fields = fields,\n",
" definition = IndexDefinition(prefix=[PREFIX], index_type=IndexType.HASH)\n",
" )"
]
},
{
"cell_type": "markdown",
"id": "7684b539",
"metadata": {},
"source": [
"### Data preparation\n",
"\n",
"The next step is to prepare your data. There are a few decisions to keep in mind here:\n",
"\n",
"#### Chunking your data\n",
"\n",
"In this context, \"chunking\" means cutting up the text into reasonable sizes so that the content will fit into the context length of the language model you choose. If your data is small enough or your LLM has a large enough context limit then you can proceed with no chunking, but in many cases you'll need to chunk your data. I'll share two main design patterns here:\n",
"- **Token-based:** Chunking your data based on some common token threshold i.e. 300, 500, 1000 depending on your use case. This approach works best with a grid-search evaluation to decide the optimal chunking logic over a set of evaluation questions. Variables to consider are whether chunks have overlaps, and whether you extend or truncate a section to keep full sentences and paragraphs together.\n",
"- **Deterministic:** Deterministic chunking uses some common delimiter, like a page break, paragraph end, section header etc. to chunk. This can work well if you have data of reasonable uniform structure, or if you can use GPT to help annotate the data first so you can guarantee common delimiters. However, it can be difficult to handle your chunks when you stuff them into the prompt given you need to cater for many different lengths of content, so consider that in your application design.\n",
"\n",
"#### Which vectors should you store\n",
"\n",
"It is critical to think through the user experience you're building towards because this will inform both the number and content of your vectors. Here are two example use cases that show how these can pan out:\n",
"- **Tool Manual Knowledge Base:** We have a database of manuals that our customers want to search over. For this use case, we want a vector to allow the user to identify the right manual, before searching a different set of vectors to interrogate the content of the manual to avoid any cross-pollination of similar content between different manuals. \n",
" - **Title Vector:** Could include title, author name, brand and abstract.\n",
" - **Content Vector:** Includes content only.\n",
"- **Investor Reports:** We have a database of investor reports that contain financial information about public companies. I want relevant snippets pulled out and summarised so I can decide how to invest. In this instance we want one set of content vectors, so that the retrieval can pull multiple entries on a company or industry, and summarise them to form a composite analysis.\n",
" - **Content Vector:** Includes content only, or content supplemented by other features that improve search quality such as author, industry etc.\n",
" \n",
"For this walkthrough we'll go with 1000 token-based chunking of text content with no overlap, and embed them with the article title included as a prefix."
]
},
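{
"cell_type": "markdown",
"id": "7c1d2e3f",
"metadata": {},
"source": [
"As a hedged illustration of the deterministic pattern described above (a sketch only - the function name and the paragraph-break delimiter are assumptions, and this helper is not used elsewhere in this notebook):\n",
"\n",
"```\n",
"def deterministic_chunks(text, delimiter='\\n\\n'):\n",
"    # Split on a structural delimiter (here, blank lines between paragraphs)\n",
"    # and drop empty fragments\n",
"    return [chunk.strip() for chunk in text.split(delimiter) if chunk.strip()]\n",
"```\n",
"\n",
"Note the resulting chunks can vary widely in length, which your prompt assembly then needs to handle."
]
},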
{
"cell_type": "code",
"execution_count": 10,
"id": "948225f7",
"metadata": {},
"outputs": [],
"source": [
"# We'll use 1000 token chunks with some intelligence to not split at the end of a sentence\n",
"TEXT_EMBEDDING_CHUNK_SIZE = 1000\n",
"EMBEDDINGS_MODEL = \"text-embedding-ada-002\""
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "31004582",
"metadata": {},
"outputs": [],
"source": [
"## Chunking Logic\n",
"\n",
"# Split a text into smaller chunks of size n, preferably ending at the end of a sentence\n",
"def chunks(text, n, tokenizer):\n",
" tokens = tokenizer.encode(text)\n",
" \"\"\"Yield successive n-sized chunks from text.\"\"\"\n",
" i = 0\n",
" while i < len(tokens):\n",
" # Find the nearest end of sentence within a range of 0.5 * n and 1.5 * n tokens\n",
" j = min(i + int(1.5 * n), len(tokens))\n",
" while j > i + int(0.5 * n):\n",
" # Decode the tokens and check for full stop or newline\n",
" chunk = tokenizer.decode(tokens[i:j])\n",
" if chunk.endswith(\".\") or chunk.endswith(\"\\n\"):\n",
" break\n",
" j -= 1\n",
" # If no end of sentence found, use n tokens as the chunk size\n",
" if j == i + int(0.5 * n):\n",
" j = min(i + n, len(tokens))\n",
" yield tokens[i:j]\n",
" i = j\n",
" \n",
"def get_unique_id_for_file_chunk(title, chunk_index):\n",
" return str(title+\"-!\"+str(chunk_index))\n",
"\n",
"def chunk_text(x,text_list):\n",
" url = x['url']\n",
" title = x['title']\n",
" file_body_string = x['text']\n",
" \n",
" \"\"\"Return a list of tuples (text_chunk, embedding) for a text.\"\"\"\n",
" token_chunks = list(chunks(file_body_string, TEXT_EMBEDDING_CHUNK_SIZE, tokenizer))\n",
" text_chunks = [f'Title: {title};\\n'+ tokenizer.decode(chunk) for chunk in token_chunks]\n",
" \n",
"    # Append one entry per chunk: a unique id plus metadata\n",
"    # Metadata is a dict with keys: url, title, content, file_chunk_index\n",
"    for i, text_chunk in enumerate(text_chunks):\n",
"        id = get_unique_id_for_file_chunk(title, i)\n",
"        text_list.append({'id': id,\n",
"                          'metadata': {\"url\": url,\n",
"                                       \"title\": title,\n",
"                                       \"content\": text_chunk,\n",
"                                       \"file_chunk_index\": i}})"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "4f4cf064",
"metadata": {},
"outputs": [],
"source": [
"## Batch Embedding Logic\n",
"\n",
"# Simple function to take in a list of text objects and return them as a list of embeddings\n",
"@retry(wait=wait_random_exponential(min=1, max=40), stop=stop_after_attempt(10))\n",
"def get_embeddings(input: List):\n",
" response = openai.Embedding.create(\n",
" input=input,\n",
" model=EMBEDDINGS_MODEL,\n",
" )[\"data\"]\n",
" return [data[\"embedding\"] for data in response]\n",
"\n",
"def batchify(iterable, n=1):\n",
" l = len(iterable)\n",
" for ndx in range(0, l, n):\n",
" yield iterable[ndx : min(ndx + n, l)]\n",
"\n",
"# Function for batching and parallel processing the embeddings\n",
"def embed_corpus(\n",
" corpus: List[str],\n",
" batch_size=64,\n",
" num_workers=8,\n",
" max_context_len=8191,\n",
"):\n",
"\n",
" # Encode the corpus, truncating to max_context_len\n",
" encoding = tiktoken.get_encoding(\"cl100k_base\")\n",
" encoded_corpus = [\n",
" encoded_article[:max_context_len] for encoded_article in encoding.encode_batch(corpus)\n",
" ]\n",
"\n",
" # Calculate corpus statistics: the number of inputs, the total number of tokens, and the estimated cost to embed\n",
" num_tokens = sum(len(article) for article in encoded_corpus)\n",
" cost_to_embed_tokens = num_tokens / 1_000 * 0.0004\n",
" print(\n",
" f\"num_articles={len(encoded_corpus)}, num_tokens={num_tokens}, est_embedding_cost={cost_to_embed_tokens:.2f} USD\"\n",
" )\n",
"\n",
" # Embed the corpus\n",
" with concurrent.futures.ThreadPoolExecutor(max_workers=num_workers) as executor:\n",
" \n",
" futures = [\n",
" executor.submit(get_embeddings, text_batch)\n",
" for text_batch in batchify(encoded_corpus, batch_size)\n",
" ]\n",
"\n",
" with tqdm(total=len(encoded_corpus)) as pbar:\n",
" for _ in concurrent.futures.as_completed(futures):\n",
" pbar.update(batch_size)\n",
"\n",
" embeddings = []\n",
" for future in futures:\n",
" data = future.result()\n",
" embeddings.extend(data)\n",
"\n",
" return embeddings"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "dfeff174",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CPU times: user 1.04 s, sys: 131 ms, total: 1.17 s\n",
"Wall time: 1.2 s\n"
]
}
],
"source": [
"%%time\n",
"# Initialise tokenizer\n",
"tokenizer = tiktoken.get_encoding(\"cl100k_base\")\n",
"\n",
"# List to hold vectors\n",
"text_list = []\n",
"\n",
"# Process each PDF file and prepare for embedding\n",
"x = article_df.apply(lambda x: chunk_text(x, text_list),axis = 1)"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "6f259e2d",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'id': 'Photon-!0',\n",
" 'metadata': {'url': 'https://simple.wikipedia.org/wiki/Photon',\n",
" 'title': 'Photon',\n",
" 'content': 'Title: Photon;\\nPhotons (from Greek φως, meaning light), in many atomic models in physics, are particles which transmit light. In other words, light is carried over space by photons. Photon is an elementary particle that is its own antiparticle. In quantum mechanics each photon has a characteristic quantum of energy that depends on frequency: A photon associated with light at a higher frequency will have more energy (and be associated with light at a shorter wavelength).\\n\\nPhotons have a rest mass of 0 (zero). However, Einstein\\'s theory of relativity says that they do have a certain amount of momentum. Before the photon got its name, Einstein revived the proposal that light is separate pieces of energy (particles). These particles came to be known as photons. \\n\\nA photon is usually given the symbol γ (gamma),\\n\\nProperties \\n\\nPhotons are fundamental particles. Although they can be created and destroyed, their lifetime is infinite.\\n\\nIn a vacuum, all photons move at the speed of light, c, which is equal to 299,792,458 meters (approximately 300,000 kilometers) per second.\\n\\nA photon has a given frequency, which determines its color. Radio technology makes great use of frequency. Beyond the visible range, frequency is less discussed, for example it is little used in distinguishing between X-Ray photons and infrared. Frequency is equivalent to the quantum energy of the photon, as related by the Planck constant equation,\\n\\n,\\n\\nwhere is the photon\\'s energy, is the Plank constant, and is the frequency of the light associated with the photon. This frequency, , is typically measured in cycles per second, or equivalently, in Hz. The quantum energy of different photons is often used in cameras, and other machines that use visible and higher than visible radiation. This because these photons are energetic enough to ionize atoms. \\n\\nAnother property of a photon is its wavelength. 
The frequency , wavelength , and speed of light are related by the equation,\\n\\n,\\n\\nwhere (lambda) is the wavelength, or length of the wave (typically measured in meters.)\\n\\nAnother important property of a photon is its polarity. If you saw a giant photon coming straight at you, it could appear as a swath whipping vertically, horizontally, or somewhere in between. Polarized sunglasses stop photons swinging up and down from passing. This is how they reduce glare as light bouncing off of surfaces tend to fly that way. Liquid crystal displays also use polarity to control which light passes through. Some animals can see light polarization. \\n\\nFinally, a photon has a property called spin. Spin is related to light\\'s circular polarization.\\n\\nPhoton interactions with matter\\nLight is often created or absorbed when an electron gains or loses energy. This energy can be in the form of heat, kinetic energy, or other form. For example, an incandescent light bulb uses heat. The increase of energy can push an electron up one level in a shell called a \"valence\". This makes it unstable, and like everything, it wants to be in the lowest energy state. (If being in the lowest energy state is confusing, pick up a pencil and drop it. Once on the ground, the pencil will be in a lower energy state). When the electron drops back down to a lower energy state, it needs to release the energy that hit it, and it must obey the conservation of energy (energy can neither be created nor destroyed). Electrons release this energy as photons, and at higher intensities, this photon can be seen as visible light.\\n\\nPhotons and the electromagnetic force\\nIn particle physics, photons are responsible for electromagnetic force. Electromagnetism is an idea that combines electricity with magnetism. One common way that we experience electromagnetism in our daily lives is light, which is caused by electromagnetism. 
Electromagnetism is also responsible for charge, which is the reason that you can not push your hand through a table. Since photons are the force-carrying particle of electromagnetism, they are also gaug
" 'file_chunk_index': 0}}"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"text_list[0]"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "4fd503a0",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"num_articles=2693, num_tokens=1046988, est_embedding_cost=0.42 USD\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"2752it [00:10, 271.48it/s] \n"
]
}
],
"source": [
"# Batch embed our chunked text - this will cost you about $0.50\n",
"embeddings = embed_corpus([text[\"metadata\"]['content'] for text in text_list])"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "49f75d78",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'id': 'Photon-!0',\n",
" 'metadata': {'url': 'https://simple.wikipedia.org/wiki/Photon',\n",
" 'title': 'Photon',\n",
" 'content': 'Title: Photon;\\nPhotons (from Greek φως, meaning light), in many atomic models in physics, are particles which transmit light. In other words, light is carried over space by photons. Photon is an elementary particle that is its own antiparticle. In quantum mechanics each photon has a characteristic quantum of energy that depends on frequency: A photon associated with light at a higher frequency will have more energy (and be associated with light at a shorter wavelength).\\n\\nPhotons have a rest mass of 0 (zero). However, Einstein\\'s theory of relativity says that they do have a certain amount of momentum. Before the photon got its name, Einstein revived the proposal that light is separate pieces of energy (particles). These particles came to be known as photons. \\n\\nA photon is usually given the symbol γ (gamma),\\n\\nProperties \\n\\nPhotons are fundamental particles. Although they can be created and destroyed, their lifetime is infinite.\\n\\nIn a vacuum, all photons move at the speed of light, c, which is equal to 299,792,458 meters (approximately 300,000 kilometers) per second.\\n\\nA photon has a given frequency, which determines its color. Radio technology makes great use of frequency. Beyond the visible range, frequency is less discussed, for example it is little used in distinguishing between X-Ray photons and infrared. Frequency is equivalent to the quantum energy of the photon, as related by the Planck constant equation,\\n\\n,\\n\\nwhere is the photon\\'s energy, is the Plank constant, and is the frequency of the light associated with the photon. This frequency, , is typically measured in cycles per second, or equivalently, in Hz. The quantum energy of different photons is often used in cameras, and other machines that use visible and higher than visible radiation. This because these photons are energetic enough to ionize atoms. \\n\\nAnother property of a photon is its wavelength. 
The frequency , wavelength , and speed of light are related by the equation,\\n\\n,\\n\\nwhere (lambda) is the wavelength, or length of the wave (typically measured in meters.)\\n\\nAnother important property of a photon is its polarity. If you saw a giant photon coming straight at you, it could appear as a swath whipping vertically, horizontally, or somewhere in between. Polarized sunglasses stop photons swinging up and down from passing. This is how they reduce glare as light bouncing off of surfaces tend to fly that way. Liquid crystal displays also use polarity to control which light passes through. Some animals can see light polarization. \\n\\nFinally, a photon has a property called spin. Spin is related to light\\'s circular polarization.\\n\\nPhoton interactions with matter\\nLight is often created or absorbed when an electron gains or loses energy. This energy can be in the form of heat, kinetic energy, or other form. For example, an incandescent light bulb uses heat. The increase of energy can push an electron up one level in a shell called a \"valence\". This makes it unstable, and like everything, it wants to be in the lowest energy state. (If being in the lowest energy state is confusing, pick up a pencil and drop it. Once on the ground, the pencil will be in a lower energy state). When the electron drops back down to a lower energy state, it needs to release the energy that hit it, and it must obey the conservation of energy (energy can neither be created nor destroyed). Electrons release this energy as photons, and at higher intensities, this photon can be seen as visible light.\\n\\nPhotons and the electromagnetic force\\nIn particle physics, photons are responsible for electromagnetic force. Electromagnetism is an idea that combines electricity with magnetism. One common way that we experience electromagnetism in our daily lives is light, which is caused by electromagnetism. 
Electromagnetism is also responsible for charge, which is the reason that you can not push your hand through a table. Since photons are the force-carrying particle of electromagnetism, they are also gaug
" 'file_chunk_index': 0},\n",
" 'embedding': [0.006123070605099201,\n",
" 0.0076594278216362,\n",
" -0.012744419276714325,\n",
" ...]}"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Join up embeddings with our original list\n",
"embeddings_list = [{\"embedding\": v} for v in embeddings]\n",
"for i,x in enumerate(embeddings_list):\n",
" text_list[i].update(x)\n",
"text_list[0]"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "932818e9",
"metadata": {},
"outputs": [],
"source": [
"# Create a Redis pipeline to load all the vectors and their metadata\n",
"def load_vectors(client:r, input_list, vector_field_name):\n",
" p = client.pipeline(transaction=False)\n",
" for text in input_list: \n",
" #hash key\n",
" key=f\"{PREFIX}:{text['id']}\"\n",
" \n",
"        # hash values\n",
"        item_metadata = text['metadata']\n",
"        item_keywords_vector = np.array(text['embedding'], dtype='float32').tobytes()\n",
"        item_metadata[vector_field_name] = item_keywords_vector\n",
" \n",
" # HSET\n",
" p.hset(key,mapping=item_metadata)\n",
" \n",
" p.execute()"
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "b218a207",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"100%|████████████████████████████████████████████████████████████████████| 27/27 [00:07<00:00, 3.40it/s]\n"
]
}
],
"source": [
"batch_size = 100 # how many vectors we insert at once\n",
"\n",
"for i in tqdm(range(0, len(text_list), batch_size)):\n",
" # find end of batch\n",
" i_end = min(len(text_list), i+batch_size)\n",
" meta_batch = text_list[i:i_end]\n",
" \n",
" load_vectors(redis_client,meta_batch,vector_field_name=VECTOR_FIELD_NAME)"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "d3466f7d",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'2693'"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"redis_client.ft(INDEX_NAME).info()['num_docs']"
]
},
{
"cell_type": "markdown",
"id": "43839177",
"metadata": {},
"source": [
"### Search\n",
"\n",
"We can now use our knowledge base to bring back search results. This is one of the areas of highest friction in enterprise knowledge retrieval use cases, with the most common being that the system is not retrieving what you intuitively think are the most relevant documents. There are a few ways of tackling this - I'll share a few options here, as well as some resources to take your research further:\n",
"\n",
"#### Vector search, keyword search or a hybrid\n",
"\n",
"Despite the strong capabilities out of the box that vector search gives, search is still not a solved problem, and there are well proven [Lucene-based](https://en.wikipedia.org/wiki/Apache_Lucene) search solutions such Elasticsearch and Solr that use methods that work well for certain use cases, as well as the sparse vector methods of traditional NLP such as [TF-IDF](https://en.wikipedia.org/wiki/Tf%E2%80%93idf). If your retrieval is poor, the answer may be one of these in particular, or a combination:\n",
"- **Vector search:** Converts your text into vector embeddings which can be searched using KNN, SVM or some other model to return the most relevant results. This is the approach we take in this workbook, using a RediSearch vector DB which employs a KNN search under the hood.\n",
"- **Keyword search:** This method uses any keyword-based search approach to return a score - it could use Elasticsearch/Solr out-of-the-box, or a TF-IDF approach like BM25.\n",
"- **Hybrid search:** This last approach is a mix of the two, where you produce both a vector search and keyword search result, before using an ```alpha``` between 0 and 1 to weight the outputs. There is a great example of this explained by the Weaviate team [here](https://weaviate.io/blog/hybrid-search-explained).\n",
"\n",
"#### Hypothetical Document Embeddings (HyDE)\n",
"\n",
"This is a novel approach from [this paper](https://arxiv.org/abs/2212.10496), which states that a hypothetical answer to a question is more semantically similar to the real answer than the question is. In practice this means that your search would use GPT to generate a hypothetical answer, then embed that and use it for search. I've seen success with this both as a pure search, and as a retry step if the initial retrieval fails to retrieve relevant content. A simple example implementation is here:\n",
"```\n",
"def answer_question_hyde(question,prompt):\n",
" \n",
"    hyde_prompt = '''You are OracleGPT, a helpful expert who answers user questions to the best of your ability.\n",
"    Provide a confident answer to their question. If you don't know the answer, make the best guess you can based on the context of the question.\n",
"\n",
" User question: USER_QUESTION_HERE\n",
" \n",
" Answer:'''\n",
" \n",
" hypothetical_answer = openai.Completion.create(model=COMPLETIONS_MODEL,prompt=hyde_prompt.replace('USER_QUESTION_HERE',question))['choices'][0]['text']\n",
" \n",
"    search_results = get_redis_results(redis_client, hypothetical_answer, index_name=INDEX_NAME)\n",
" \n",
" return search_results\n",
"```\n",
"\n",
"#### Fine-tuning embeddings\n",
"\n",
"This next approach leverages the learning you gain from real question/answer pairs that your users will generate during the evaluation approach. It works by:\n",
"- Creating a dataset of positive (and optionally negative) question and answer pairs. Positive examples would be a correct retrieval to a question, while negative would be poor retrievals.\n",
"- Calculating the embeddings for both questions and answers and the cosine similarity between them.\n",
"- Train a model to optimize the embeddings matrix and test retrieval, picking the best one.\n",
"- Perform a matrix multiplication of the base Ada embeddings by this new best matrix, creating a new fine-tuned embedding to do for retrieval.\n",
"\n",
"There is a great walkthrough of both the approach and the code to perform it in [this cookbook](./Customizing_embeddings.ipynb).\n",
"\n",
"#### Reranking\n",
"\n",
"One other well-proven method from traditional search solutions that can be applied to any of the above approaches is reranking, where we over-fetch our search results, and then deterministically rerank based on a modifier or set of modifiers.\n",
"\n",
"An example is investor reports again - it is highly likely that if we have 3 reports on Apple, we'll want to make our investment decisions based on the latest one. In this instance a ```recency``` modifier could be applied to the vector scores to sort them, giving us the latest one on the top even if it is not the most semantically similar to our search question. "
]
},
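{
"cell_type": "markdown",
"id": "9e8f7a6b",
"metadata": {},
"source": [
"A hedged sketch of that recency modifier (the field names and the linear penalty are illustrative assumptions - in practice you would tune the modifier against your evaluation set):\n",
"\n",
"```\n",
"def rerank_by_recency(results, weight=0.1):\n",
"    # results: list of dicts with a semantic 'score' (higher is better)\n",
"    # and an 'age_in_years'; penalise older documents linearly\n",
"    for result in results:\n",
"        result['adjusted_score'] = result['score'] - weight * result['age_in_years']\n",
"    return sorted(results, key=lambda result: result['adjusted_score'], reverse=True)\n",
"```"
]
},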
{
"cell_type": "markdown",
"id": "9b2fdc7a",
"metadata": {},
"source": [
"For this walkthrough we'll stick with a basic semantic search bringing back the top 5 chunks for a user question, and providing a summarised response using GPT."
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "89da0c45",
"metadata": {},
"outputs": [],
"source": [
"# Make query to Redis\n",
"def query_redis(redis_conn,query,index_name, top_k=5):\n",
" \n",
" \n",
"\n",
" ## Creates embedding vector from user query\n",
" embedded_query = np.array(openai.Embedding.create(\n",
" input=query,\n",
" model=EMBEDDINGS_MODEL,\n",
" )[\"data\"][0]['embedding'], dtype=np.float32).tobytes()\n",
"\n",
" #prepare the query\n",
" q = Query(f'*=>[KNN {top_k} @{VECTOR_FIELD_NAME} $vec_param AS vector_score]').sort_by('vector_score').paging(0,top_k).return_fields('vector_score','url','title','content','text_chunk_index').dialect(2) \n",
" params_dict = {\"vec_param\": embedded_query}\n",
"\n",
" \n",
" #Execute the query\n",
" results = redis_conn.ft(index_name).search(q, query_params = params_dict)\n",
" \n",
" return results\n",
"\n",
"# Get mapped documents from Redis results\n",
"def get_redis_results(redis_conn,query,index_name):\n",
" \n",
" # Get most relevant documents from Redis\n",
" query_result = query_redis(redis_conn,query,index_name)\n",
" \n",
" # Extract info into a list\n",
" query_result_list = []\n",
" for i, result in enumerate(query_result.docs):\n",
" result_order = i\n",
" url = result.url\n",
" title = result.title\n",
" text = result.content\n",
" score = result.vector_score\n",
" query_result_list.append((result_order,url,title,text,score))\n",
" \n",
"    # Display result as a DataFrame for ease of use\n",
" result_df = pd.DataFrame(query_result_list)\n",
" result_df.columns = ['id','url','title','result','certainty']\n",
" return result_df"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "f0161a54",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CPU times: user 7.1 ms, sys: 2.35 ms, total: 9.45 ms\n",
"Wall time: 495 ms\n"
]
},
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>id</th>\n",
" <th>url</th>\n",
" <th>title</th>\n",
" <th>result</th>\n",
" <th>certainty</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>0</td>\n",
" <td>https://simple.wikipedia.org/wiki/Thomas%20Dolby</td>\n",
" <td>Thomas Dolby</td>\n",
" <td>Title: Thomas Dolby;\\nThomas Dolby (born Thomas Morgan Robertson; 14 October 1958) is a British musican and computer designer. He is probably most famous for his 1982 hit, \"She Blinded me with Science\".\\n\\nHe married actress Kathleen Beller in 1988. The couple have three children together.\\n\\nDiscography\\n\\nSingles\\n\\nA Track did not chart in North America until 1983, after the success of \"She Blinded Me With Science\".\\n\\nAlbums\\n\\nStudio albums\\n\\nEPs\\n\\nReferences\\n\\nEnglish musicians\\nLiving people\\n1958 births\\nNew wave musicians\\nWarner Bros. Records artists</td>\n",
" <td>0.132723689079</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>1</td>\n",
" <td>https://simple.wikipedia.org/wiki/Synthesizer</td>\n",
" <td>Synthesizer</td>\n",
" <td>Title: Synthesizer;\\nAudio technology</td>\n",
" <td>0.223129153252</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" id url title \\\n",
"0 0 https://simple.wikipedia.org/wiki/Thomas%20Dolby Thomas Dolby \n",
"1 1 https://simple.wikipedia.org/wiki/Synthesizer Synthesizer \n",
"\n",
" result \\\n",
"0 Title: Thomas Dolby;\\nThomas Dolby (born Thomas Morgan Robertson; 14 October 1958) is a British musican and computer designer. He is probably most famous for his 1982 hit, \"She Blinded me with Science\".\\n\\nHe married actress Kathleen Beller in 1988. The couple have three children together.\\n\\nDiscography\\n\\nSingles\\n\\nA Track did not chart in North America until 1983, after the success of \"She Blinded Me With Science\".\\n\\nAlbums\\n\\nStudio albums\\n\\nEPs\\n\\nReferences\\n\\nEnglish musicians\\nLiving people\\n1958 births\\nNew wave musicians\\nWarner Bros. Records artists \n",
"1 Title: Synthesizer;\\nAudio technology \n",
"\n",
" certainty \n",
"0 0.132723689079 \n",
"1 0.223129153252 "
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"%%time\n",
"\n",
"wiki_query='What is Thomas Dolby known for?'\n",
"\n",
"result_df = get_redis_results(redis_client,wiki_query,index_name=INDEX_NAME)\n",
"result_df.head(2)"
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "48d136b0",
"metadata": {},
"outputs": [],
"source": [
"# Build a prompt to provide the original query, the result and ask to summarise for the user\n",
"retrieval_prompt = '''Use the content to answer the search query the customer has sent.\n",
"If you can't answer the user's question, say \"Sorry, I am unable to answer the question with the content\". Do not guess.\n",
"\n",
"Search query: \n",
"\n",
"SEARCH_QUERY_HERE\n",
"\n",
"Content: \n",
"\n",
"SEARCH_CONTENT_HERE\n",
"\n",
"Answer:\n",
"'''\n",
"\n",
"def answer_user_question(query):\n",
" \n",
" results = get_redis_results(redis_client,query,INDEX_NAME)\n",
" \n",
" retrieval_prepped = retrieval_prompt.replace('SEARCH_QUERY_HERE',query).replace('SEARCH_CONTENT_HERE',results['result'][0])\n",
" retrieval = openai.ChatCompletion.create(model=CHAT_MODEL,messages=[{'role':\"user\",'content': retrieval_prepped}],max_tokens=500)\n",
" \n",
" # Response provided by GPT-3.5\n",
" return retrieval['choices'][0]['message']['content']"
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "06f6e6ed",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Thomas Dolby is known for his music, particularly his 1982 hit \"She Blinded Me With Science\". He is also a computer designer.\n"
]
}
],
"source": [
"print(answer_user_question(wiki_query))"
]
},
{
"cell_type": "markdown",
"id": "70e9b239",
"metadata": {},
"source": [
"### Answer\n",
"\n",
"We've now created a knowledge base that can answer user questions on Wikipedia. However, the user experience could be better, and this is where the Answer layer comes in, where an LLM Agent is used to interact with the user.\n",
"\n",
"There are different level of complexity in building a knowledge retrieval experience leveraging an LLM; there is an experience vs. effort trade-off to consider when selecting the right type of interaction. There are many patterns, but I'll highlight a few of the most common here:\n",
"\n",
"#### Choosing the user experience and architecture\n",
"\n",
"There are different level of complexity in building a knowledge retrieval experience leveraging an LLM; there is an experience vs. effort trade-off to consider when selecting the right type of interaction. There are many patterns, but I'll highlight a few of the most common here:\n",
"- **Q&A:** Your classic search engine use case, where the user inputs a question and your LLM gives them an answer either using its knowledge or, much more commonly, using a knowledge base that you prepare using the steps we've covered already. This simple use case assumes no memory of past queries is required, and no ability to clarify with the human or ask for more information.\n",
"- **Chat:** I think of Chat as being Q&A + memory - this is a slightly more sophisticated interaction where the LLM remembers what was previously asked and can delve deeper on something already covered.\n",
"- **Agent:** The most sophisticated is what LangChain calls an Agent, they leverage large language models to process and produce human-like results through a variety of tools, and will chain queries together dynamically until it has an answer that the LLM feels is appropriate to answer the user's question. However, for every \"turn\" you allow between Agent and user you increase the risks of loss of context, hallucination, or parsing errors, so be clear about the exact requirements your users have before embarking on building the Answer layer.\n",
"\n",
"Q&A use cases are the simplest to implement, while Agents can give the most sophisticated user experience - in this notebook we'll build an Agent with memory and a single Tool to give an appreciation for the flexibilty prompt chaining gives you in getting a more complete answer for your users.\n",
"\n",
"#### Ensuring reliability\n",
"\n",
"The more complexity you add, the more chance your LLM will fail to respond correctly, or a response will come back in the wrong format and break your Answer pipeline. We'll share a few methods our customers have used elsewhere to help \"channel\" the Agent down a more deterministic path, and to deal with issues when they do crop up:\n",
"- **Prompt chaining:** Prompting the model to take a step-by-step approach and think aloud using a scratchpad has been proven to deliver more consistent results. It also means that as a developer you can break up one complex prompt into many simpler, more deterministic prompts, with the output of one prompt becoming the input for the next. This approach is known as Chain-of-Thought (CoT) reasoning - I'd suggest digging deeper as this is a dynamic new area of research, with a few of the key papers referenced here:\n",
" - Chain of thought prompting [paper](https://arxiv.org/abs/2201.11903)\n",
" - Self-reflecting agent [paper](https://arxiv.org/abs/2303.11366)\n",
"- **Self-referencing:** You can return references for the LLM's answer through either your application logic, or by prompt engineering it to return references. I would generally suggest doing it in your application logic, although if you have multiple chunks then a hybrid approach where you ask the LLM to return the key of the chunk it used could be advisable. I view this as a UX opportunity, where for many search use cases giving the \"raw\" output of the chunks retrieved as well as the summarised answer can give the user the best of both worlds, but please go with whatever is most appropriate for your users.\n",
"- **Discriminator models:** The best control for unwanted outputs is undoubtably through preventing it from happening with prompt engineering, prompt chaining and retrieval. However, when all these fail then a discriminator model is a useful detective control. This is a classifier trained on past unwanted outputs, that flags the Agent's response to the user as Safe or Not, enabling you to perform some business logic to either retry, pass to a human, or say it doesn't know. \n",
" - There is an example in our [Help Center](https://help.openai.com/en/articles/5528730-fine-tuning-a-classifier-to-improve-truthfulness).\n",
"\n",
"This is a dynamic topic that has still not consolidated to a clear design that works best above all others, so for ease of implementation we will use LangChain, which supplies a framework with implementations for most of the concepts we've discussed above.\n",
"\n",
"We'll create an Agent with access to our knowledge base, give it a prompt template and a custom parser for extracting the answers, set up a prompt chain and then let it answer our Wikipedia questions.\n",
"\n",
"Our work here draws heavily on LangChain's great documentation, in particular [this guide](https://python.langchain.com/en/latest/modules/agents/agents/custom_llm_chat_agent.html)."
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "0ccca3da",
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import Tool, AgentExecutor, LLMSingleActionAgent, AgentOutputParser\n",
"from langchain.prompts import BaseChatPromptTemplate\n",
"from langchain import SerpAPIWrapper, LLMChain\n",
"from langchain.chat_models import ChatOpenAI\n",
"from typing import List, Union\n",
"from langchain.schema import AgentAction, AgentFinish, HumanMessage\n",
"from langchain.memory import ConversationBufferWindowMemory\n",
"import re"
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "63455a8e",
"metadata": {},
"outputs": [],
"source": [
"def ask_gpt(query):\n",
" response = openai.ChatCompletion.create(model=CHAT_MODEL,messages=[{\"role\":\"user\",\"content\":\"Please answer my question.\\nQuestion: {}\".format(query)}],temperature=0)\n",
" return response['choices'][0]['message']['content']"
]
},
{
"cell_type": "code",
"execution_count": 26,
"id": "68a0b8dd",
"metadata": {},
"outputs": [],
"source": [
"# Define which tools the agent can use to answer user queries\n",
"tools = [\n",
" Tool(\n",
" name = \"Search\",\n",
" func=answer_user_question,\n",
" description=\"Useful for when you need to answer general knowledge questions. Input should be a fully formed question.\"\n",
" ),\n",
" Tool(\n",
" name = \"Knowledge\",\n",
" func = ask_gpt,\n",
" description = \"Useful for any other questions. Input should be a fully formed question.\"\n",
" )\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 27,
"id": "39e101ee",
"metadata": {},
"outputs": [],
"source": [
"# Set up the base template\n",
"template = \"\"\"You are WikiGPT, a helpful bot who answers question using your tools or your own knowledge.\n",
"You have access to the following tools::\n",
"\n",
"{tools}\n",
"\n",
"Use the following format:\n",
"\n",
"Question: the input question you must answer\n",
"Thought: you should always think about what to do\n",
"Action: the action to take, should be one of [{tool_names}]\n",
"Action Input: the input to the action\n",
"Observation: the result of the action\n",
"... (this Thought/Action/Action Input/Observation can repeat N times)\n",
"Thought: I now know the final answer\n",
"Final Answer: the final answer to the original input question\n",
"\n",
"Begin!\n",
"\n",
"Previous conversation history:\n",
"{history}\n",
"\n",
"New question: {input}\n",
"{agent_scratchpad}\"\"\""
]
},
{
"cell_type": "code",
"execution_count": 28,
"id": "a2b9f271",
"metadata": {},
"outputs": [],
"source": [
"# Set up a prompt template\n",
"class CustomPromptTemplate(BaseChatPromptTemplate):\n",
" # The template to use\n",
" template: str\n",
" # The list of tools available\n",
" tools: List[Tool]\n",
" \n",
" def format_messages(self, **kwargs) -> str:\n",
" # Get the intermediate steps (AgentAction, Observation tuples)\n",
" # Format them in a particular way\n",
" intermediate_steps = kwargs.pop(\"intermediate_steps\")\n",
" thoughts = \"\"\n",
" for action, observation in intermediate_steps:\n",
" thoughts += action.log\n",
" thoughts += f\"\\nObservation: {observation}\\nThought: \"\n",
" # Set the agent_scratchpad variable to that value\n",
" kwargs[\"agent_scratchpad\"] = thoughts\n",
" # Create a tools variable from the list of tools provided\n",
" kwargs[\"tools\"] = \"\\n\".join([f\"{tool.name}: {tool.description}\" for tool in self.tools])\n",
" # Create a list of tool names for the tools provided\n",
" kwargs[\"tool_names\"] = \", \".join([tool.name for tool in self.tools])\n",
" formatted = self.template.format(**kwargs)\n",
" return [HumanMessage(content=formatted)]\n",
" \n",
" \n",
"class CustomOutputParser(AgentOutputParser):\n",
" \n",
" def parse(self, llm_output: str) -> Union[AgentAction, AgentFinish]:\n",
" # Check if agent should finish\n",
" if \"Final Answer:\" in llm_output:\n",
" return AgentFinish(\n",
" # Return values is generally always a dictionary with a single `output` key\n",
" # It is not recommended to try anything else at the moment :)\n",
" return_values={\"output\": llm_output.split(\"Final Answer:\")[-1].strip()},\n",
" log=llm_output,\n",
" )\n",
" # Parse out the action and action input\n",
" regex = r\"Action\\s*\\d*\\s*:(.*?)\\nAction\\s*\\d*\\s*Input\\s*\\d*\\s*:[\\s]*(.*)\"\n",
" match = re.search(regex, llm_output, re.DOTALL)\n",
" if not match:\n",
" raise ValueError(f\"Could not parse LLM output: `{llm_output}`\")\n",
" action = match.group(1).strip()\n",
" action_input = match.group(2)\n",
" # Return the action and action input\n",
" return AgentAction(tool=action, tool_input=action_input.strip(\" \").strip('\"'), log=llm_output)"
]
},
{
"cell_type": "code",
"execution_count": 29,
"id": "454c3ca9",
"metadata": {},
"outputs": [],
"source": [
"prompt = CustomPromptTemplate(\n",
" template=template,\n",
" tools=tools,\n",
" # This omits the `agent_scratchpad`, `tools`, and `tool_names` variables because those are generated dynamically\n",
" # The history template includes \"history\" as an input variable so we can interpolate it into the prompt\n",
" input_variables=[\"input\", \"intermediate_steps\", \"history\"]\n",
")\n",
"\n",
"# Initiate the memory with k=2 to keep the last two turns\n",
"# Provide the memory to the agent\n",
"memory = ConversationBufferWindowMemory(k=2)"
]
},
{
"cell_type": "code",
"execution_count": 30,
"id": "34de07d2",
"metadata": {},
"outputs": [],
"source": [
"output_parser = CustomOutputParser()\n",
"\n",
"llm = ChatOpenAI(temperature=0)\n",
"\n",
"# LLM chain consisting of the LLM and a prompt\n",
"llm_chain = LLMChain(llm=llm, prompt=prompt)\n",
"\n",
"tool_names = [tool.name for tool in tools]\n",
"agent = LLMSingleActionAgent(\n",
" llm_chain=llm_chain, \n",
" output_parser=output_parser,\n",
" stop=[\"\\nObservation:\"], \n",
" allowed_tools=tool_names\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 31,
"id": "d3603d58",
"metadata": {},
"outputs": [],
"source": [
"agent_executor = AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=True, memory=memory)"
]
},
{
"cell_type": "code",
"execution_count": 32,
"id": "6bfb594b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
"\u001b[32;1m\u001b[1;3mThought: I'm not sure who Thomas Dolby is, I should probably search for more information.\n",
"Action: Search\n",
"Action Input: \"What is Thomas Dolby known for?\"\u001b[0m\n",
"\n",
"Observation:\u001b[36;1m\u001b[1;3mThomas Dolby is known for being a British musician and computer designer, and his 1982 hit \"She Blinded Me With Science\".\u001b[0m\u001b[32;1m\u001b[1;3mNow that I know who Thomas Dolby is, I can answer the question.\n",
"Final Answer: Thomas Dolby is known for being a British musician and computer designer, and his 1982 hit \"She Blinded Me With Science\".\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"'Thomas Dolby is known for being a British musician and computer designer, and his 1982 hit \"She Blinded Me With Science\".'"
]
},
"execution_count": 32,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"agent_executor.run(wiki_query)"
]
},
{
"cell_type": "code",
"execution_count": 33,
"id": "ba65b7e3",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
"\u001b[32;1m\u001b[1;3mThought: This is a simple math question.\n",
"Action: Knowledge\n",
"Action Input: What is the sum of 5 and 5?\u001b[0m\n",
"\n",
"Observation:\u001b[33;1m\u001b[1;3mThe sum of 5 and 5 is 10.\u001b[0m\u001b[32;1m\u001b[1;3mI now know the final answer.\n",
"Final Answer: The sum of 5 and 5 is 10.\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"'The sum of 5 and 5 is 10.'"
]
},
"execution_count": 33,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"agent_executor.run('What is 5 + 5')"
]
},
{
"cell_type": "markdown",
"id": "f19f7a7e",
"metadata": {},
"source": [
"### Evaluation\n",
"\n",
"Last comes the not-so-fun bit that will make the difference between nifty prototype and production application - the process of evaluating and tuning your results. \n",
"\n",
"The key takeaway here is to make a framework that saves the results of each evaluation, as well as the parameters. Evaluation can be a difficult task that takes significant resources, so it is best to start prepared to handle multiple iterations. Some useful principles we've seen successful deployments use are:\n",
"- **Assign clear product ownership and metrics:** Ensure you have a team aligned from the start to annotate the outputs and determine whether they're bad or good. This may seem an obvious step, but too often the focus is on the engineering challenge of successfully retrieving content rather than the product challenge of providing retrieval results that are useful.\n",
"- **Log everything:** Store all requests and responses to and from your LLM and retrieval service if you can, it builds a great base for fine-tuning both the embeddings and any fine-tuned models or few-shot LLMs in future.\n",
"- **Use GPT-4 as a labeller:** When running evaluations, it can help to use GPT-4 as a gatekeeper for human annotation. Human annotation is costly and time-consuming, so doing an initial evaluation run with GPT-4 can help set a quality bar that needs to be met to justify human labeling. At this stage I would not suggest using GPT-4 as your only labeler, but it can certainly ease the burden.\n",
" - This approach is outlined further in [this paper](https://arxiv.org/abs/2108.13487).\n",
"\n",
"We'll use these principles to make a quick evaluation framework where we will:\n",
"- Use GPT-4 to make a list of hypothetical questions on our topic\n",
"- Ask our Agent the questions and save question/answer tuples\n",
" - These two above steps simulate the actual users interacting with your application\n",
"- Get GPT-4 to evaluate whether the answers correctly respond to the questions\n",
"- Look at our results to measure how well the Agent answered the questions\n",
"- Plan remedial action"
]
},
{
"cell_type": "code",
"execution_count": 34,
"id": "b8314f7b",
"metadata": {},
"outputs": [],
"source": [
"import time\n",
"\n",
"# Build a prompt to provide the original query, the result and ask to summarise for the user\n",
"evaluation_question_prompt = '''You are a helpful Wikipedia assistant who will generate a list of 10 creative general knowledge questions in markdown format.\n",
"\n",
"Example:\n",
"- Explain how photons work\n",
"- What is Thomas Dolby known for?\n",
"- What are some key events of the 20th century?\n",
"\n",
"Begin!\n",
"'''\n",
"\n",
"try:\n",
" # We'll use our model to generate 10 hypothetical questions to evaluate\n",
" question = openai.ChatCompletion.create(model=CHAT_MODEL\n",
" ,messages=[{\"role\":\"user\",\"content\":evaluation_question_prompt}]\n",
" ,temperature=0.9)\n",
" evaluation_questions = question['choices'][0]['message']['content']\n",
"except Exception as e:\n",
" print(e)\n"
]
},
{
"cell_type": "code",
"execution_count": 35,
"id": "3e0ee344",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"['1. What is the difference between weather and climate?', '2. Who designed the Eiffel Tower?', '3. What is the capital of Australia?', '4. What is the chemical symbol for gold?', '5. Who invented the telephone?', '6. What is the largest organ in the human body?', '7. Which famous artist painted the Mona Lisa?', '8. What is the highest mountain in Africa?', '9. What famous building was destroyed during the September 11th attacks?', '10. Who wrote the novel \"To Kill a Mockingbird\"?']\n"
]
}
],
"source": [
"cleaned_questions = evaluation_questions.split('\\n')\n",
"print(cleaned_questions)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"id": "4446041c",
"metadata": {},
"outputs": [],
"source": [
"# We'll use our agent to answer the generated questions to simulate users interacting with the system\n",
"question_answer_pairs = []\n",
"\n",
"for question in cleaned_questions:\n",
" memory = ConversationBufferWindowMemory(k=2)\n",
" \n",
" agent_executor = AgentExecutor.from_agent_and_tools(agent=agent\n",
" , tools=tools\n",
" , verbose=False\n",
" ,memory=memory)\n",
" try:\n",
" \n",
" answer = agent_executor.run(question)\n",
" except Exception as e:\n",
" print(question)\n",
" print(e)\n",
" answer = 'Unable to answer question'\n",
" question_answer_pairs.append((question,answer))\n",
" time.sleep(2)"
]
},
{
"cell_type": "code",
"execution_count": 37,
"id": "b8ea3f6a",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(10,\n",
" [('1. What is the difference between weather and climate?',\n",
" 'Weather refers to short-term atmospheric conditions in a specific area, while climate refers to long-term patterns and trends of weather in a particular region over a period of time.'),\n",
" ('2. Who designed the Eiffel Tower?',\n",
" 'Gustave Eiffel designed the Eiffel Tower.'),\n",
" ('3. What is the capital of Australia?',\n",
" 'The capital of Australia is Canberra.'),\n",
" ('4. What is the chemical symbol for gold?',\n",
" 'The chemical symbol for gold is Au.'),\n",
" ('5. Who invented the telephone?',\n",
" 'Alexander Graham Bell invented the telephone.')])"
]
},
"execution_count": 37,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"len(question_answer_pairs), question_answer_pairs[:5]"
]
},
{
"cell_type": "code",
"execution_count": 38,
"id": "1cf3c9e1",
"metadata": {},
"outputs": [],
"source": [
"# Build a prompt to provide the original query, the result and ask to evaluate for the user\n",
"gpt_evaluator_system = '''You are WikiGPT, a helpful Wikipedia expert.\n",
"You will be presented with general knowledge questions our users have asked.\n",
"\n",
"Think about this step by step:\n",
"- You need to decide whether the answer adequately answers the question\n",
"- If it answers the question, you will say \"Correct\"\n",
"- If it doesn't answer the question, you will say one of the following:\n",
" - If it couldn't answer at all, you will say \"Unable to answer\"\n",
" - If the answer was provided but was incorrect, you will say \"Incorrect\" \n",
"- If none of these rules are met, say \"Unable to evaluate\"\n",
"\n",
"Evaluation can only be \"Correct\", \"Incorrect\", \"Unable to answer\", and \"Unable to evaluate\"\n",
"\n",
"Example 1:\n",
"\n",
"Question: What is the cost cap for the 2023 season of Formula 1?\n",
"\n",
"Answer: The cost cap for 2023 is 95m USD.\n",
"\n",
"Evaluation: Correct\n",
"\n",
"Example 2:\n",
"\n",
"Question: What is Thomas Dolby known for?\n",
"\n",
"Answer: Inventing electricity\n",
"\n",
"Evaluation: Incorrect\n",
"\n",
"Begin!'''\n",
"\n",
"# We'll provide our evaluator the questions and answers we've generated and get it to evaluate them as one of our four evaluation categories.\n",
"gpt_evaluator_message = '''\n",
"Question: {question}\n",
"\n",
"Answer: {answer}\n",
"\n",
"Evaluation:'''"
]
},
{
"cell_type": "code",
"execution_count": 39,
"id": "8a351793",
"metadata": {},
"outputs": [],
"source": [
"evaluation_output = []"
]
},
{
"cell_type": "code",
"execution_count": 40,
"id": "3b286e5f",
"metadata": {},
"outputs": [],
"source": [
"for pair in question_answer_pairs:\n",
" \n",
" message = gpt_evaluator_message.format(question=pair[0]\n",
" ,answer=pair[1])\n",
" evaluation = openai.ChatCompletion.create(model=CHAT_MODEL\n",
" ,messages=[{\"role\":\"system\",\"content\":gpt_evaluator_system}\n",
" ,{\"role\":\"user\",\"content\":message}]\n",
" ,temperature=0)\n",
" \n",
" evaluation_output.append((pair[0]\n",
" ,pair[1]\n",
" ,evaluation['choices'][0]['message']['content']))"
]
},
{
"cell_type": "code",
"execution_count": 41,
"id": "f18678e0",
"metadata": {},
"outputs": [],
"source": [
"# We'll smooth the results for a simpler evaluation matrix\n",
"# In a real scenario we would take time and tune our prompt/add few shot examples to ensure consistent output from the evaluation step\n",
"def collate_results(x):\n",
" text = x.lower()\n",
" \n",
" if 'incorrect' in text:\n",
" return 'incorrect'\n",
" elif 'correct' in text:\n",
" return 'correct'\n",
" else: \n",
" return 'unable to answer'"
]
},
{
"cell_type": "code",
"execution_count": 42,
"id": "cd29fc90",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"correct 10\n",
"Name: evaluation, dtype: int64"
]
},
"execution_count": 42,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"eval_df = pd.DataFrame(evaluation_output)\n",
"eval_df.columns = ['question','answer','evaluation']\n",
"# Replacing all the \"unable to evaluates\" with \"unable to answer\"\n",
"eval_df['evaluation'] = eval_df['evaluation'].apply(lambda x: collate_results(x))\n",
"eval_df.evaluation.value_counts()"
]
},
{
"cell_type": "markdown",
"id": "566053d6",
"metadata": {},
"source": [
"#### Analysis\n",
"\n",
"Depending on how GPT did here you may have actually gotten some good responses, but in all likelihood in the real world you'll end up with incorrect or unable to answer results, and will need to tune your search, LLM or another aspect of the pipeline.\n",
"\n",
"Your remediation plan could be as follows:\n",
"- **Incorrect answers:** Either prompt engineering to help the model work out how to answer better (maybe even a bigger model like GPT-4), or search optimisation to return more relevant chunks. Chunking/embedding changes may help this as well - larger chunks may give more context, allowing the model to formulate a better answer.\n",
"- **Unable to answer:** This is either a retrieval problem, or the data doesn't exist in our knowledge base. We can prompt engineer to classify questions that are \"out-of-bounds\" and give the user a stock reply, or we can tune our search so the relevant data is returned.\n",
"\n",
"This is the framework we'll build on to get our knowledge retrieval solution to production - again, log everything and store each run down to a question level so you can track regressions and iterate towards your production solution."
]
},
{
"cell_type": "markdown",
"id": "5016fcb8",
"metadata": {},
"source": [
"## Conclusion\n",
"\n",
"This concludes our Enterprise Knowledge Retrieval walkthrough. We hope you've found it useful, and that you're now in a position to build enterprise knowledge retrieval solutions, and have a few tricks to start you down the road of putting them into production."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "pchain_env",
"language": "python",
"name": "pchain_env"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.10"
}
},
"nbformat": 4,
"nbformat_minor": 5
}