langchain/docs/modules/chains/examples/graph_nebula_qa.ipynb

{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "c94240f5",
"metadata": {},
"source": [
"# NebulaGraphQAChain\n",
"\n",
"This notebook shows how to use LLMs to provide a natural language interface to NebulaGraph database."
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "dbc0ee68",
"metadata": {},
"source": [
"You will need to have a running NebulaGraph cluster, for which you can run a containerized cluster by running the following script:\n",
"\n",
"```bash\n",
"curl -fsSL nebula-up.siwei.io/install.sh | bash\n",
"```\n",
"\n",
"Other options are:\n",
"- Install as a [Docker Desktop Extension](https://www.docker.com/blog/distributed-cloud-native-graph-database-nebulagraph-docker-extension/). See [here](https://docs.nebula-graph.io/3.5.0/2.quick-start/1.quick-start-workflow/)\n",
"- NebulaGraph Cloud Service. See [here](https://www.nebula-graph.io/cloud)\n",
"- Deploy from package, source code, or via Kubernetes. See [here](https://docs.nebula-graph.io/)\n",
"\n",
"Once the cluster is running, we could create the SPACE and SCHEMA for the database."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c82f4141",
"metadata": {},
"outputs": [],
"source": [
"%pip install ipython-ngql\n",
"%load_ext ngql\n",
"\n",
"# connect ngql jupyter extension to nebulagraph\n",
"%ngql --address 127.0.0.1 --port 9669 --user root --password nebula\n",
"# create a new space\n",
"%ngql CREATE SPACE IF NOT EXISTS langchain(partition_num=1, replica_factor=1, vid_type=fixed_string(128));\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "eda0809a",
"metadata": {},
"outputs": [],
"source": [
"# Wait for a few seconds for the space to be created.\n",
"%ngql USE langchain;"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "119fe35c",
"metadata": {},
"source": [
"Create the schema, for full dataset, refer [here](https://www.siwei.io/en/nebulagraph-etl-dbt/)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5aa796ee",
"metadata": {},
"outputs": [],
"source": [
"%%ngql\n",
"CREATE TAG IF NOT EXISTS movie(name string);\n",
"CREATE TAG IF NOT EXISTS person(name string, birthdate string);\n",
"CREATE EDGE IF NOT EXISTS acted_in();\n",
"CREATE TAG INDEX IF NOT EXISTS person_index ON person(name(128));\n",
"CREATE TAG INDEX IF NOT EXISTS movie_index ON movie(name(128));"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "66e4799a",
"metadata": {},
"source": [
"Wait for schema creation to complete, then we can insert some data."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "d8eea530",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"UsageError: Cell magic `%%ngql` not found.\n"
]
}
],
"source": [
"%%ngql\n",
"INSERT VERTEX person(name, birthdate) VALUES \"Al Pacino\":(\"Al Pacino\", \"1940-04-25\");\n",
"INSERT VERTEX movie(name) VALUES \"The Godfather II\":(\"The Godfather II\");\n",
"INSERT VERTEX movie(name) VALUES \"The Godfather Coda: The Death of Michael Corleone\":(\"The Godfather Coda: The Death of Michael Corleone\");\n",
"INSERT EDGE acted_in() VALUES \"Al Pacino\"->\"The Godfather II\":();\n",
"INSERT EDGE acted_in() VALUES \"Al Pacino\"->\"The Godfather Coda: The Death of Michael Corleone\":();"
]
},
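{
"attachments": {},
"cell_type": "markdown",
"id": "2f6b9c41",
"metadata": {},
"source": [
"Optionally, we can sanity-check the inserted data with an ad-hoc nGQL query. The following cell is only a sketch and is not executed here; it relies on the `person_index` and `movie_index` tag indexes created above."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3c7d0e52",
"metadata": {},
"outputs": [],
"source": [
"%%ngql\n",
"MATCH (p:`person`)-[:acted_in]->(m:`movie`)\n",
"RETURN p.`person`.`name`, m.`movie`.`name`;"
]
},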
{
"cell_type": "code",
"execution_count": 1,
"id": "62812aad",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chat_models import ChatOpenAI\n",
"from langchain.chains import NebulaGraphQAChain\n",
"from langchain.graphs import NebulaGraph"
]
},
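{
"attachments": {},
"cell_type": "markdown",
"id": "4d8e1f63",
"metadata": {},
"source": [
"The `NebulaGraph` wrapper below talks to the cluster through the NebulaGraph Python client. If it is not already installed in your environment, something like the following should work (assuming the client is published as `nebula3-python`):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5e9f2a74",
"metadata": {},
"outputs": [],
"source": [
"%pip install nebula3-python"
]
},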
{
"cell_type": "code",
"execution_count": 2,
"id": "0928915d",
"metadata": {},
"outputs": [],
"source": [
"graph = NebulaGraph(\n",
" space=\"langchain\",\n",
" username=\"root\",\n",
" password=\"nebula\",\n",
" address=\"127.0.0.1\",\n",
" port=9669,\n",
" session_pool_size=30,\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "58c1a8ea",
"metadata": {},
"source": [
"## Refresh graph schema information\n",
"\n",
"If the schema of database changes, you can refresh the schema information needed to generate nGQL statements."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4e3de44f",
"metadata": {},
"outputs": [],
"source": [
"# graph.refresh_schema()"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "1fe76ccd",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Node properties: [{'tag': 'movie', 'properties': [('name', 'string')]}, {'tag': 'person', 'properties': [('name', 'string'), ('birthdate', 'string')]}]\n",
"Edge properties: [{'edge': 'acted_in', 'properties': []}]\n",
"Relationships: ['(:person)-[:acted_in]->(:movie)']\n",
"\n"
]
}
],
"source": [
"print(graph.get_schema)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "68a3c677",
"metadata": {},
"source": [
"## Querying the graph\n",
"\n",
"We can now use the graph cypher QA chain to ask question of the graph"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "7476ce98",
"metadata": {},
"outputs": [],
"source": [
"chain = NebulaGraphQAChain.from_llm(\n",
" ChatOpenAI(temperature=0), graph=graph, verbose=True\n",
")\n"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "ef8ee27b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new NebulaGraphQAChain chain...\u001b[0m\n",
"Generated nGQL:\n",
"\u001b[32;1m\u001b[1;3mMATCH (p:`person`)-[:acted_in]->(m:`movie`) WHERE m.`movie`.`name` == 'The Godfather II'\n",
"RETURN p.`person`.`name`\u001b[0m\n",
"Full Context:\n",
"\u001b[32;1m\u001b[1;3m{'p.person.name': ['Al Pacino']}\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"'Al Pacino played in The Godfather II.'"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chain.run(\"Who played in The Godfather II?\")"
]
}
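,
{
"attachments": {},
"cell_type": "markdown",
"id": "6fa03b85",
"metadata": {},
"source": [
"As a further example on the same data (a sketch, not executed here), we could ask the chain which movies Al Pacino acted in:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7ab14c96",
"metadata": {},
"outputs": [],
"source": [
"chain.run(\"Which movies did Al Pacino act in?\")"
]
}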
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}