fix: some typo and add execution command sample (#299)

* fix: some typo and add execution command sample

* fix: some typo and add execution command sample

1. hybrid-search-with-weaviate-and-openai.ipynb

* fix: some typo and add execution command sample

1. question-answering-with-weaviate-and-openai.ipynb
pull/1077/head
liuliu 1 year ago committed by GitHub
parent ded5b0cb3e
commit 37fd3a6052

@ -22,14 +22,14 @@
"\n",
"Weaviate uses KNN algorithms to create a vector-optimized index, which allows your queries to run extremely fast. Learn more [here](https://weaviate.io/blog/why-is-vector-search-so-fast).\n",
"\n",
"Weaviate let's you use your favorite ML-models, and scale seamlessly into billions of data objects.\n",
"Weaviate lets you use your favorite ML models and scale seamlessly into billions of data objects.\n",
"\n",
"### Deployment options\n",
"\n",
"Whatever your scenario or production setup, Weaviate has an option for you. You can deploy Weaviate in the following setups:\n",
"* Self-hosted you can deploy Weaviate with Docker locally, or on any server you want.\n",
"* SaaS you can use [Weaviate Cloud Service (WCS)](https://console.weaviate.io/) to host your Weaviate instances.\n",
"* Hybrid-Saas you can deploy Weaviate in your own private Cloud Service \n",
"* Hybrid-SaaS you can deploy Weaviate in your own private Cloud Service.\n",
"\n",
"### Programming languages\n",
"\n",
@ -39,7 +39,7 @@
"* [Java](https://weaviate.io/developers/weaviate/client-libraries/java)\n",
"* [Go](https://weaviate.io/developers/weaviate/client-libraries/go)\n",
"\n",
"Additionally, Weavaite has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). Basically you can call Weaviate from any language that supports REST requests."
"Additionally, Weaviate has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). This means you can call Weaviate from any language that supports REST requests."
]
},
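Since the REST layer mentioned above works from any language, here is a minimal stdlib-only sketch of building such a request in Python (the local URL and `limit` parameter are placeholder assumptions; the request is only constructed, not sent):

```python
# Sketch: talking to Weaviate's REST layer with only the Python standard
# library. The base URL assumes a local Docker instance; /v1/objects is the
# objects endpoint linked above. We only build the request, we don't send it.
import urllib.request

base_url = "http://localhost:8080"  # placeholder: your Weaviate instance

req = urllib.request.Request(
    url=f"{base_url}/v1/objects?limit=5",
    headers={"Content-Type": "application/json"},
    method="GET",
)

print(req.full_url)      # the endpoint the request would hit
print(req.get_method())  # GET
```

Sending it would just be `urllib.request.urlopen(req)` against a running instance.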
{
@ -49,16 +49,16 @@
"source": [
"## Demo Flow\n",
"The demo flow is:\n",
"- **Prerequisites Setup**: Create a Weaviate instance and install required libraries\n",
"- **Prerequisites Setup**: Create a Weaviate instance and install the required libraries\n",
"- **Connect**: Connect to your Weaviate instance \n",
"- **Schema Configuration**: Configure the schema of your data\n",
" - *Note*: Here we can define which OpenAI Embedding Model to use\n",
" - *Note*: Here we can configure which properties to index on\n",
" - *Note*: Here we can configure which properties to index\n",
"- **Import data**: Load a demo dataset and import it into Weaviate\n",
" - *Note*: The import process will automatically index your data - based on the configuration in the schema\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you\n",
"- **Run Queries**: Query \n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you\n",
"\n",
"Once you've run through this notebook you should have a basic understanding of how to set up and use vector databases, and can move on to more complex use cases making use of our embeddings."
]
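The schema step in the flow above is where the OpenAI embedding model and the indexed properties get chosen. A rough sketch of what such a class definition can look like (the class name `Article`, the property names, and the model fields are illustrative assumptions, not the notebook's exact schema):

```python
# Sketch of a Weaviate class schema that delegates vectorization to OpenAI
# via text2vec-openai. All names and model fields below are illustrative
# assumptions; check the notebook and the Weaviate docs for exact values.
article_schema = {
    "class": "Article",
    "description": "A collection of articles",
    "vectorizer": "text2vec-openai",   # which module vectorizes this class
    "moduleConfig": {
        "text2vec-openai": {
            "model": "ada",            # which OpenAI embedding model to use
            "modelVersion": "002",
            "type": "text",
        }
    },
    "properties": [
        {"name": "title", "dataType": ["string"]},
        {"name": "content", "dataType": ["text"]},
    ],
}
```

With the Python client this would then be registered with something like `client.schema.create_class(article_schema)` (assumed v3 client API).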
@ -69,9 +69,9 @@
"metadata": {},
"source": [
"## OpenAI Module in Weaviate\n",
"All Weaviate instances come equiped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) module.\n",
"All Weaviate instances come equipped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) module.\n",
"\n",
"This module is responsible handling vectorization at import (or any CRUD operations) and when you run a query.\n",
"This module is responsible for handling vectorization during import (or any CRUD operations) and when you run a query.\n",
"\n",
"### No need to manually vectorize data\n",
"This is great news for you. With [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) you don't need to manually vectorize your data, as Weaviate will call OpenAI for you whenever necessary.\n",
@ -120,7 +120,7 @@
"\n",
"Install and run Weaviate locally with Docker.\n",
"1. Download the [./docker-compose.yml](./docker-compose.yml) file\n",
"2. Then open your terminal, navigate to where your docker-compose.yml folder, and start docker with: `docker-compose up -d`\n",
"2. Then open your terminal, navigate to where your docker-compose.yml file is located, and start docker with: `docker-compose up -d`\n",
"3. Once this is ready, your instance should be available at [http://localhost:8080](http://localhost:8080)\n",
"\n",
"Note: To shut down your Docker instance you can call: `docker-compose down`\n",
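For orientation, a minimal sketch of what the linked docker-compose.yml typically contains (the image tag and environment variables here are assumptions; prefer the file linked in step 1):

```yaml
# Illustrative docker-compose.yml sketch for a local Weaviate with the
# OpenAI module enabled. Image tag and env vars are assumptions.
version: '3.4'
services:
  weaviate:
    image: semitechnologies/weaviate:1.18.0
    ports:
      - "8080:8080"
    environment:
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'text2vec-openai'
      ENABLE_MODULES: 'text2vec-openai'
```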
@ -145,7 +145,7 @@
"\n",
"### datasets & apache-beam\n",
"\n",
"To load sample data, you need the `datasets` library and its' dependency `apache-beam`."
"To load sample data, you need the `datasets` library and its dependency `apache-beam`."
]
},
{
@ -170,13 +170,24 @@
"===========================================================\n",
"## Prepare your OpenAI API key\n",
"\n",
"The `OpenAI API key` is used for vectorization of your data at import, and for queries.\n",
"The `OpenAI API key` is used for vectorization of your data at import, and for running queries.\n",
"\n",
"If you don't have an OpenAI API key, you can get one from [https://beta.openai.com/account/api-keys](https://beta.openai.com/account/api-keys).\n",
"\n",
"Once you get your key, please add it to your environment variables as `OPENAI_API_KEY`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "43395339",
"metadata": {},
"outputs": [],
"source": [
"# Set the OpenAI API key for this notebook session\n",
"# Note: a shell `export` run from a cell happens in a subshell and does not\n",
"# persist, so set the variable on the kernel's own environment instead\n",
"import os\n",
"os.environ[\"OPENAI_API_KEY\"] = \"your key\""
]
},
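A quick sanity check that the key is actually visible to the notebook kernel; a shell `export` run from a cell happens in a subshell, so setting `os.environ` in-process is the reliable route (the placeholder value below is an assumption):

```python
import os

# Set the key in-process; `export` from a notebook cell would not persist
# into the Python kernel's environment.
os.environ.setdefault("OPENAI_API_KEY", "your key")  # placeholder value

key = os.getenv("OPENAI_API_KEY")
assert key, "OPENAI_API_KEY is not set"
print("OPENAI_API_KEY is set:", bool(key))
```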
{
"cell_type": "code",
"execution_count": null,
@ -207,7 +218,7 @@
"In this section, we will:\n",
"\n",
"1. test the env variable `OPENAI_API_KEY` (**make sure** you completed the step in [#Prepare-your-OpenAI-API-key](#Prepare-your-OpenAI-API-key))\n",
"2. connect to your Weaviate your `OpenAI API Key`\n",
"2. connect to your Weaviate with your `OpenAI API Key`\n",
"3. and test the client connection\n",
"\n",
"### The client \n",
@ -229,7 +240,7 @@
"# Connect to your Weaviate instance\n",
"client = weaviate.Client(\n",
" url=\"https://your-wcs-instance-name.weaviate.network/\",\n",
"# url=\"http://localhost:8080/\",\n",
" # url=\"http://localhost:8080/\",\n",
" additional_headers={\n",
" \"X-OpenAI-Api-Key\": os.getenv(\"OPENAI_API_KEY\")\n",
" }\n",

@ -22,14 +22,14 @@
"\n",
"Weaviate uses KNN algorithms to create a vector-optimized index, which allows your queries to run extremely fast. Learn more [here](https://weaviate.io/blog/why-is-vector-search-so-fast).\n",
"\n",
"Weaviate let's you use your favorite ML-models, and scale seamlessly into billions of data objects.\n",
"Weaviate lets you use your favorite ML models and scale seamlessly into billions of data objects.\n",
"\n",
"### Deployment options\n",
"\n",
"Whatever your scenario or production setup, Weaviate has an option for you. You can deploy Weaviate in the following setups:\n",
"* Self-hosted you can deploy Weaviate with Docker locally, or on any server you want.\n",
"* SaaS you can use [Weaviate Cloud Service (WCS)](https://console.weaviate.io/) to host your Weaviate instances.\n",
"* Hybrid-Saas you can deploy Weaviate in your own private Cloud Service \n",
"* Hybrid-SaaS you can deploy Weaviate in your own private Cloud Service.\n",
"\n",
"### Programming languages\n",
"\n",
@ -39,7 +39,7 @@
"* [Java](https://weaviate.io/developers/weaviate/client-libraries/java)\n",
"* [Go](https://weaviate.io/developers/weaviate/client-libraries/go)\n",
"\n",
"Additionally, Weavaite has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). Basically you can call Weaviate from any language that supports REST requests."
"Additionally, Weaviate has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). This means you can call Weaviate from any language that supports REST requests."
]
},
{
@ -53,12 +53,12 @@
"- **Connect**: Connect to your Weaviate instance \n",
"- **Schema Configuration**: Configure the schema of your data\n",
" - *Note*: Here we can define which OpenAI Embedding Model to use\n",
" - *Note*: Here we can configure which properties to index on\n",
" - *Note*: Here we can configure which properties to index\n",
"- **Import data**: Load a demo dataset and import it into Weaviate\n",
" - *Note*: The import process will automatically index your data - based on the configuration in the schema\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you\n",
"- **Run Queries**: Query \n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you\n",
"\n",
"Once you've run through this notebook you should have a basic understanding of how to set up and use vector databases, and can move on to more complex use cases making use of our embeddings."
]
@ -69,9 +69,9 @@
"metadata": {},
"source": [
"## OpenAI Module in Weaviate\n",
"All Weaviate instances come equiped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) module.\n",
"All Weaviate instances come equipped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) module.\n",
"\n",
"This module is responsible handling vectorization at import (or any CRUD operations) and when you run a query.\n",
"This module is responsible for handling vectorization during import (or any CRUD operations) and when you run a query.\n",
"\n",
"### No need to manually vectorize data\n",
"This is great news for you. With [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) you don't need to manually vectorize your data, as Weaviate will call OpenAI for you whenever necessary.\n",
@ -120,7 +120,7 @@
"\n",
"Install and run Weaviate locally with Docker.\n",
"1. Download the [./docker-compose.yml](./docker-compose.yml) file\n",
"2. Then open your terminal, navigate to where your docker-compose.yml folder, and start docker with: `docker-compose up -d`\n",
"2. Then open your terminal, navigate to where your docker-compose.yml file is located, and start docker with: `docker-compose up -d`\n",
"3. Once this is ready, your instance should be available at [http://localhost:8080](http://localhost:8080)\n",
"\n",
"Note: To shut down your Docker instance you can call: `docker-compose down`\n",
@ -170,13 +170,24 @@
"===========================================================\n",
"## Prepare your OpenAI API key\n",
"\n",
"The `OpenAI API key` is used for vectorization of your data at import, and for queries.\n",
"The `OpenAI API key` is used for vectorization of your data at import, and for running queries.\n",
"\n",
"If you don't have an OpenAI API key, you can get one from [https://beta.openai.com/account/api-keys](https://beta.openai.com/account/api-keys).\n",
"\n",
"Once you get your key, please add it to your environment variables as `OPENAI_API_KEY`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "09fefff0",
"metadata": {},
"outputs": [],
"source": [
"# Set the OpenAI API key for this notebook session\n",
"# Note: a shell `export` run from a cell happens in a subshell and does not\n",
"# persist, so set the variable on the kernel's own environment instead\n",
"import os\n",
"os.environ[\"OPENAI_API_KEY\"] = \"your key\""
]
},
{
"cell_type": "code",
"execution_count": null,

@ -20,14 +20,14 @@
"\n",
"Weaviate uses KNN algorithms to create a vector-optimized index, which allows your queries to run extremely fast. Learn more [here](https://weaviate.io/blog/why-is-vector-search-so-fast).\n",
"\n",
"Weaviate let's you use your favorite ML-models, and scale seamlessly into billions of data objects.\n",
"Weaviate lets you use your favorite ML models and scale seamlessly into billions of data objects.\n",
"\n",
"### Deployment options\n",
"\n",
"Whatever your scenario or production setup, Weaviate has an option for you. You can deploy Weaviate in the following setups:\n",
"* Self-hosted you can deploy Weaviate with Docker locally, or on any server you want.\n",
"* SaaS you can use [Weaviate Cloud Service (WCS)](https://console.weaviate.io/) to host your Weaviate instances.\n",
"* Hybrid-Saas you can deploy Weaviate in your own private Cloud Service \n",
"* Hybrid-SaaS you can deploy Weaviate in your own private Cloud Service.\n",
"\n",
"### Programming languages\n",
"\n",
@ -37,7 +37,7 @@
"* [Java](https://weaviate.io/developers/weaviate/client-libraries/java)\n",
"* [Go](https://weaviate.io/developers/weaviate/client-libraries/go)\n",
"\n",
"Additionally, Weavaite has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). Basically you can call Weaviate from any language that supports REST requests."
"Additionally, Weaviate has a [REST layer](https://weaviate.io/developers/weaviate/api/rest/objects). This means you can call Weaviate from any language that supports REST requests."
]
},
{
@ -51,13 +51,13 @@
"- **Connect**: Connect to your Weaviate instance \n",
"- **Schema Configuration**: Configure the schema of your data\n",
" - *Note*: Here we can define which OpenAI Embedding Model to use\n",
" - *Note*: Here we can configure which properties to index on\n",
" - *Note*: Here we can configure which properties to index\n",
"- **Import data**: Load a demo dataset and import it into Weaviate\n",
" - *Note*: The import process will automatically index your data - based on the configuration in the schema\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: You don't need to explicitly vectorize your data, Weaviate will communicate with OpenAI to do it for you\n",
"- **Run Queries**: Query \n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you.\n",
" - *Note*: The `qna-openai` module automatically communicates with the OpenAI completions endpoint.\n",
" - *Note*: You don't need to explicitly vectorize your queries, Weaviate will communicate with OpenAI to do it for you\n",
" - *Note*: The `qna-openai` module automatically communicates with the OpenAI completions endpoint\n",
"\n",
"Once you've run through this notebook you should have a basic understanding of how to set up and use vector databases for question answering."
]
@ -68,7 +68,7 @@
"metadata": {},
"source": [
"## OpenAI Module in Weaviate\n",
"All Weaviate instances come equiped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) and the [qna-openai](https://weaviate.io/developers/weaviate/modules/reader-generator-modules/qna-openai) modules.\n",
"All Weaviate instances come equipped with the [text2vec-openai](https://weaviate.io/developers/weaviate/modules/retriever-vectorizer-modules/text2vec-openai) and the [qna-openai](https://weaviate.io/developers/weaviate/modules/reader-generator-modules/qna-openai) modules.\n",
"\n",
"The first module is responsible for handling vectorization at import (or any CRUD operations) and when you run a search query. The second module communicates with the OpenAI completions endpoint.\n",
"\n",
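The `qna-openai` module is surfaced through GraphQL's `ask` argument. A rough sketch of such a query built as a plain string (the class name `Article` and the property names are illustrative assumptions):

```python
# Sketch: a GraphQL query that invokes the qna-openai module via `ask`.
# "Article" and the property names are illustrative assumptions.
question = "When was Weaviate released?"

ask_query = f"""
{{
  Get {{
    Article(
      ask: {{question: "{question}", properties: ["content"]}}
      limit: 1
    ) {{
      title
      _additional {{ answer {{ result }} }}
    }}
  }}
}}
"""

print(ask_query)
```

With the Python client this corresponds roughly to `client.query.get(...).with_ask({...})` (assumed v3 client API).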
@ -119,7 +119,7 @@
"\n",
"Install and run Weaviate locally with Docker.\n",
"1. Download the [./docker-compose.yml](./docker-compose.yml) file\n",
"2. Then open your terminal, navigate to where your docker-compose.yml folder, and start docker with: `docker-compose up -d`\n",
"2. Then open your terminal, navigate to where your docker-compose.yml file is located, and start docker with: `docker-compose up -d`\n",
"3. Once this is ready, your instance should be available at [http://localhost:8080](http://localhost:8080)\n",
"\n",
"Note: To shut down your Docker instance you can call: `docker-compose down`\n",
@ -176,6 +176,17 @@
"Once you get your key, please add it to your environment variables as `OPENAI_API_KEY`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5a2ded4b",
"metadata": {},
"outputs": [],
"source": [
"# Set the OpenAI API key for this notebook session\n",
"# Note: a shell `export` run from a cell happens in a subshell and does not\n",
"# persist, so set the variable on the kernel's own environment instead\n",
"import os\n",
"os.environ[\"OPENAI_API_KEY\"] = \"your key\""
]
},
{
"cell_type": "code",
"execution_count": null,
