docs: Updating documentation for Konko provider (#16953)

- **Description:** A small update to the Konko provider documentation.

---------

Co-authored-by: Shivani Modi <shivanimodi@Shivanis-MacBook-Pro.local>

@@ -15,39 +15,23 @@
"source": [
"# ChatKonko\n",
"\n",
">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
"\n",
"Konko API is a fully managed API designed to help application developers:\n",
"\n",
"1. Select the right LLM(s) for their application\n",
"2. Prototype with various open-source and proprietary LLMs\n",
"3. Access Fine Tuning for open-source LLMs to get industry-leading performance at a fraction of the cost\n",
"4. Setup low-cost production APIs according to security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n",
"\n",
"### Steps to Access Models\n",
"1. **Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.\n",
"\n",
"2. **Identify Suitable Endpoints:** Determine which [endpoint](https://docs.konko.ai/docs/list-of-models#list-of-available-models) (ChatCompletion or Completion) supports your selected model.\n",
"# Konko\n",
"\n",
"3. **Selecting a Model:** [Choose a model](https://docs.konko.ai/docs/list-of-models#list-of-available-models) based on its metadata and how well it fits your use case.\n",
">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
"\n",
"4. **Prompting Guidelines:** Once a model is selected, refer to the [prompting guidelines](https://docs.konko.ai/docs/prompting) to effectively communicate with it.\n",
"\n",
"5. **Using the API:** Finally, use the appropriate Konko [API endpoint](https://docs.konko.ai/docs/quickstart-for-completion-and-chat-completion-endpoint) to call the model and receive responses.\n",
"1. **Select** the right open source or proprietary LLMs for their application\n",
"2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs\n",
"3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost\n",
"4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n",
"\n",
"To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/).\n",
"\n",
"This example goes over how to use LangChain to interact with `Konko` ChatCompletion [models](https://docs.konko.ai/docs/list-of-models#konko-hosted-models-for-chatcompletion)\n",
"\n",
"To run this notebook, you'll need Konko API key. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/)."
]
},
{
"cell_type": "code",
"execution_count": 1,
@@ -64,11 +48,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Set API Keys\n",
"\n",
"<br />\n",
"\n",
"### Option 1: Set Environment Variables\n",
"#### Set Environment Variables\n",
"\n",
"1. You can set environment variables for \n",
" 1. KONKO_API_KEY (Required)\n",
@@ -78,18 +58,7 @@
"```shell\n",
"export KONKO_API_KEY={your_KONKO_API_KEY_here}\n",
"export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional\n",
"```\n",
"\n",
"Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash shell and .zshrc for Zsh shell) to have them set automatically every time a new shell session starts.\n",
"\n",
"### Option 2: Set API Keys Programmatically\n",
"\n",
"If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:\n",
"\n",
"```python\n",
"konko.set_api_key('your_KONKO_API_KEY_here') \n",
"konko.set_openai_api_key('your_OPENAI_API_KEY_here') # Optional\n",
"```\n"
"```"
]
},
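{
"cell_type": "markdown",
"metadata": {},
"source": [
"Alternatively, if you are working in a notebook you can prompt for the key at runtime instead of exporting it beforehand. A minimal sketch using only the standard library (the environment variable name matches the one above):\n",
"\n",
"```python\n",
"import getpass\n",
"import os\n",
"\n",
"# Prompt for the Konko API key without echoing it to the notebook output\n",
"os.environ[\"KONKO_API_KEY\"] = getpass.getpass(\"Konko API key: \")\n",
"```"
]
},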
{
@ -98,7 +67,7 @@
"source": [
"## Calling a model\n",
"\n",
"Find a model on the [Konko overview page](https://docs.konko.ai/v0.5.0/docs/list-of-models)\n",
"Find a model on the [Konko overview page](https://docs.konko.ai/docs/list-of-models)\n",
"\n",
"Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).\n",
"\n",

@@ -1,20 +1,27 @@
{
"cells": [
{
"cell_type": "raw",
"metadata": {},
"source": [
"---\n",
"sidebar_label: Konko\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "136d9ba6-c42a-435b-9e19-77ebcc7a3145",
"metadata": {},
"source": [
"# ChatKonko\n",
"# Konko\n",
"\n",
">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
"\n",
"Konko API is a fully managed API designed to help application developers:\n",
"\n",
"1. Select the right LLM(s) for their application\n",
"2. Prototype with various open-source and proprietary LLMs\n",
"3. Access Fine Tuning for open-source LLMs to get industry-leading performance at a fraction of the cost\n",
"4. Setup low-cost production APIs according to security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n"
"1. **Select** the right open source or proprietary LLMs for their application\n",
"2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs\n",
"3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost\n",
"4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure\n"
]
},
{
@@ -22,25 +29,44 @@
"id": "0d896d07-82b4-4f38-8c37-f0bc8b0e4fe1",
"metadata": {},
"source": [
"### Steps to Access Models\n",
"1. **Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.\n",
"This example goes over how to use LangChain to interact with `Konko` completion [models](https://docs.konko.ai/docs/list-of-models#konko-hosted-models-for-completion)\n",
"\n",
"2. **Identify Suitable Endpoints:** Determine which [endpoint](https://docs.konko.ai/docs/list-of-models#list-of-available-models) (ChatCompletion or Completion) supports your selected model.\n",
"To run this notebook, you'll need Konko API key. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Set Environment Variables\n",
"\n",
"3. **Selecting a Model:** [Choose a model](https://docs.konko.ai/docs/list-of-models#list-of-available-models) based on its metadata and how well it fits your use case.\n",
"1. You can set environment variables for \n",
" 1. KONKO_API_KEY (Required)\n",
" 2. OPENAI_API_KEY (Optional)\n",
"2. In your current shell session, use the export command:\n",
"\n",
"4. **Prompting Guidelines:** Once a model is selected, refer to the [prompting guidelines](https://docs.konko.ai/docs/prompting) to effectively communicate with it.\n",
"```shell\n",
"export KONKO_API_KEY={your_KONKO_API_KEY_here}\n",
"export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Calling a model\n",
"\n",
"5. **Using the API:** Finally, use the appropriate Konko [API endpoint](https://docs.konko.ai/docs/quickstart-for-completion-and-chat-completion-endpoint) to call the model and receive responses.\n",
"Find a model on the [Konko overview page](https://docs.konko.ai/docs/list-of-models)\n",
"\n",
"This example goes over how to use LangChain to interact with `Konko` completion [models](https://docs.konko.ai/docs/list-of-models#konko-hosted-models-for-completion)\n",
"Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).\n",
"\n",
"To run this notebook, you'll need Konko API key. You can create one by signing up on [Konko](https://www.konko.ai/)."
"From here, we can initialize our model:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"id": "dd70bccb-7a65-42d0-a3f2-8116f3549da7",
"metadata": {},
"outputs": [

@@ -1,86 +1,65 @@
# Konko
This page covers how to run models on Konko within LangChain.
All functionality related to Konko
Konko API is a fully managed API designed to help application developers:
>[Konko AI](https://www.konko.ai/) provides a fully managed API to help application developers:
Select the right LLM(s) for their application
Prototype with various open-source and proprietary LLMs
Move to production in line with their security, privacy, throughput, latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant infrastructure
>1. **Select** the right open source or proprietary LLMs for their application
>2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs
>3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost
>4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure
## Installation and Setup
### First you'll need an API key
You can request it by messaging [support@konko.ai](mailto:support@konko.ai)
1. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models via our endpoints for [chat completions](https://docs.konko.ai/reference/post-chat-completions) and [completions](https://docs.konko.ai/reference/post-completions).
2. Enable a Python 3.8+ environment
3. Install the SDK
### Install Konko AI's Python SDK
#### 1. Enable a Python 3.8+ environment
#### 2. Set API Keys
##### Option 1: Set Environment Variables
1. You can set environment variables for
1. KONKO_API_KEY (Required)
2. OPENAI_API_KEY (Optional)
```bash
pip install konko
```
2. In your current shell session, use the export command:
4. Set API Keys as environment variables (`KONKO_API_KEY`, `OPENAI_API_KEY`)
```shell
```bash
export KONKO_API_KEY={your_KONKO_API_KEY_here}
export OPENAI_API_KEY={your_OPENAI_API_KEY_here} #Optional
```
Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for the Bash shell and .zshrc for the Zsh shell) to have them set automatically every time a new shell session starts.
##### Option 2: Set API Keys Programmatically
Please see [the Konko docs](https://docs.konko.ai/docs/getting-started) for more details.
If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:
```python
konko.set_api_key('your_KONKO_API_KEY_here')
konko.set_openai_api_key('your_OPENAI_API_KEY_here') # Optional
```
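Equivalently, the same environment variables can be set from inside Python before using the integration. A minimal sketch (placeholder values as above):

```python
import os

os.environ["KONKO_API_KEY"] = "your_KONKO_API_KEY_here"
os.environ["OPENAI_API_KEY"] = "your_OPENAI_API_KEY_here"  # Optional
```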
## LLM
#### 3. Install the SDK
**Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.
Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).
```shell
pip install konko
```
See a usage [example](/docs/integrations/llms/konko).
#### 4. Verify Installation & Authentication
### Examples of Endpoint Usage
```python
# Confirm konko has installed successfully
import konko
# Confirm API keys from Konko and OpenAI are set properly
konko.Model.list()
```
## Calling a model
- **Completion with mistralai/Mistral-7B-v0.1:**
Find a model on the [Konko Introduction page](https://docs.konko.ai/docs/list-of-models)
```python
from langchain.llms import Konko
llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
prompt = "Generate a Product Description for Apple Iphone 15"
response = llm(prompt)
```
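The same LLM drops into a prompt pipeline. A minimal sketch pairing it with a `PromptTemplate` (the template text and input are illustrative, and a recent LangChain release with the expression language is assumed):

```python
from langchain.llms import Konko
from langchain.prompts import PromptTemplate

# Assumes KONKO_API_KEY is set in the environment
prompt = PromptTemplate.from_template("Write a one-paragraph product description for {product}.")
llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
chain = prompt | llm

print(chain.invoke({"product": "a solar-powered phone charger"}))
```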
Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/listmodels).
## Chat Models
## Examples of Endpoint Usage
See a usage [example](/docs/integrations/chat/konko).
- **ChatCompletion with Mistral-7B:**
```python
from langchain.schema import HumanMessage
from langchain_community.chat_models import ChatKonko
chat_instance = ChatKonko(max_tokens=10, model='mistralai/mistral-7b-instruct-v0.1')
msg = HumanMessage(content="Hi")
chat_response = chat_instance([msg])
```
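A system prompt can be supplied the same way. A short sketch (the messages themselves are illustrative):

```python
from langchain.schema import HumanMessage, SystemMessage
from langchain_community.chat_models import ChatKonko

chat = ChatKonko(max_tokens=100, model='mistralai/mistral-7b-instruct-v0.1')
messages = [
    SystemMessage(content="You are a concise assistant."),
    HumanMessage(content="What does Konko AI provide?"),
]
# The chat model returns an AIMessage; .content holds the reply text
print(chat(messages).content)
```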
- **Completion with mistralai/Mistral-7B-v0.1:**
```python
from langchain.llms import Konko
llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
prompt = "Generate a Product Description for Apple Iphone 15"
response = llm(prompt)
```
For further assistance, contact [support@konko.ai](mailto:support@konko.ai) or join our [Discord](https://discord.gg/TXV2s3z7RZ).