# Konko
All functionality related to Konko.
>[Konko AI](https://www.konko.ai/) provides a fully managed API to help application developers
>1. **Select** the right open source or proprietary LLMs for their application
>2. **Build** applications faster with integrations to leading application frameworks and fully managed APIs
>3. **Fine tune** smaller open-source LLMs to achieve industry-leading performance at a fraction of the cost
>4. **Deploy production-scale APIs** that meet security, privacy, throughput, and latency SLAs without infrastructure set-up or administration using Konko AI's SOC 2 compliant, multi-cloud infrastructure
## Installation and Setup
1. Sign in to our web app to [create an API key](https://platform.konko.ai/settings/api-keys) to access models via our endpoints for [chat completions](https://docs.konko.ai/reference/post-chat-completions) and [completions](https://docs.konko.ai/reference/post-completions).
2. Set up a Python 3.8+ environment
3. Install the SDK
```bash
pip install konko
```
4. Set the API keys as environment variables (`KONKO_API_KEY`, `OPENAI_API_KEY`)
```bash
export KONKO_API_KEY={your_KONKO_API_KEY_here}
export OPENAI_API_KEY={your_OPENAI_API_KEY_here} # Optional
```
Please see [the Konko docs](https://docs.konko.ai/docs/getting-started) for more details.
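If you are working in a notebook or prefer to configure the keys from within Python rather than the shell, a minimal sketch using only the standard library (the placeholder prompts are for you to fill in with your own keys):

```python
import os
from getpass import getpass

# Prompt for the Konko key and export it for the current process.
os.environ["KONKO_API_KEY"] = getpass("Konko API key: ")

# Optional: only needed if you also call OpenAI models through Konko.
# os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
```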
## LLM
**Explore Available Models:** Start by browsing through the [available models](https://docs.konko.ai/docs/list-of-models) on Konko. Each model caters to different use cases and capabilities.
Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/get-models).
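As a rough sketch of querying that endpoint directly, assuming the base URL `https://api.konko.ai/v1/models`, a Bearer-token header, and an OpenAI-style response payload (confirm all three against the endpoint reference linked above):

```python
import os

import requests

# Assumed endpoint and auth scheme; see the Konko API reference for the
# authoritative URL, headers, and response shape.
resp = requests.get(
    "https://api.konko.ai/v1/models",
    headers={"Authorization": f"Bearer {os.environ['KONKO_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model.get("id"))
```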
See a usage [example](/docs/integrations/llms/konko).
### Examples of Endpoint Usage
- **Completion with mistralai/Mistral-7B-v0.1:**
```python
from langchain_community.llms import Konko

llm = Konko(max_tokens=800, model='mistralai/Mistral-7B-v0.1')
prompt = "Generate a product description for the Apple iPhone 15"
response = llm.invoke(prompt)
```
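`invoke` on an LLM returns the completion as a plain string, so `response` can be printed directly or passed on to a downstream chain.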
## Chat Models
See a usage [example](/docs/integrations/chat/konko).
- **ChatCompletion with Mistral-7B:**
```python
from langchain_core.messages import HumanMessage
from langchain_community.chat_models import ChatKonko
chat_instance = ChatKonko(max_tokens=10, model='mistralai/mistral-7b-instruct-v0.1')
msg = HumanMessage(content="Hi")
chat_response = chat_instance.invoke([msg])
```
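`invoke` on a chat model returns an `AIMessage`; the model's reply text is available as `chat_response.content`.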
For further assistance, contact [support@konko.ai](mailto:support@konko.ai) or join our [Discord](https://discord.gg/TXV2s3z7RZ).