langchain/docs/extras/integrations/chat
Michael Haddad c6b27b3692
add konko chat_model files (#10267)
_Thank you to the LangChain team for the great project and in advance
for your review. Let me know if I can provide any other additional
information or do things differently in the future to make your lives
easier 🙏 _

@hwchase17 please let me know if you're not the right person to review 😄

This PR enables LangChain to access the Konko API via the chat_models
API wrapper.

Konko API is a fully managed API designed to help application
developers:

1. Select the right LLM(s) for their application
2. Prototype with various open-source and proprietary LLMs
3. Move to production in line with their security, privacy, throughput, and
latency SLAs, without infrastructure set-up or administration, using Konko
AI's SOC 2 compliant infrastructure

_Note on integration tests:_
We added 14 integration tests. They will all fail unless you export the
right API keys: 13 pass with a `KONKO_API_KEY` provided, and the remaining
one passes with an `OPENAI_API_KEY` provided. When both are provided,
all 14 integration tests pass. If you would like to test this yourself,
please let me know and I can provide some temporary keys.
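
For context, key-gated integration tests like these are commonly structured with a skip guard. Here is a hedged sketch using the standard library's `unittest` skip mechanism; the test name and body are hypothetical, not the PR's actual test code:

```python
import os
import unittest

# Read the required key from the environment; the test is skipped
# (not failed) when it is absent. Illustrative pattern only.
KONKO_KEY = os.environ.get("KONKO_API_KEY")

class TestKonkoChat(unittest.TestCase):
    @unittest.skipUnless(KONKO_KEY, "KONKO_API_KEY not set")
    def test_chat_completion(self):
        # A real test would call the Konko chat endpoint here.
        self.assertTrue(KONKO_KEY)
```

The same gating works per-key, which is how 13 tests can depend on one variable and 1 test on the other.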

### Installation and Setup

1. **First you'll need an API key**
2. **Install Konko AI's Python SDK**

   Enable a Python 3.8+ environment, then install the SDK:

   ```shell
   pip install konko
   ```

3. **Set API Keys**

   **Option 1:** Set Environment Variables

   You can set environment variables for:

   1. `KONKO_API_KEY` (Required)
   2. `OPENAI_API_KEY` (Optional)

   In your current shell session, use the `export` command:

   ```shell
   export KONKO_API_KEY={your_KONKO_API_KEY_here}
   export OPENAI_API_KEY={your_OPENAI_API_KEY_here} # Optional
   ```

   Alternatively, add the lines above to your shell startup script (such as
   `.bashrc` or `.bash_profile` for Bash, or `.zshrc` for Zsh) so they are
   set automatically every time a new shell session starts.
    
   **Option 2:** Set API Keys Programmatically

   If you prefer to set your API keys directly within your Python script or
   Jupyter notebook, use:

   ```python
   import konko

   konko.set_api_key('your_KONKO_API_KEY_here')
   konko.set_openai_api_key('your_OPENAI_API_KEY_here')  # Optional
   ```
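
Either option leaves your script with a usable key at runtime. A minimal sketch of combining them, preferring the exported environment variable from Option 1 and falling back to a programmatically supplied value (the placeholder string is illustrative, not a real credential):

```python
import os

# Prefer the exported environment variable (Option 1); fall back to a
# value you would otherwise pass programmatically (Option 2).
konko_key = os.environ.get("KONKO_API_KEY") or "placeholder-konko-key"
openai_key = os.environ.get("OPENAI_API_KEY")  # optional; None is acceptable
```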
    

### Calling a model

Find a model on the [Konko Introduction
page](https://docs.konko.ai/docs#available-models).

For example, take this [Llama 2
model](https://docs.konko.ai/docs/meta-llama-2-13b-chat).
Its model id is `"meta-llama/Llama-2-13b-chat-hf"`.
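
The id above follows an `organization/model-name` pattern; if you ever need the parts separately, a one-line split works (a convenience sketch, not part of the Konko or LangChain APIs):

```python
# Split the model id from the example into organization and model name.
model_id = "meta-llama/Llama-2-13b-chat-hf"
org, _, name = model_id.partition("/")
print(org, name)  # meta-llama Llama-2-13b-chat-hf
```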

Another way to find the list of models running on the Konko instance is
through this
[endpoint](https://docs.konko.ai/reference/listmodels).
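
A call against that endpoint can be sketched with the standard library. Note that the exact endpoint URL and the Bearer-token auth scheme below are assumptions based on common REST conventions, not confirmed from the page; verify both against the Konko reference docs:

```python
import os
import urllib.request

# Assumed endpoint URL and auth header -- check the Konko reference
# docs before relying on either.
KONKO_MODELS_URL = "https://api.konko.ai/v1/models"

req = urllib.request.Request(
    KONKO_MODELS_URL,
    headers={"Authorization": f"Bearer {os.environ.get('KONKO_API_KEY', '')}"},
)

# Actually sending the request requires a valid key:
# with urllib.request.urlopen(req) as resp:
#     models = resp.read()
```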

From here, we can initialize our model:

```python
from langchain.chat_models import ChatKonko

chat_instance = ChatKonko(max_tokens=10, model="meta-llama/Llama-2-13b-chat-hf")
```

And run it:

```python
from langchain.schema import HumanMessage

msg = HumanMessage(content="Hi")
chat_response = chat_instance([msg])
```