{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Konko\n",
"\n",
">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n",
"\n",
"1. Select the right LLM(s) for their application\n",
"2. Prototype with various open-source and proprietary LLMs\n",
"3. Move to production in line with their security, privacy, throughput, and latency SLAs, without infrastructure set-up or administration, using Konko AI's SOC 2 compliant infrastructure\n",
"\n",
"This example goes over how to use LangChain to interact with `Konko` [models](https://docs.konko.ai/docs/overview)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To run this notebook, you'll need a Konko API key. You can request one by messaging support@konko.ai."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.chat_models import ChatKonko\n",
"from langchain.prompts.chat import (\n",
"    ChatPromptTemplate,\n",
"    SystemMessagePromptTemplate,\n",
"    AIMessagePromptTemplate,\n",
"    HumanMessagePromptTemplate,\n",
")\n",
"from langchain.schema import AIMessage, HumanMessage, SystemMessage"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set API Keys\n",
"\n",
"### Option 1: Set Environment Variables\n",
"\n",
"1. You can set environment variables for\n",
"    1. KONKO_API_KEY (Required)\n",
"    2. OPENAI_API_KEY (Optional)\n",
"2. In your current shell session, use the export command:\n",
"\n",
"```shell\n",
"export KONKO_API_KEY={your_KONKO_API_KEY_here}\n",
"export OPENAI_API_KEY={your_OPENAI_API_KEY_here} # Optional\n",
"```\n",
"\n",
"Alternatively, you can add the above lines directly to your shell startup script (such as .bashrc or .bash_profile for Bash, or .zshrc for Zsh) so that they are set automatically every time a new shell session starts.\n",
"\n",
"### Option 2: Set API Keys Programmatically\n",
"\n",
"If you prefer to set your API keys directly within your Python script or Jupyter notebook, you can use the following commands:\n",
"\n",
"```python\n",
"konko.set_api_key('your_KONKO_API_KEY_here')\n",
"konko.set_openai_api_key('your_OPENAI_API_KEY_here')  # Optional\n",
"```\n"
]
},
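{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal sketch of Option 1 from inside Python, you can also set the variables for the current process via `os.environ` (standard library only, so this does not assume any `konko` SDK helpers) before initializing the model:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Set the required Konko key and the optional OpenAI key for this process only.\n",
"# Replace the placeholder values with your real keys.\n",
"os.environ[\"KONKO_API_KEY\"] = \"your_KONKO_API_KEY_here\"\n",
"os.environ[\"OPENAI_API_KEY\"] = \"your_OPENAI_API_KEY_here\"  # Optional"
]
},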
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Calling a model\n",
"\n",
"Find a model on the [Konko overview page](https://docs.konko.ai/docs/overview).\n",
"\n",
"For example, the model id for this [Llama 2 model](https://docs.konko.ai/docs/meta-llama-2-13b-chat) is `\"meta-llama/Llama-2-13b-chat-hf\"`.\n",
"\n",
"Another way to find the list of models running on the Konko instance is through this [endpoint](https://docs.konko.ai/reference/listmodels).\n",
"\n",
"From here, we can initialize our model:\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"chat = ChatKonko(max_tokens=400, model=\"meta-llama/Llama-2-13b-chat-hf\")"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\" Sure, I'd be happy to explain the Big Bang Theory briefly!\\n\\nThe Big Bang Theory is the leading explanation for the origin and evolution of the universe, based on a vast amount of observational evidence from many fields of science. In essence, the theory posits that the universe began as an infinitely hot and dense point, known as a singularity, around 13.8 billion years ago. This singularity expanded rapidly, and as it did, it cooled and formed subatomic particles, which eventually coalesced into the first atoms, and later into the stars and galaxies we see today.\\n\\nThe theory gets its name from the idea that the universe began in a state of incredibly high energy and temperature, and has been expanding and cooling ever since. This expansion is thought to have been driven by a mysterious force known as dark energy, which is thought to be responsible for the accelerating expansion of the universe.\\n\\nOne of the key predictions of the Big Bang Theory is that the universe should be homogeneous and isotropic on large scales, meaning that it should look the same in all directions and have the same properties everywhere. This prediction has been confirmed by a wealth of observational evidence, including the cosmic microwave background radiation, which is thought to be a remnant of the early universe.\\n\\nOverall, the Big Bang Theory is a well-established and widely accepted explanation for the origins of the universe, and it has been supported by a vast amount of observational evidence from many fields of science.\", additional_kwargs={}, example=False)"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
"    SystemMessage(content=\"You are a helpful assistant.\"),\n",
"    HumanMessage(content=\"Explain Big Bang Theory briefly\"),\n",
"]\n",
"chat(messages)"
]
},
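{
"cell_type": "markdown",
"metadata": {},
"source": [
"The prompt-template classes imported earlier can drive the same model. As a minimal sketch (the template text and the `topic` variable here are illustrative, not from Konko's docs), build a `ChatPromptTemplate` and format it into messages before calling `chat`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Build a reusable template; {topic} is filled in at call time.\n",
"prompt = ChatPromptTemplate.from_messages(\n",
"    [\n",
"        SystemMessagePromptTemplate.from_template(\"You are a helpful assistant.\"),\n",
"        HumanMessagePromptTemplate.from_template(\"Explain {topic} briefly\"),\n",
"    ]\n",
")\n",
"chat(prompt.format_messages(topic=\"Big Bang Theory\"))"
]
},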
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
},
"vscode": {
"interpreter": {
"hash": "a0a0263b650d907a3bfe41c0f8d6a63a071b884df3cfdc1579f00cdc1aed6b03"
}
}
},
"nbformat": 4,
"nbformat_minor": 4
}