{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Eden AI"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Eden AI is revolutionizing the AI landscape by uniting the best AI providers, empowering users to unlock limitless possibilities and tap into the true potential of artificial intelligence. With an all-in-one comprehensive and hassle-free platform, it allows users to deploy AI features to production lightning fast, enabling effortless access to the full breadth of AI capabilities via a single API. (website: https://edenai.co/)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This example goes over how to use LangChain to interact with Eden AI models\n",
"\n",
"-----------------------------------------------------------------------------------\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Accessing the EDENAI's API requires an API key, \n",
"\n",
"which you can get by creating an account https://app.edenai.run/user/register and heading here https://app.edenai.run/admin/account/settings\n",
"\n",
"Once we have a key we'll want to set it as an environment variable by running:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"export EDENAI_API_KEY=\"...\""
]
},
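{
"cell_type": "markdown",
"metadata": {},
"source": [
"When working inside a notebook, you can also set the variable from Python instead of the shell. Below is a minimal sketch using only the standard library; it prompts for the key so it never gets stored in the notebook itself:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"from getpass import getpass\n",
"\n",
"# EDENAI_API_KEY is the environment variable the EdenAI integration reads\n",
"os.environ[\"EDENAI_API_KEY\"] = getpass(\"Eden AI API key: \")"
]
},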
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you'd prefer not to set an environment variable you can pass the key in directly via the edenai_api_key named parameter\n",
"\n",
" when initiating the EdenAI LLM class:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from langchain.llms import EdenAI"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"llm = EdenAI(edenai_api_key=\"...\",provider=\"openai\", temperature=0.2, max_tokens=250)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Calling a model\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The EdenAI API brings together various providers, each offering multiple models.\n",
"\n",
"To access a specific model, you can simply add 'model' during instantiation.\n",
"\n",
"For instance, let's explore the models provided by OpenAI, such as GPT3.5 "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### text generation"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\" No, a dog cannot drive a car.\\n\\nReasoning: \\n1. Driving a car requires a driver's license, which is only issued to humans. \\n2. Dogs do not have the physical capability to operate a car, as they do not have hands to steer or feet to operate the pedals. \\n3. Dogs also do not have the mental capacity to understand the rules of the road and operate a car safely. \\n4. Therefore, a dog cannot drive a car.\""
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain import PromptTemplate, LLMChain\n",
"llm=EdenAI(feature=\"text\",provider=\"openai\",model=\"text-davinci-003\",temperature=0.2, max_tokens=250)\n",
"\n",
"prompt = \"\"\"\n",
"User: Answer the following yes/no question by reasoning step by step. Can a dog drive a car?\n",
"Assistant:\n",
"\"\"\"\n",
"\n",
"llm(prompt)"
]
},
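{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can also override the provider and model for a single call without re-creating the LLM. The cell below is a minimal sketch based on the example shipped with this integration; the call-time keyword arguments `providers` and `model` follow that example:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Route this one call to Google's 'text-bison' model instead of the provider/model set at initialization\n",
"llm(prompt, providers=\"google\", model=\"text-bison\")"
]
},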
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### image generation"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import base64\n",
"from io import BytesIO\n",
"from PIL import Image\n",
"import json\n",
"def print_base64_image(base64_string):\n",
" # Decode the base64 string into binary data\n",
" decoded_data = base64.b64decode(base64_string)\n",
"\n",
" # Create an in-memory stream to read the binary data\n",
" image_stream = BytesIO(decoded_data)\n",
"\n",
" # Open the image using PIL\n",
" image = Image.open(image_stream)\n",
"\n",
" # Display the image\n",
" image.show()"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"text2image = EdenAI(\n",
" feature=\"image\" ,\n",
" provider= \"openai\",\n",
" resolution=\"512x512\"\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"image_output = text2image(\"A cat riding a motorcycle by Picasso\")"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"print_base64_image(image_output)"
]
},
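{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you'd rather persist the generated image than display it, the same base64 string can be decoded and written to disk. A minimal sketch (the file name `cat_picasso.png` is arbitrary):\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Decode the base64 output and write the raw image bytes to a PNG file\n",
"with open(\"cat_picasso.png\", \"wb\") as f:\n",
"    f.write(base64.b64decode(image_output))"
]
},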
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### text generation with callback"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" No, a dog cannot drive a car.\n",
"\n",
"Reasoning:\n",
"\n",
"1. Driving a car requires a driver's license, which only humans can obtain. \n",
"2. Driving a car requires the ability to understand and follow traffic laws, which only humans can do. \n",
"3. Driving a car requires the ability to operate the car's controls, which only humans can do. \n",
"4. Therefore, a dog cannot drive a car.\n"
]
}
],
"source": [
"from langchain.llms import EdenAI\n",
"from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler\n",
"\n",
"llm = EdenAI(\n",
" callbacks=[StreamingStdOutCallbackHandler()],\n",
" feature=\"text\",provider=\"openai\", temperature=0.2,max_tokens=250\n",
")\n",
"prompt = \"\"\"\n",
"User: Answer the following yes/no question by reasoning step by step. Can a dog drive a car?\n",
"Assistant:\n",
"\"\"\"\n",
"print(llm(prompt))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Chaining Calls"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import SimpleSequentialChain\n",
"from langchain.prompts import PromptTemplate\n",
"from langchain.chains import LLMChain"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"llm = EdenAI(\n",
"feature=\"text\", provider=\"openai\", temperature=0.2, max_tokens=250\n",
")\n",
"text2image = EdenAI(\n",
"feature=\"image\", provider=\"openai\", resolution=\"512x512\"\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"prompt = PromptTemplate(\n",
" input_variables=[\"product\"],\n",
" template=\"What is a good name for a company that makes {product}?\",\n",
")\n",
"\n",
"chain = LLMChain(llm=llm, prompt=prompt)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"second_prompt = PromptTemplate(\n",
" input_variables=[\"company_name\"],\n",
" template=\"Write a description of a logo for this company: {company_name}, the logo should not contain text at all \",\n",
")\n",
"chain_two = LLMChain(llm=llm, prompt=second_prompt)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"third_prompt = PromptTemplate(\n",
" input_variables=[\"company_logo_description\"],\n",
" template=\"{company_logo_description}\",\n",
")\n",
"chain_three = LLMChain(llm=text2image, prompt=third_prompt)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new SimpleSequentialChain chain...\u001b[0m\n",
"\u001b[36;1m\u001b[1;3m\n",
"\n",
"Headwear Haven\u001b[0m\n",
"\u001b[33;1m\u001b[1;3m. It should be a simple and modern design that conveys the idea of a safe and comfortable place to purchase headwear. The logo should feature a stylized image of a hat, with a warm and inviting color palette. The design should be eye-catching and memorable, and should evoke a sense of trust and reliability.\u001b[0m\n",
"\u001b[38;5;200m\u001b[1;3miVBORw0KGgoAAAANSUhEUgAAAgAAAAIACAIAAAB7GkOtAAAAaGVYSWZNTQAqAAAACAACknwAAgAAACkAAAAmkoYAAgAAABgAAABQAAAAAE9wZW5BSS0tNThmMWVjYjVjNTNkOGY0NjFjMjBjNGI0YzRjMjllZGEAAE1hZGUgd2l0aCBPcGVuQUkgREFMTC1FALjVgBgAAQAASURBVHgBABaE6XsB9vX2AAD//wAAAAD/AAAAAP8AAAEBAP8AAAEAAAD//wABAf8A/wAAAAAAAAL/Af4AAAAA/wEAAP8AAAEAAP8BAAD/AAEAAAD/AP8CAAAAAQEA/wH/Af8AAAABAAD//wAAAQAA/v8AAwEA/v8BAgEA////AAABAAEAAAAAAP8AAAAA/wD/AQEAAAABAP8AAAEA////AQECAP//AAAAAAEA/wAAAAAAAf8AAAH//wD/AP8CAQAA/wEAAQEA//8AAAAAAv///gABAAH/AQABAAAA/wD+AQACAQH//wACAf//AAAA//8BAAH9/v8AAgEBAAAB/wD/AQEAAAAA//8AAAAA/wAAAQD/AQABAAAA/wAAAgAA/wABAAD///8AAQIB//8AAQAAAAAA////AQEBAAD/AAABAAD/AP8AAAEAAAAB/wAAAQEAAP//AAAAAAABAQAA/gAAAAAAAQEAAP8A/wD+AgEC/////wEBAf8AAAEAAAAAAAD/Af8A/gAAAAH/AAACAf7/AAEA/wEBAQAAAP8A/wAAAQD/AAEBAAAAAAAA/wD/Af8B/wH/Af8B/wD/AQAA/wAAAQABAAAA/wH/AP8AAQAAAQAA/gAAAgAC/wD+AAABAAAAAP8AAAH///8AAAEAAf8AAAABAAEAAP//AAEBAP//AQH//v8AAQECAAIAAP7/////AAEBAQD/AP8AAAEBAP8BAAIAAf8A//8AAAL/AAAA//8BAAH/AQABAAD/AAAAAP//AAABAQEA/wH//wABAf4AAAIBAP7//wAAAQEBAQAA//8AAf8AAAEB/gEAAgAA//8AAAD+AQEA/wAAAAABAP8AAAABAAD+AAEBAAAAAQD///8AAQAAAAAB/wAAAP8BAAH+AQAA/wEBAP8B/wD/AgAA/wAB///+AQAAAAEBAQEAAP8A/wAAAQAAAAAA////AAEBAQAA/wEA////AQABAP//AAEAAQEBAAD/AP8B/wEBAf4A/wH+AAABAAEA/v//AQAAAf8AAAEAAAAAAAABAAEAAAD/AP8AAQEB/wAAAP//AAEBAAAAAP8AAQAA/wD//wAAAQAAAAEA/gAAAf//AQEBAAD//wABAP8AAAD/AAACAQH/AP8A/wEBAf4B/wH/AQAAAAD/AAEB/wD/Af8B////AgIB/////wAAAQEAAP4BAAH/AQEAAAAB////AAAAAAAA/wAAAQAA/wAAAQAAAAECAQH//v3/AQEBAAD//wAA/wAAAQD/AQEC///+AAABAQEB//8AAQEAAAD/AP8BAP8A/wH///8AAAEAAgD/AAACAAD//wAAAQABAf8A/wH/AAAAAP8AAAAAAAEAAAD//wACAQAB/wD/AQD/AAAAAAABAAD//wABAQD+AQACAAAA/wABAQD+AAABAQD//v8BAAH/AgEB/gAAAP8AAAAAAAD//wAAAQABAAD/AQEB//8AAQAA/gD/AP8AAQH//wEBAf4BAAEAAP//AAEA/wD/AQABAAAAAP8BAAEA//8AAAABAAH/Af/+/wEBAgABAAAA/wAAAAD/AQAAAP8BAAL///8BAP///wEAAv8B/wH/AAEAAP8AAAH/AP8B//8AAQEAAAABAAAB/gH/Av8A/wD+AQEBAP4BAAH/AQAC/wAAAAH/AP//AQAB/wAAAP8A/wD/AQEA/wEAAf4AAAAA/wABAQAAAQH//wAAAf8AAAEB/wAAAAH/AP4BAQEA/gD/Af8BAAH/AAACAAH/AAAAAP7/AAABAAAAAQIC/v/9AAAAAgEA//8AAAAAAP8AAAEB/wH/AQAAAP8BAAD//wABAP/+AQEBAAAB/gH/Av4BAQH//v8BAQL/AP8BAP//AAEA/wABAf8AAQEA/wAAAQAA/gD/AgEB//8AAP8AAAH//wEBAQAAAf8B/gD+AQEBAP//AQAB/wH/AP8B/wAAAgEA/wAAAP8AAQAA/wABAAD/AAH/AP8AAAACAAD+AAABAQH//wAA//8AAQEBAP8A/wAAAQAAAP8AAAAAAAEAAAAAAP8BAQH/AAABAP///gEAAv8B/gD+AfX19QAA//8AAAEAAAEAAP//AQAB/wAAAAAAAAAAAQD//wAAAAABAAD/Af8B/wH/AAEBAf8B//7//wMBAf//AAAAAP//AAAAAAEBAP8AAAAAAQH//wAB/wD/AgAAAAABAAABAAD+//8A/wEAAgEBAP//AAECAP//AAAA/wEAAP8BAAH+AQABAAAA////AQAB////AAECAgD+/wAAAP8AAAEBAgEA/v8AAAAA/wAAAQAAAAD//wABAQAA////AQAAAAEB/wAAAf8A/wEAAAAAAQEA/wAAAf///wAAAAAAAQEA/wABAf8A/wEBAAD/AAD///4BAQEBAQAA/////wEAAQAAAAD/Af8BAAAA/gEAAAD/AQACAf/+AAEA/gAAAQAAAP8BAQEAAP8A/wL/Af8B/wAAAAAAAAAAAf8A/wEBAAD+/wABAgAB////AAIAAf8AAAAAAAD//wABAf8AAQEA/wAAAP8AAAAAAAABAAH/AAEB//8AAAAAAQH///4AAAEBAQAAAP8A///+AAEAAf8B/wIBAAD+AAACAAAAAAD/AQAAAAAB//7+AQICAAAA/wD/AP8BAAD/AAAAAQEBAP//AAIBAP8AAAD+/wAAAQAAAAABAAABAAH/AP8B/wH/Av8A/wAAAAAA/wABAQD//wABAAH/AQAA/////wABAf8AAAABAAEAAAEAAP///wAAAQAAAAD///8AAgEAAAABAAAAAAAB/wEAAP8AAAAAAgH///8B/wD/AAAB/wEAAv8A/wAAAAAAAAH/Af8BAAAAAAEAAAAAAP7/AQEA/wAAAAEAAP8AAAAAAAAA/wEBAAD/Af8AAP8AAAEB////AAEAAAAAAQEA/wAAAf//AAACAAH//wD/Af4B/wEAAAEAAf8AAAAAAAAAAAEAAAAA/wABAQD//wAAAAAAAQABAAD/AP8AAAD//wABAP8BAQEAAf///wAB/wL/AP8AAQAB/wD/AAAAAQAA/wD/AQAB/wAAAAEAAQABAAD+/wABAf4A/wABAAD/AAEA/wD/AgACAAAAAQAA/////wEBAAH/Af4AAAH//wABAAIBAf4A////AAEBAP8AAAEAAAD/AAABAAD+AAACAAAAAAD/Af8AAAEAAAD///8BAQEA/wEAAAAAAf8A//8A/wEAAf//AQEB/wD/AAEBAQAA/v8AAQEAAP8A/wAAAQAAAgAA/wAA/wABAQD/AAAA/wAAAAAAAQAAAAAAAAAA/wAAAf8B/wD/AAABAAH+AAA
BAAAAAQAA//8AAAEAAAH/AP4BAQAA/wH/AgEC/v8AAQAA//8AAAEAAAD//wEAAf//AP8BAQEBAAH+//8BAQABAQD//gEAAP///wEBAv//AP8BAAAA/wEBAQD/AAH/AP8BAAH///8BAAEAAAAAAP//AAEB/wD/AQABAP8AAQAAAAEAAP8B/wD/AQD//wAAAAEBAAEAAf8A/wABAQD/AAAA//4AAQEA//8AAAEBAQD/AAH///8BAAD/AAEBAf//AAEBAP//AAAAAAABAAEAAP8AAQAA/gAAAQAAAAD///8BAAEAAAEAAP8AAQAA/gEBAv//AAAB/wD+AQACAAD/AAAAAAAA/gAAAgABAAD/AQD//gACAQD//wEBAf/+AAABAAEAAAAA//8BAQD/AAEBAAD/AAAAAAEAAP8AAAAAAP8AAAAAAAEBAv8A/gH/AP8B/wD/AQAA/wAAAQD//wABAQABAAAA/wEAAf//////AQICAAD/AAAAAAAAAP8B/wD/AP8A/wAAAQIBAf7+/wICAP7/A
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
}
],
"source": [
"# Run the chain specifying only the input variable for the first chain.\n",
"overall_chain = SimpleSequentialChain(\n",
" chains=[chain, chain_two, chain_three],verbose=True\n",
")\n",
"output = overall_chain.run(\"hats\")\n"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"#print the image\n",
"print_base64_image(output)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
}