This page covers how to use the Banana ecosystem within LangChain.

It is broken into two parts: installation and setup, and then references to specific Banana wrappers.

## Installation and Setup

- Install with `pip3 install banana-dev`
- Get a Banana API key and set it as an environment variable (`BANANA_API_KEY`), as shown in the sketch below
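
For example, the key can be set from Python before any Banana calls are made. This is only a minimal sketch; `"YOUR_API_KEY"` is a placeholder for the key from your Banana dashboard, and exporting `BANANA_API_KEY` in your shell works just as well.

```python
import os

# Placeholder value: use the API key from your Banana dashboard.
os.environ["BANANA_API_KEY"] = "YOUR_API_KEY"
```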
## Define your Banana Template
You can check out an example Banana repository [here](https://github.com/conceptofmind/serverless-template-palmyra-base).
## Build the Banana app
|
|
|
|
|
|
|
|
|
|
You must include a output in the result. There is a rigid response structure.
|
|
|
|
|
Banana Apps must include the "output" key in the return json.
|
|
|
|
|
There is a rigid response structure.
|
|
|
|
|
|
|
|
|
|
```python
|
|
|
|
|
# Return the results as a dictionary
|
|
|
|
|
result = {'output': result}
|
|
|
|
|
```
An example inference function, with the model-specific parts reduced to a placeholder, would be:

```python
def inference(model_inputs: dict) -> dict:
    global model
    # Run your model on the parsed inputs; run_model is a stand-in for your own code.
    result = run_model(model_inputs)
    # Return the results as a dictionary with the required "output" key.
    return {'output': result}
```
You can find a full example of a Banana app [here](https://github.com/conceptofmind/serverless-template-palmyra-base/blob/main/app.py).
## Wrappers
### LLM
There exists a Banana LLM wrapper, which you can access with:

```python
from langchain.llms import Banana
```
You need to provide a model key located in the dashboard:

```python
llm = Banana(model_key="YOUR_MODEL_KEY")
```
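
Once the API key and model key are in place, the wrapper behaves like any other LangChain LLM. A minimal usage sketch, assuming the deployed Banana model takes a text prompt and returns generated text:

```python
from langchain.llms import Banana

# "YOUR_MODEL_KEY" is a placeholder from the Banana dashboard;
# BANANA_API_KEY must already be set in the environment.
llm = Banana(model_key="YOUR_MODEL_KEY")

# Call the deployed model with a prompt and print the generated text.
print(llm("What is the capital of France?"))
```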