Harrison/banana fix (#1311)

Co-authored-by: Erik Dunteman <44653944+erik-dunteman@users.noreply.github.com>
Harrison Chase 1 year ago committed by GitHub
parent 648b3b3909
commit 81abcae91a

@@ -4,8 +4,9 @@ This page covers how to use the Banana ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Banana wrappers.
## Installation and Setup
- Install with `pip3 install banana-dev`
- Get a Banana API key and set it as an environment variable (`BANANA_API_KEY`)
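The environment variable can also be set from Python before constructing the wrapper; a minimal sketch (the key value is a placeholder, not a real credential):

```python
import os

# Placeholder credential -- substitute your real Banana API key.
os.environ["BANANA_API_KEY"] = "your-api-key"
```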
## Define your Banana Template
@@ -15,13 +16,16 @@ You can check out an example Banana repository [here](https://github.com/concept
## Build the Banana app
There is a rigid response structure: Banana apps must include the `"output"` key in the returned JSON.
```python
# Return the results as a dictionary
result = {'output': result}
```
An example inference function would be:
```python
def inference(model_inputs: dict) -> dict:
    global model
@@ -58,17 +62,18 @@ def inference(model_inputs:dict) -> dict:
You can find a full example of a Banana app [here](https://github.com/conceptofmind/serverless-template-palmyra-base/blob/main/app.py).
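For reference, a minimal self-contained handler of the required shape might look like the following sketch; the echo logic is a stand-in for real model inference, not Banana's API:

```python
# Minimal sketch of a Banana inference handler (model call stubbed out).
def inference(model_inputs: dict) -> dict:
    prompt = model_inputs.get("prompt", "")
    result = f"echo: {prompt}"  # stand-in for actual model output
    # Banana requires the "output" key in the returned JSON.
    return {"output": result}
```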
## Wrappers
### LLM
There exists a Banana LLM wrapper, which you can access with
```python
from langchain.llms import Banana
```
You need to provide a model key located in the dashboard:
```python
llm = Banana(model_key="YOUR_MODEL_KEY")
```

@@ -100,10 +100,15 @@ class Banana(LLM, BaseModel):
response = banana.run(api_key, model_key, model_inputs)
try:
    text = response["modelOutputs"][0]["output"]
except (KeyError, TypeError):
    returned = response["modelOutputs"][0]
    raise ValueError(
        "Response should be of schema: {'output': 'text'}."
        f"\nResponse was: {returned}"
        "\nTo fix this:"
        "\n- fork the source repo of the Banana model"
        "\n- modify app.py to return the above schema"
        "\n- deploy that as a custom repo"
    )
if stop is not None:
    # I believe this is required since the stop tokens
