Manifest is meant to be a very lightweight package to help with prompt iteration.
## Prompts
A Manifest prompt is a function that accepts a single input to generate a string prompt to send to a model.
```python
from manifest import Prompt
prompt = Prompt(lambda x: f"Hello, my name is {x}")
print(prompt("Laurel"))
>>> Hello, my name is Laurel
```
We also let you use static strings:
```python
prompt = Prompt("Hello, my name is static")
print(prompt())
>>> Hello, my name is static
```
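Under the hood, a prompt is just a callable. The pattern above can be sketched in plain Python (a hypothetical `SimplePrompt`, not Manifest's actual implementation):

```python
class SimplePrompt:
    """A callable that either formats an input through a template
    function or returns a static string."""

    def __init__(self, template):
        self.template = template

    def __call__(self, inp=None):
        if callable(self.template):
            return self.template(inp)
        return self.template

greet = SimplePrompt(lambda x: f"Hello, my name is {x}")
static = SimplePrompt("Hello, my name is static")
print(greet("Laurel"))   # Hello, my name is Laurel
print(static())          # Hello, my name is static
```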
## Sessions
Each Manifest run is a session that connects to a model endpoint and backend database to record prompt queries. To start a Manifest session for OpenAI, make sure you run
```bash
export OPENAI_API_KEY=<OPENAIKEY>
```
so we can access OpenAI.
Then, in a notebook, run:
```python
from manifest import Manifest
manifest = Manifest(
    client_name = "openai",
    cache_name = "sqlite",
    cache_connection = "sqlite.cache",
)
```
This will start a session with OpenAI and save all results to a local file called `sqlite.cache`.
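The caching idea can be sketched in plain Python (a hypothetical `run_with_cache` helper, not Manifest's actual schema): responses are keyed on the exact prompt text plus the request parameters, so repeating the same query returns the stored result instead of re-hitting the model.

```python
import hashlib
import json
import sqlite3

def cache_key(prompt: str, params: dict) -> str:
    # Key on the prompt text plus request parameters, so changing
    # either one results in a fresh model query.
    blob = json.dumps({"prompt": prompt, "params": params}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

conn = sqlite3.connect(":memory:")  # stand-in for the sqlite.cache file
conn.execute("CREATE TABLE cache (key TEXT PRIMARY KEY, response TEXT)")

def run_with_cache(prompt: str, params: dict, model_fn):
    key = cache_key(prompt, params)
    row = conn.execute(
        "SELECT response FROM cache WHERE key = ?", (key,)
    ).fetchone()
    if row is not None:
        return row[0]  # cache hit: skip the model call
    response = model_fn(prompt)
    conn.execute("INSERT INTO cache VALUES (?, ?)", (key, response))
    return response
```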
We also support a Redis backend. If you have a Redis database running on port 6379, run
```python
manifest = Manifest(
    client_name = "openai",
    cache_name = "redis",
)
```
We will explain [below](#huggingface-models) how to use Manifest for a locally hosted model.
Once you have a session open, you can write and develop prompts.
```python
prompt = Prompt(lambda x: f"Hello, my name is {x}")
result = manifest.run(prompt, "Laurel")
```
By default, we do not truncate results based on a stop token. You can change this by either passing a new stop token to a Manifest session or to a `run` or `batch_run`. If you set the stop token to `""`, we will not truncate the model output.
```python
result = manifest.run(prompt, "Laurel", stop_token="and")
```
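Stop-token truncation can be sketched as splitting the output at the first occurrence of the token (a hypothetical `truncate` helper, not Manifest's internals):

```python
def truncate(output: str, stop_token: str) -> str:
    # An empty stop token means: return the output untruncated.
    if not stop_token:
        return output
    # Keep only the text before the first occurrence of the stop token.
    return output.split(stop_token, 1)[0]
```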
If you want to change a model's default parameters, pass those as `kwargs` to the client.
```python
result = manifest.run(prompt, "Laurel", max_tokens=50)
```
# Huggingface Models
To use a HuggingFace generative model, we provide a Flask application in `manifest/api` that hosts the models for you.
In a separate terminal or Tmux/Screen session, start the Flask application.
You will see the Flask session start and output a URL `http://127.0.0.1:5000`. Pass this in to Manifest. If you want to use a different port, set the `FLASK_PORT` environment variable.
```python
manifest = Manifest(
    client_name = "huggingface",
    client_connection = "http://127.0.0.1:5000",
)
```
If you have a custom model you trained, pass the model path to `--model_name`.
**Auto deployment coming soon**
# Development
Before submitting a PR, run
```bash
export REDIS_PORT="6380" # or whatever port your local Redis is running on for those tests