updated readme

model_as_env_variable
Jonathan Dunn 7 months ago
parent 38c09afc85
commit 39c4636148

@@ -194,10 +194,8 @@ Once you have it all set up, here's how to use it.
 `fabric -h`
 ```bash
-fabric [-h] [--text TEXT] [--copy] [--agents {trip_planner,ApiKeys}]
-       [--output [OUTPUT]] [--stream] [--list] [--update]
-       [--pattern PATTERN] [--setup] [--local] [--model MODEL]
-       [--listmodels] [--context]
+fabric [-h] [--text TEXT] [--copy] [--agents {trip_planner,ApiKeys}] [--output [OUTPUT]] [--stream] [--list] [--update] [--pattern PATTERN] [--setup] [--local] [--model MODEL] [--listmodels]
+       [--context]
 
 An open source framework for augmenting humans using AI.
@@ -206,14 +204,10 @@ options:
   --text TEXT, -t TEXT  Text to extract summary from
   --copy, -C            Copy the response to the clipboard
   --agents {trip_planner,ApiKeys}, -a {trip_planner,ApiKeys}
-                        Use an AI agent to help you with a task. Acceptable
-                        values are 'trip_planner' or 'ApiKeys'. This option
-                        cannot be used with any other flag.
+                        Use an AI agent to help you with a task. Acceptable values are 'trip_planner' or 'ApiKeys'. This option cannot be used with any other flag.
   --output [OUTPUT], -o [OUTPUT]
                         Save the response to a file
-  --stream, -s          Use this option if you want to see the results in
-                        realtime. NOTE: You will not be able to pipe the
-                        output into another command.
+  --stream, -s          Use this option if you want to see the results in realtime. NOTE: You will not be able to pipe the output into another command.
   --list, -l            List available patterns
   --update, -u          Update patterns
   --pattern PATTERN, -p PATTERN
@@ -221,10 +215,9 @@ options:
   --setup               Set up your fabric instance
   --local, -L           Use local LLM. Default is llama2
   --model MODEL, -m MODEL
-                        Select the model to use (GPT-4 by default)
+                        Select the model to use (GPT-4 by default for chatGPT and llama2 for Ollama)
   --listmodels          List all available models
-  --context, -c         Use Context file (context.md) to add context to your
-                        pattern
+  --context, -c         Use Context file (context.md) to add context to your pattern
 ```
 #### Example commands

@@ -46,7 +46,7 @@ def main():
     parser.add_argument(
         '--local', '-L', help="Use local LLM. Default is llama2", action="store_true")
     parser.add_argument(
-        "--model", "-m", help="Select the model to use (GPT-4 by default)", default="gpt-4-turbo-preview"
+        "--model", "-m", help="Select the model to use (GPT-4 by default for chatGPT and llama2 for Ollama)", default="gpt-4-turbo-preview"
     )
     parser.add_argument(
         "--listmodels", help="List all available models", action="store_true"
