mirror of https://github.com/danielmiessler/fabric
synced 2024-11-08 07:11:06 +00:00
commit 42d31ecbfe
README.md

@@ -93,15 +93,15 @@ One of `fabric`'s primary features is helping people collect and inte

Fabric has Patterns for all sorts of life and work activities, including:

- Extracting the most interesting parts of YouTube videos and podcasts.
- Writing an essay in your own voice with just an idea as an input.
- Summarizing opaque academic papers.
- Creating perfectly matched AI art prompts for a piece of writing.
- Rating the quality of content to see if you want to read/watch the whole thing.
- Getting summaries of long, boring content.
- Explaining code to you.
- Turning bad documentation into usable documentation.
- Creating social media posts from any content input.
- And a million more…

### Our approach to prompting

@@ -110,7 +110,7 @@ Fabric _Patterns_ are different than most prompts you'll see.

- **First, we use `Markdown` to help ensure maximum readability and editability**. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. _Importantly, this also includes the AI you're sending it to!_

Here's an example of a Fabric Pattern.

```bash
https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/system.md
```

@@ -127,11 +127,11 @@ https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/syste

The most feature-rich way to use Fabric is to use the `fabric` client, which can be found under the <a href="https://github.com/danielmiessler/fabric/tree/main/installer/client">`/client`</a> directory in this repository.

### Required Python Version

Ensure you have at least python3.10 installed on your operating system. Otherwise, when you attempt to run the pip install commands, the project will fail to build due to certain dependencies.
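
To confirm this before installing, a quick check like the one below should report 3.10 or newer (the `python3` binary name is an assumption; on some systems it may be `python3.10` or similar):

```bash
# Print the installed Python version; it should be at least 3.10
python3 --version
```
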
### Setting up the fabric commands

Follow these steps to get all fabric-related apps installed and configured.

1. Navigate to where you want the Fabric project to live on your system, ideally a semi-permanent place on your computer.

@@ -147,7 +147,7 @@ cd /where/you/keep/code

```bash
git clone https://github.com/danielmiessler/fabric.git
```

3. Enter Fabric's main directory.

```bash
# Enter the project folder (where you cloned it)
cd fabric
```

@@ -172,7 +172,7 @@ Windows:

Use WSL and follow the Linux instructions.

5. Install fabric:

```bash
pipx install .
```

@@ -198,16 +198,16 @@ fabric --help

### Using the `fabric` client

If you want to use it with OpenAI API-compatible inference servers, such as [FastChat](https://github.com/lm-sys/FastChat), [Helmholtz Blablador](http://helmholtz-blablador.fz-juelich.de), [LM Studio](https://lmstudio.ai) and others, simply export the following environment variables:

- `export OPENAI_BASE_URL=https://YOUR-SERVER:8000/v1/`
- `export DEFAULT_MODEL="YOUR_MODEL"`

And if your server needs authentication tokens, as Blablador does, you export the token the same way you would with OpenAI:

- `export OPENAI_API_KEY="YOUR TOKEN"`
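
As a concrete sketch, pointing the client at a locally hosted, OpenAI API-compatible server might look like the lines below; the host, port, model name, and token are placeholders rather than values taken from this README:

```bash
# Placeholder values: adjust the URL, model name, and token to your own server
export OPENAI_BASE_URL=http://localhost:1234/v1/
export DEFAULT_MODEL="my-local-model"
export OPENAI_API_KEY="YOUR TOKEN"   # only needed if the server requires a token
```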

Once you have it all set up, here's how to use it:

1. Check out the options
`fabric -h`

@@ -218,7 +218,7 @@ usage: fabric [-h] [--text TEXT] [--copy] [--agents] [--output [OUTPUT]] [--sess

[--presence_penalty PRESENCE_PENALTY] [--update] [--pattern PATTERN] [--setup] [--changeDefaultModel CHANGEDEFAULTMODEL] [--model MODEL] [--listmodels]
[--remoteOllamaServer REMOTEOLLAMASERVER] [--context]

An open-source framework for augmenting humans using AI.

options:
-h, --help show this help message and exit

@@ -232,13 +232,13 @@ options:

--gui Use the GUI (Node and npm need to be installed)
--stream, -s Use this option if you want to see the results in realtime. NOTE: You will not be able to pipe the output into another command.
--list, -l List available patterns
--temp TEMP sets the temperature for the model. Default is 0
--top_p TOP_P set the top_p for the model. Default is 1
--frequency_penalty FREQUENCY_PENALTY
sets the frequency penalty for the model. Default is 0.1
--presence_penalty PRESENCE_PENALTY
sets the presence penalty for the model. Default is 0.1
--update, -u Update patterns. NOTE: This will revert the default model to gpt4-turbo. please run --changeDefaultModel to once again set the default model
--pattern PATTERN, -p PATTERN
The pattern (prompt) to use
--setup Set up your fabric instance

@@ -248,7 +248,7 @@ options:

Select the model to use
--listmodels List all available models
--remoteOllamaServer REMOTEOLLAMASERVER
The URL of the remote ollamaserver to use. ONLY USE THIS if you are using a local ollama server in a non-default location or port
--context, -c Use Context file (context.md) to add context to your pattern
```
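
For example, once everything is configured, a typical run pipes some text into a Pattern. This is only a sketch: the `summarize` Pattern and `pbpaste` (the macOS clipboard command) are illustrative choices, not requirements from this section:

```bash
# Send the clipboard contents through the summarize pattern and stream the result
pbpaste | fabric --pattern summarize --stream
```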

@@ -324,7 +324,7 @@ One of the coolest parts of the project is that it's **command-line native**!

Each Pattern you see in the `/patterns` directory can be used in any AI application you use, but you can also set up your own server using the `/server` code and then call APIs directly!

Once you're set up, you can do things like:

```bash
# Take any idea from `stdin` and send it to the `/write_essay` API!
```
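
As an illustrative sketch of that kind of call through the `fabric` client itself (the input sentence is invented, and it assumes the `write_essay` Pattern from the `/patterns` directory):

```bash
# Pipe a one-line idea from stdin into the write_essay pattern (example input only)
echo "Coding is like speaking a language with unusually strict rules" | fabric --pattern write_essay
```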

@@ -423,7 +423,7 @@ The content features a conversation between two individuals discussing various t

1. "You can't necessarily think yourself into the answers. You have to create space for the answers to come to you."
2. "The West is dying and we are killing her."
3. "The American Dream has been replaced by mass-packaged mediocrity porn, encouraging us to revel like happy pigs in our own meekness."
4. "There's just not that many people who have the courage to reach beyond consensus and go explore new ideas."
5. "I'll start watching Netflix when I've read the whole of human history."
6. "Rilke saw beauty in everything... He sees it's in one little thing, a representation of all things that are beautiful."