fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.


Fabric is graciously supported by…


fabric


fabric is an open-source framework for augmenting humans using AI.

Updates • What and Why • Philosophy • Installation • Usage • Examples • Just Use the Patterns • Custom Patterns • Helper Apps • Meta

Screenshot of fabric


Updates

Note

May 22, 2025

What and why

Since the rise of GenAI at the start of 2023, we've seen a massive number of AI applications for accomplishing tasks. The technology is powerful, but it's not easy to integrate it into our lives.

In other words, AI doesn't have a capabilities problem—it has an integration problem.

Fabric was created to address this by enabling everyone to granularly apply AI to everyday challenges.

Intro videos

Keep in mind that many of these were recorded when Fabric was Python-based, so remember to use the current install instructions below.

Philosophy

AI isn't a thing; it's a magnifier of a thing. And that thing is human creativity.

We believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the human problems we want to solve.

Breaking problems into components

Our approach is to break problems into individual pieces and then apply AI to them one at a time. See below for some examples.

augmented_challenges

Too many prompts

Prompts are good for this, but the biggest challenge I faced in 2023, and one that still exists today, is the sheer number of AI prompts out there. We all have prompts that are useful, but it's hard to discover new ones, know if they are good or not, and manage different versions of the ones we like.

One of fabric's primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives.

Fabric has Patterns for all sorts of life and work activities, including:

  • Extracting the most interesting parts of YouTube videos and podcasts
  • Writing an essay in your own voice with just an idea as an input
  • Summarizing opaque academic papers
  • Creating perfectly matched AI art prompts for a piece of writing
  • Rating the quality of content to see if you want to read/watch the whole thing
  • Getting summaries of long, boring content
  • Explaining code to you
  • Turning bad documentation into usable documentation
  • Creating social media posts from any content input
  • And a million more…

Installation

To install Fabric, you can use the latest release binaries or install it from source.

Get Latest Release Binaries

Windows

https://github.com/danielmiessler/fabric/releases/latest/download/fabric-windows-amd64.exe

macOS (arm64)

curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-arm64 > fabric && chmod +x fabric && ./fabric --version

macOS (amd64)

curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-amd64 > fabric && chmod +x fabric && ./fabric --version

Linux (amd64)

curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-amd64 > fabric && chmod +x fabric && ./fabric --version

Linux (arm64)

curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-arm64 > fabric && chmod +x fabric && ./fabric --version
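
The curl commands above leave the fabric binary in your current directory. To invoke it as fabric from anywhere, you can move it somewhere on your PATH; /usr/local/bin is one common choice:

# Move the downloaded binary onto your PATH (adjust the destination to taste)
sudo mv fabric /usr/local/bin/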

Using package managers

NOTE: installing via Homebrew or the Arch Linux package manager makes fabric available as fabric-ai, so add the following alias to your shell startup files to account for this:

alias fabric='fabric-ai'

macOS (Homebrew)

brew install fabric-ai

Arch Linux (AUR)

yay -S fabric-ai

From Source

To install Fabric, make sure Go is installed, and then run the following command.

# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric@latest

Environment Variables

You may need to set some environment variables in your ~/.bashrc file on Linux or your ~/.zshrc file on macOS to be able to run the fabric command. Here is an example of what you can add:

For Intel-based Macs or Linux

# Golang environment variables
export GOROOT=/usr/local/go
export GOPATH=$HOME/go

# Update PATH to include GOPATH and GOROOT binaries
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH

For Apple Silicon-based Macs

# Golang environment variables
export GOROOT=$(brew --prefix go)/libexec
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH

Setup

Now run the following command:

# Run the setup to set up your directories and keys
fabric --setup

If everything works, you're good to go.
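
As a quick check, you can ask fabric to list the Patterns it downloaded during setup (the --listpatterns flag is documented in the Usage section below):

# Verify the install by listing the available patterns
fabric --listpatterns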

Add aliases for all patterns

To add aliases for all your patterns and use them directly as commands, i.e. summarize instead of fabric --pattern summarize, add the following to your .zshrc or .bashrc file.

# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in $HOME/.config/fabric/patterns/*; do
    # Get the base name of the file (i.e., remove the directory path)
    pattern_name=$(basename "$pattern_file")

    # Create an alias in the form: alias pattern_name="fabric --pattern pattern_name"
    alias_command="alias $pattern_name='fabric --pattern $pattern_name'"

    # Evaluate the alias command to add it to the current shell
    eval "$alias_command"
done

yt() {
    if [ "$#" -eq 0 ] || [ "$#" -gt 2 ]; then
        echo "Usage: yt [-t | --timestamps] youtube-link"
        echo "Use the '-t' flag to get the transcript with timestamps."
        return 1
    fi

    transcript_flag="--transcript"
    if [ "$1" = "-t" ] || [ "$1" = "--timestamps" ]; then
        transcript_flag="--transcript-with-timestamps"
        shift
    fi
    local video_link="$1"
    fabric -y "$video_link" $transcript_flag
}

You can add the equivalent aliases inside PowerShell by running notepad $PROFILE in a PowerShell window and adding the following code:

# Path to the patterns directory
$patternsPath = Join-Path $HOME ".config/fabric/patterns"
foreach ($patternDir in Get-ChildItem -Path $patternsPath -Directory) {
    $patternName = $patternDir.Name

    # Dynamically define a function for each pattern
    $functionDefinition = @"
function $patternName {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline = `$true)]
        [string] `$InputObject,

        [Parameter(ValueFromRemainingArguments = `$true)]
        [String[]] `$patternArgs
    )

    begin {
        # Initialize an array to collect pipeline input
        `$collector = @()
    }

    process {
        # Collect pipeline input objects
        if (`$InputObject) {
            `$collector += `$InputObject
        }
    }

    end {
        # Join all pipeline input into a single string, separated by newlines
        `$pipelineContent = `$collector -join "`n"

        # If there's pipeline input, include it in the call to fabric
        if (`$pipelineContent) {
            `$pipelineContent | fabric --pattern $patternName `$patternArgs
        } else {
            # No pipeline input; just call fabric with the additional args
            fabric --pattern $patternName `$patternArgs
        }
    }
}
"@
    # Add the function to the current session
    Invoke-Expression $functionDefinition
}

# Define the 'yt' function as well
function yt {
    [CmdletBinding()]
    param(
        [Parameter()]
        [Alias("timestamps")]
        [switch]$t,

        [Parameter(Position = 0, ValueFromPipeline = $true)]
        [string]$videoLink
    )

    begin {
        $transcriptFlag = "--transcript"
        if ($t) {
            $transcriptFlag = "--transcript-with-timestamps"
        }
    }

    process {
        if (-not $videoLink) {
            Write-Error "Usage: yt [-t | --timestamps] youtube-link"
            return
        }
    }

    end {
        if ($videoLink) {
            # Execute and allow output to flow through the pipeline
            fabric -y $videoLink $transcriptFlag
        }
    }
}

This also creates a yt alias that allows you to use yt https://www.youtube.com/watch?v=4b0iet22VIk to get transcripts, comments, and metadata.

Save your files in markdown using aliases

If, in addition to the aliases above, you would like the option to save the output to your favorite markdown note vault such as Obsidian, then instead of the above add the following to your .zshrc or .bashrc file:

# Define the base directory for Obsidian notes
obsidian_base="/path/to/obsidian"

# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in ~/.config/fabric/patterns/*; do
    # Get the base name of the file (i.e., remove the directory path)
    pattern_name=$(basename "$pattern_file")

    # Remove any existing alias with the same name
    unalias "$pattern_name" 2>/dev/null

    # Define a function dynamically for each pattern
    eval "
    $pattern_name() {
        local title=\$1
        local date_stamp=\$(date +'%Y-%m-%d')
        local output_path=\"\$obsidian_base/\${date_stamp}-\${title}.md\"

        # Check if a title was provided
        if [ -n \"\$title\" ]; then
            # If a title is provided, use the output path
            fabric --pattern \"$pattern_name\" -o \"\$output_path\"
        else
            # If no title is provided, use --stream
            fabric --pattern \"$pattern_name\" --stream
        fi
    }
    "
done

This lets you use the patterns as aliases like in the above, for example summarize instead of fabric --pattern summarize --stream. However, if you pass in an extra argument, like summarize "my_article_title", your output will be saved to the destination you set in obsidian_base="/path/to/obsidian", in the format YYYY-MM-DD-my_article_title.md, where the date is generated for you. You can tweak the date format by adjusting the date_stamp format.
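
For example, with the functions above loaded and pbpaste providing the input on macOS, a run might look like this (my_article_title is just a placeholder):

# Summarize the clipboard contents and save the result to the Obsidian vault
pbpaste | summarize "my_article_title"
# -> saved as YYYY-MM-DD-my_article_title.md inside the directory set in obsidian_base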

Migration

If you have the Legacy (Python) version installed and want to migrate to the Go version, here's how you do it. It's basically two steps: 1) uninstall the Python version, and 2) install the Go version.

# Uninstall Legacy Fabric
pipx uninstall fabric

# Clear any old Fabric aliases (check your .bashrc, .zshrc, etc.)

# Install the Go version
go install github.com/danielmiessler/fabric@latest
# Run setup for the new version. Important because things have changed
fabric --setup

Then set your environment variables as shown above.

Upgrading

The great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.

go install github.com/danielmiessler/fabric@latest

Shell Completions

Fabric provides shell completion scripts for Zsh, Bash, and Fish shells, making it easier to use the CLI by providing tab completion for commands and options.

Zsh Completion

To enable Zsh completion:

# Copy the completion file to a directory in your $fpath
mkdir -p ~/.zsh/completions
cp completions/_fabric ~/.zsh/completions/

# Add the directory to fpath in your .zshrc before compinit
echo 'fpath=(~/.zsh/completions $fpath)' >> ~/.zshrc
echo 'autoload -Uz compinit && compinit' >> ~/.zshrc

Bash Completion

To enable Bash completion:

# Source the completion script in your .bashrc
echo 'source /path/to/fabric/completions/fabric.bash' >> ~/.bashrc

# Or copy to the system-wide bash completion directory
sudo cp completions/fabric.bash /etc/bash_completion.d/

Fish Completion

To enable Fish completion:

# Copy the completion file to the fish completions directory
mkdir -p ~/.config/fish/completions
cp completions/fabric.fish ~/.config/fish/completions/

Usage

Once you have it all set up, here's how to use it.

fabric -h

Usage:
  fabric [OPTIONS]

Application Options:
  -p, --pattern=                    Choose a pattern from the available patterns
  -v, --variable=                   Values for pattern variables, e.g. -v=#role:expert -v=#points:30
  -C, --context=                    Choose a context from the available contexts
      --session=                    Choose a session from the available sessions
  -a, --attachment=                 Attachment path or URL (e.g. for OpenAI image recognition messages)
  -S, --setup                       Run setup for all reconfigurable parts of fabric
  -t, --temperature=                Set temperature (default: 0.7)
  -T, --topp=                       Set top P (default: 0.9)
  -s, --stream                      Stream
  -P, --presencepenalty=            Set presence penalty (default: 0.0)
  -r, --raw                         Use the defaults of the model without sending chat options (like temperature etc.) and use the user role instead of the system role for patterns.
  -F, --frequencypenalty=           Set frequency penalty (default: 0.0)
  -l, --listpatterns                List all patterns
  -L, --listmodels                  List all available models
  -x, --listcontexts                List all contexts
  -X, --listsessions                List all sessions
  -U, --updatepatterns              Update patterns
  -c, --copy                        Copy to clipboard
  -m, --model=                      Choose model
      --modelContextLength=         Model context length (only affects ollama)
  -o, --output=                     Output to file
      --output-session              Output the entire session (also a temporary one) to the output file
  -n, --latest=                     Number of latest patterns to list (default: 0)
  -d, --changeDefaultModel          Change default model
  -y, --youtube=                    YouTube video or play list "URL" to grab transcript, comments from it and send to chat or print it put to the console and store it in the output file
      --playlist                    Prefer playlist over video if both ids are present in the URL
      --transcript                  Grab transcript from YouTube video and send to chat (it is used per default).
      --transcript-with-timestamps  Grab transcript from YouTube video with timestamps and send to chat
      --comments                    Grab comments from YouTube video and send to chat
      --metadata                    Output video metadata
  -g, --language=                   Specify the Language Code for the chat, e.g. -g=en -g=zh
  -u, --scrape_url=                 Scrape website URL to markdown using Jina AI
  -q, --scrape_question=            Search question using Jina AI
  -e, --seed=                       Seed to be used for LMM generation
  -w, --wipecontext=                Wipe context
  -W, --wipesession=                Wipe session
      --printcontext=               Print context
      --printsession=               Print session
      --readability                 Convert HTML input into a clean, readable view
      --input-has-vars              Apply variables to user input
      --dry-run                     Show what would be sent to the model without actually sending it
      --serve                       Serve the Fabric Rest API
      --serveOllama                 Serve the Fabric Rest API with ollama endpoints
      --address=                    The address to bind the REST API (default: :8080)
      --api-key=                    API key used to secure server routes
      --config=                     Path to YAML config file
      --version                     Print current version
      --listextensions              List all registered extensions
      --addextension=               Register a new extension from config file path
      --rmextension=                Remove a registered extension by name
      --strategy=                   Choose a strategy from the available strategies
      --liststrategies              List all strategies
      --listvendors                 List all vendors
      --shell-complete-list         Output raw list without headers/formatting (for shell completion)

Help Options:
  -h, --help                        Show this help message

Our approach to prompting

Fabric Patterns are different from most prompts you'll see.

  • First, we use Markdown to help ensure maximum readability and editability. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. Importantly, this also includes the AI you're sending it to!

Here's an example of a Fabric Pattern.

https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/system.md

pattern-example

  • Next, we are extremely clear in our instructions, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.

  • And finally, we tend to use the System section of the prompt almost exclusively. In over a year of being heads-down with this stuff, we've just seen more efficacy from doing that. If that changes, or we're shown data that says otherwise, we will adjust.
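
If you want to see exactly what a Pattern and your input turn into before anything reaches the model, the --dry-run flag from the help output above prints the assembled request instead of sending it:

# Preview the full prompt that would be sent, without calling the model
pbpaste | fabric --pattern extract_wisdom --dry-run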

Examples

The following examples use the macOS pbpaste to paste from the clipboard. See the pbpaste section below for Windows and Linux alternatives.
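
If you're on Linux rather than macOS, a rough stand-in for pbpaste under X11 is xclip (the pbpaste section covers the full set of alternatives):

# Read the clipboard on Linux (X11) and pipe it into a pattern
xclip -selection clipboard -o | fabric --pattern summarize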

Now let's look at some things you can do with Fabric.

  1. Run the summarize Pattern based on input from stdin. In this case, the body of an article.

    pbpaste | fabric --pattern summarize
  2. Run the analyze_claims Pattern with the --stream option to get immediate and streaming results.

    pbpaste | fabric --stream --pattern analyze_claims
  3. Run the extract_wisdom Pattern with the --stream option to get immediate and streaming results from any YouTube video (much like in the original introduction video).

    fabric -y "https://youtube.com/watch?v=uXs-zPc63kM" --stream --pattern extract_wisdom
  4. Create your own Patterns: you must create a .md file with the pattern and save it to ~/.config/fabric/patterns/[yourpatternname] (see the sketch after this list).

  5. Run the analyze_claims Pattern on a website. Fabric uses Jina AI to scrape the URL into Markdown before sending it to the model.

    fabric -u https://github.com/danielmiessler/fabric/ -p analyze_claims
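
Here is a minimal sketch of step 4. The pattern name my_pattern is hypothetical, and the system.md filename follows the convention used by the built-in Patterns such as extract_wisdom linked above:

# Create a directory for your custom pattern (my_pattern is an example name)
mkdir -p ~/.config/fabric/patterns/my_pattern

# Write the prompt itself as Markdown in system.md, as the built-in patterns do
cat > ~/.config/fabric/patterns/my_pattern/system.md << 'EOF'
# IDENTITY and PURPOSE
You summarize incoming text into three concise bullet points.

# OUTPUT INSTRUCTIONS
Output only the bullet points, in Markdown.
EOF

# Use it like any other pattern
pbpaste | fabric --pattern my_pattern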

Just use the Patterns

fabric-patterns-screenshot



If you're not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the /patterns directory and start exploring!

We hope that if you used nothing else from Fabric, the Patterns by themselves will make the project useful.

You can use any of the Patterns you see there in any AI application that you have, whether that's ChatGPT or some other app or website. Our plan and prediction is that people will soon be sharing many more than those we've published, and they will be way better than ours.

The wisdom of crowds for the win.

Prompt Strategies

Fabric also implements prompt strategies such as "Chain of Thought" or "Chain of Draft", which can be used in addition to the basic patterns.

See the Thinking Faster by Writing Less paper and the Thought Generation section of Learn Prompting for examples of prompt strategies.

Each strategy is available as a small JSON file in the /strategies directory.

The prompt modification of the strategy is applied to the system prompt and passed on to the LLM in the chat session.

Use fabric -S and select the option to install the strategies in your ~/.config/fabric directory.
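
Once the strategies are installed, you apply one with the --strategy flag on top of a normal pattern run; --liststrategies shows the names available on your system (the cot name below is just an illustration):

# See which strategies are installed
fabric --liststrategies

# Apply a strategy on top of a pattern (strategy name shown is illustrative)
pbpaste | fabric --strategy cot --pattern analyze_claims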

Custom Patterns