Replicate CLI

Install

If you're using macOS, you can install the Replicate CLI using Homebrew:

brew tap replicate/tap
brew install replicate

Or you can clone this repository and build from source (requires Go 1.20 or later):

git clone https://github.com/replicate/cli.git
cd cli
make
sudo make install

Upgrade

If you previously installed the CLI with Homebrew, you can upgrade to the latest version by running the following command:

brew upgrade replicate

Usage

Grab your API token from replicate.com/account and set the REPLICATE_API_TOKEN environment variable.

$ export REPLICATE_API_TOKEN=<your token here>

Usage:
  replicate [command]

Core commands:
  hardware    Interact with hardware
  model       Interact with models
  prediction  Interact with predictions
  scaffold    Create a new local development environment from a prediction
  training    Interact with trainings

Alias commands:
  run         Alias for "prediction create"
  stream      Alias for "prediction create --stream"
  train       Alias for "training create"

Additional Commands:
  completion  Generate the autocompletion script for the specified shell
  help        Help about any command

Flags:
  -h, --help      help for replicate
  -v, --version   version for replicate

Use "replicate [command] --help" for more information about a command.

Create a prediction

Generate an image with SDXL.

$ replicate run stability-ai/sdxl \
      prompt="a studio photo of a rainbow colored corgi"
Prediction created: https://replicate.com/p/jpgp263bdekvxileu2ppsy46v4

Stream prediction output

Run LLaMA 2 and stream output tokens to your terminal.

$ replicate run meta/llama-2-70b-chat --stream \
    prompt="Tell me a joke about llamas"
Sure, here's a joke about llamas for you:

Why did the llama refuse to play poker?

Because he always got fleeced!

Create a local development environment from a prediction

Create a Node.js or Python project from a prediction.

$ replicate scaffold https://replicate.com/p/jpgp263bdekvxileu2ppsy46v4 --template=node
Cloning starter repo and installing dependencies...
Cloning into 'jpgp263bdekvxileu2ppsy46v4'...
Writing new index.js...
Running example prediction...
[
  'https://replicate.delivery/pbxt/P79eJmjeJsql40QpRbWVDtGJSoTtLTdJ494kpQexSDhYGy0jA/out-0.png'
]
Done!
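
To scaffold a Python project instead, pass the Python template (assuming the --template flag also accepts python, as the Node.js/Python description above suggests):

$ replicate scaffold https://replicate.com/p/jpgp263bdekvxileu2ppsy46v4 --template=python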

Chain multiple predictions

Generate an image with SDXL and upscale that image with Real-ESRGAN.

$ replicate run stability-ai/sdxl \
      prompt="a studio photo of a rainbow colored corgi" | \
  replicate run nightmareai/real-esrgan --web \
      image={{.output[0]}}
# opens prediction in browser (https://replicate.com/p/jpgp263bdekvxileu2ppsy46v4)

Create a model

Create a new model on Replicate.

$ replicate model create yourname/model --private --hardware gpu-a40-small

To list available hardware types:

$ replicate hardware list

After creating your model, you can fine-tune an existing model or build and push a custom model using Cog.
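
To build and push a custom model with Cog, the workflow looks something like this (a minimal sketch; it assumes you already have a Cog project with a cog.yaml and predict.py):

$ cog login
$ cog push r8.im/yourname/model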

Fine-tune a model

Fine-tune SDXL with your own images:

$ replicate train --destination mattt/sdxl-dreambooth --web \
      stability-ai/sdxl \
      input_images=@path/to/pictures.zip \
      use_face_detection_instead=true
# opens the training in browser

Note

Use the @ prefix to upload a file from your local filesystem. It works like curl's --data-binary option.

For more information, see our blog post about fine-tuning with SDXL.

View a model's inputs and outputs

Get the schema for SunoAI Bark.

$ replicate model schema suno-ai/bark
Inputs:
- prompt: Input prompt (type: string)
- history_prompt: history choice for audio cloning, choose from the list (type: )
- custom_history_prompt: Provide your own .npz file with history choice for audio cloning, this will override the previous history_prompt setting (type: string)
- text_temp: generation temperature (1.0 more diverse, 0.0 more conservative) (type: number)
- waveform_temp: generation temperature (1.0 more diverse, 0.0 more conservative) (type: number)
- output_full: return full generation as a .npz file to be used as a history prompt (type: boolean)

Output:
- type: object
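
Those input names can be passed straight to "replicate run". For example (illustrative prompt):

$ replicate run suno-ai/bark prompt="Hello, my name is Suno."
# prints the prediction URL, as in the SDXL example above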