API request for backup volumes/snapshot to S3 object store? #33

Open
devops-42 opened this issue Mar 11, 2019 · 6 comments

@devops-42

Hi,

How can I use the "Backup to" functionality via the CLI in order to initiate a volume/snapshot backup into a defined S3 object store?

Thanks for your help!

@scaleoutsean
Contributor

scaleoutsean commented Dec 7, 2019

It seems this isn't documented (see #32).

Here's an example you can capture by enabling API logging in the UI and using the "Backup to" S3 option:

{
  "id": 126,
  "method": "StartBulkVolumeRead",
  "params": {
    "volumeID": 5,
    "format": "native",
    "script": "bv_internal.py",
    "scriptParameters": {
      "range": {
        "lba": 0,
        "blocks": 17090048
      },
      "write": {
        "awsAccessKeyID": "123123123",
        "awsSecretAccessKey": "41231231234",
        "bucket": "backoops",
        "prefix": "myClusterName-k3z3/boot-5",
        "endpoint": "s3",
        "format": "native",
        "hostname": "s3.my.org"
      }
    }
  }
}

The params seem to be (I've omitted explaining the obvious ones):

  • format: leave value as-is
  • script: leave value as-is
  • scriptParameters.range.blocks: number of blocks (divide the volume size in bytes by 4096; see the short sketch after this list)
  • write.bucket: bucket you want to back up to
  • write.prefix: path (pre-populated, but you can change it; in this case myClusterName-k3z3/boot-5 is made up of $clusterName-$clusterID/$volumeName-$volumeID, so your backups for volumeID 5 would end up in that bucket's "subdirectory")
  • write.endpoint: leave value as-is (for backup to S3)
  • write.format: leave as-is
  • write.hostname: S3 API endpoint/gateway hostname or IP
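
For example, blocks is just the volume size in bytes divided by the 4 KiB block size (a quick sketch; the volume size below is simply the example's 17090048 blocks multiplied back out):

vol_size_bytes = 70000836608       # totalSize of volumeID 5 in this example (17090048 * 4096)
blocks = vol_size_bytes // 4096    # integer division by the 4 KiB block size
print(blocks)                      # 17090048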

You'd have to get some of these values from the CLI or config files, calculate others (like the number of blocks), and build a nested dictionary of params:

{
  "volumeID": 5,
  "format": "native",
  "script": "bv_internal.py",
  "scriptParameters": {
    "range": {
      "lba": 0,
      "blocks": 17090048
    },
    "write": {
      "awsAccessKeyID": "123123123",
      "awsSecretAccessKey": "41231231234",
      "bucket": "backoops",
      "prefix": "myClusterName-k3z3/boot-5",
      "endpoint": "s3",
      "format": "native",
      "hostname": "s3.my.org"
    }
  }
}

Assuming the above is saved in /tmp/params.txt, you could load it and work through the parameters from the bottom up (innermost dictionaries first):

#!/usr/bin/python3
import json

with open('/tmp/params.txt') as data:
    json_array = json.load(data)

print(json_array['scriptParameters'])
# {'range': {'lba': 0, 'blocks': 17090048}, 'write': {'awsAccessKeyID': '123123123', 'awsSecretAccessKey': '41231231234', 'bucket': 'backoops', 'prefix': 'myClusterName-k3z3/boot-5', 'endpoint': 's3', 'format': 'native', 'hostname': 's3.my.org'}}
print(json_array['scriptParameters']['range'])
# {'lba': 0, 'blocks': 17090048}
print(json_array['scriptParameters']['write'])
# {'awsAccessKeyID': '123123123', 'awsSecretAccessKey': '41231231234', 'bucket': 'backoops', 'prefix': 'myClusterName-k3z3/boot-5', 'endpoint': 's3', 'format': 'native', 'hostname': 's3.my.org'}
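
To actually submit those parameters, you'd call StartBulkVolumeRead against the cluster's JSON-RPC endpoint. A minimal sketch with requests, assuming the /json-rpc/10.0 API version in the URL; the MVIP, credentials and certificate handling below are placeholders for your environment:

#!/usr/bin/python3
# Sketch: POST StartBulkVolumeRead to the cluster's JSON-RPC endpoint.
# The MVIP, API version and credentials are placeholders, not real values.
import json
import requests

with open('/tmp/params.txt') as f:
    params = json.load(f)

payload = {"method": "StartBulkVolumeRead", "params": params, "id": 1}
r = requests.post(
    "https://192.168.100.10/json-rpc/10.0",   # cluster MVIP + API version (assumption)
    json=payload,
    auth=("admin", "password"),               # cluster admin credentials (placeholder)
    verify=False                              # only if the cluster uses a self-signed certificate
)
print(r.json())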

You can also try backing up and restoring a volume from the UI and watching the API log to see what your backup and restore parameters should be.

For more than 2-3 VMs I would suggest using backup software, either free or commercial.

@drose12

drose12 commented Jan 2, 2022

Any update on this? I too am having issues getting this to work.

@scaleoutsean
Contributor

Any update on this? I too am having issues getting this to work.

What doesn't work?

The same recipe given in README.md should work for Backup to S3. Just pack the example parameters from the JSON above into --parameters, following the readme example: sfcli -c 0 SFApi Invoke --method GetAccountByID --parameters "{\"accountID\":94}"
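
For example, you can let Python produce the quoted JSON for --parameters instead of escaping it by hand (a sketch; the sfcli flags come from the readme example above, the values from the /tmp/params.txt shown earlier):

#!/usr/bin/python3
# Sketch: serialize the nested params dict once and reuse it for sfcli's --parameters.
import json

with open('/tmp/params.txt') as f:
    params = json.load(f)

# json.dumps takes care of the quoting; wrap the result in single quotes on the shell command line
print("sfcli -c 0 SFApi Invoke --method StartBulkVolumeRead --parameters '%s'" % json.dumps(params))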

@drose12

drose12 commented Jan 4, 2022

The range lba and blocks part does not work.
I am attempting to just do it with the Python SDK instead.

@scaleoutsean
Contributor

The range lba and blocks part does not work. I am attempting to just do it with the Python SDK instead.

The way it works for me from the CLI is to use SFApi Invoke with lba 0 (as I back up entire volumes) and volSizeBytes/4096 for the number of blocks. I described my approach here.
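
If you go the Python SDK route instead, the same call would look roughly like the sketch below. The method and parameter names are assumptions based on solidfire-sdk-python's usual snake_case mapping of the API (start_bulk_volume_read, script_parameters, and so on), so double-check them against the SDK reference:

#!/usr/bin/python3
# Rough sketch with solidfire-sdk-python; method/attribute names are assumptions
# based on the SDK's snake_case mapping of the API - verify against the SDK docs.
from solidfire.factory import ElementFactory

sfe = ElementFactory.create("mvip.my.org", "admin", "password")   # placeholders
vol = sfe.list_volumes(volume_ids=[5]).volumes[0]
blocks = vol.total_size // 4096                                   # volSizeBytes / 4096

result = sfe.start_bulk_volume_read(
    volume_id=5,
    format="native",
    script="bv_internal.py",
    script_parameters={
        "range": {"lba": 0, "blocks": blocks},
        "write": {
            "awsAccessKeyID": "123123123",
            "awsSecretAccessKey": "41231231234",
            "bucket": "backoops",
            "prefix": "myClusterName-k3z3/boot-5",
            "endpoint": "s3",
            "format": "native",
            "hostname": "s3.my.org"
        }
    }
)
print(result)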

@drose12

drose12 commented Jan 5, 2022

Thank you, yes, I found this and I'm choosing the Python method as it works and allows for some more sophistication.
