Import database requests
Redis Enterprise Software
| Method | Path | Description |
|---|---|---|
| POST | `/v1/bdbs/{uid}/actions/import` | Initiate manual dataset import |
Initiate manual dataset import
POST /v1/bdbs/{int: uid}/actions/import
Initiate a manual import process.
Permissions
| Permission name | Roles |
|---|---|
| start_bdb_import | admin, cluster_member, db_member |
Request
Example HTTP request
POST /v1/bdbs/1/actions/import
Headers
| Key | Value | Description |
|---|---|---|
| Host | cnm.cluster.fqdn | Domain name |
| Accept | application/json | Accepted media type |
| Content-Length | 0 | Length of the request body in octets |
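As a sketch, the same request could be issued from Python with the requests library; the port, credentials, and TLS handling below are illustrative assumptions, not part of this API's specification:

```python
# Minimal sketch: trigger the import with an empty request body.
# Host, port (9443), and credentials are placeholders for illustration;
# requests sends Content-Length: 0 automatically for a bodiless POST.
import requests

resp = requests.post(
    "https://cnm.cluster.fqdn:9443/v1/bdbs/1/actions/import",
    auth=("admin@example.com", "admin-password"),  # hypothetical credentials
    headers={"Accept": "application/json"},
    verify=False,  # or point this at the cluster's CA bundle
)
print(resp.status_code)
```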
URL parameters
| Field | Type | Description |
|---|---|---|
| uid | integer | The unique ID of the database |
Body
The request may contain a subset of the BDB JSON object, which includes the following import-related attributes:
| Field | Type | Description |
|---|---|---|
| dataset_import_sources | array of dataset_import_sources objects | Details for the import sources. Call `GET /v1/jsonschema` on the bdb object and review the `dataset_import_sources` field to retrieve the object's structure. |
| email_notification | boolean | Enable/disable an email notification on import failure/completion. (optional) |

Note: Other attributes are not allowed and will cause the request to fail.
Example JSON body
General example:
```json
{
  "dataset_import_sources": [
    {
      "type": "url",
      "url": "http://..."
    },
    {
      "type": "url",
      "url": "redis://..."
    }
  ],
  "email_notification": true
}
```
A request with an empty body initiates an import process using the dataset_import_sources values that were previously configured for the database.
FTP example:
```json
{
  "dataset_import_sources": [
    {
      "type": "url",
      "url": "ftp://<ftp_user>:<ftp_password>@example.com/<path>/<filename>.rdb.gz"
    }
  ]
}
```
SFTP example:
```json
{
  "dataset_import_sources": [
    {
      "type": "sftp",
      "sftp_url": "sftp://<sftp_user>@example.com/<path>/<filename>.rdb"
    }
  ]
}
```
AWS S3 example:
```json
{
  "dataset_import_sources": [
    {
      "type": "s3",
      "bucket_name": "backups",
      "subdir": "test-db",
      "filename": "<filename>.rdb",
      "access_key_id": "XXXXXXXXXXXXX",
      "secret_access_key": "XXXXXXXXXXXXXXXX"
    }
  ]
}
```
Google Cloud Storage example:
```json
{
  "dataset_import_sources": [
    {
      "type": "gs",
      "bucket_name": "backups",
      "client_id": "XXXXXXXX",
      "client_email": "cloud-storage-client@my-project-id.iam.gserviceaccount.com",
      "subdir": "test-db",
      "filename": "<filename>.rdb",
      "private_key_id": "XXXXXXXXXXXXX",
      "private_key": "XXXXXXXXXXXXXXXX"
    }
  ]
}
```
Azure Blob Storage example:
```json
{
  "dataset_import_sources": [
    {
      "type": "abs",
      "container": "backups",
      "subdir": "test-db",
      "filename": "<filename>.rdb",
      "account_name": "name",
      "account_key": "XXXXXXXXXXXXXXXX"
    }
  ]
}
```
You can provide `"sas_token": "XXXXXXXXXXXXXXXXXX"` instead of `"account_key"`.
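Any of the bodies above can be sent the same way. Here is a minimal Python sketch using the general URL example; host, port, and credentials are again hypothetical placeholders:

```python
# Sketch: start an import from explicitly supplied sources and ask for
# an email notification on completion or failure.
import requests

payload = {
    "dataset_import_sources": [
        {"type": "url", "url": "http://..."},  # URL elided as in the example above
    ],
    "email_notification": True,
}

resp = requests.post(
    "https://cnm.cluster.fqdn:9443/v1/bdbs/1/actions/import",
    auth=("admin@example.com", "admin-password"),  # hypothetical credentials
    json=payload,  # serializes the body and sets Content-Type: application/json
    verify=False,
)
resp.raise_for_status()
```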
Response
Returns a status code.
Status codes
| Code | Description |
|---|---|
| 200 OK | The request is accepted and is being processed. To monitor progress, consult the `import_status`, `import_progress`, and `import_failure_reason` attributes. |
| 404 Not Found | Attempting to perform an action on a nonexistent database. |
| 406 Not Acceptable | Not all of the modules loaded to the database support the 'backup_restore' capability. |
| 409 Conflict | The database is currently busy with another action; this is a temporary condition, and the request should be retried later. |
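Because 200 OK only means the import has started, progress has to be polled from the database object itself. A rough sketch, assuming the import attributes appear on `GET /v1/bdbs/{uid}` and using placeholder credentials and status values:

```python
# Sketch: poll the database object for the import attributes named in the
# 200 OK description. The status values checked below are assumptions.
import time
import requests

BASE = "https://cnm.cluster.fqdn:9443"
AUTH = ("admin@example.com", "admin-password")  # hypothetical credentials

requests.post(f"{BASE}/v1/bdbs/1/actions/import", auth=AUTH, verify=False)

while True:
    bdb = requests.get(f"{BASE}/v1/bdbs/1", auth=AUTH, verify=False).json()
    print(bdb.get("import_status"), bdb.get("import_progress"))
    if bdb.get("import_status") != "importing":  # assumed in-progress value
        break
    time.sleep(5)

if bdb.get("import_status") != "completed":  # assumed terminal value
    print("Import failed:", bdb.get("import_failure_reason"))
```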