Import data into a database
You can import an export or backup file of a specific Redis Enterprise Software database to restore data. You can import from a single file or from multiple files, such as when you want to import from a backup of a clustered database.
To import data into a database using the Cluster Manager UI:

1. On the Databases screen, select the database from the list, then select Configuration.
2. Select Import.
3. Select the tab that corresponds to your storage location type and enter the location details.
   For more information about each storage location type, see Supported storage locations.
4. Select Import.
Supported storage locations
Data can be imported from a local mount point, transferred to a URI using FTP/SFTP, or stored on cloud provider storage.
When importing from a local mount point or from cloud provider storage, the import location must be accessible to the group and user that runs Redis Enterprise Software, redislabs:redislabs by default.
Redis Enterprise Software must be able to view objects in the storage location. Implementation details vary depending on the provider and your configuration. To learn more, see the provider's documentation.
The following sections provide general guidelines. Because provider features change frequently, use the provider's documentation for the latest information.
FTP server
Before importing data from an FTP server, make sure that:
- Your Redis Enterprise cluster can connect and authenticate to the FTP server.
- The user that you specify in the FTP server location has permission to read files from the server.
To import data from an FTP server, set the RDB file path using the following syntax:
[protocol]://[username]:[password]@[host]:[port]/[path]/[filename].rdb
Where:
- protocol: the server's protocol, can be either ftp or ftps.
- username: your username, if needed.
- password: your password, if needed.
- hostname: the hostname or IP address of the server.
- port: the port number of the server, if needed.
- path: the file's location path.
- filename: the name of the file.
Example: ftp://username:password@10.1.1.1/home/backups/<filename>.rdb
Select Add path to add another import file path.
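As a quick sanity check, you can assemble the path from its components before entering it in the UI. A minimal shell sketch, where every value is a placeholder rather than a real credential:

```shell
# Assemble an FTP import path from its parts.
# All values below are placeholders, not real credentials.
protocol=ftp
username=backupuser
password=secret
host=10.1.1.1
dir=home/backups
filename=db-dump

# prints ftp://backupuser:secret@10.1.1.1/home/backups/db-dump.rdb
echo "${protocol}://${username}:${password}@${host}/${dir}/${filename}.rdb"
```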
Local mount point
Before importing data from a local mount point, make sure that:
- The node can connect to the server hosting the mount point.
- The redislabs:redislabs user has permission to read files on the local mount point and on the destination server.
- The storage is mounted in the same path on all cluster nodes. You can also use local storage, but you must copy the imported files manually to all nodes because the import source folders on the nodes are not synchronized.
To import from a local mount point:
1. On each node in the cluster, create the mount point:
   1. Connect to the node's terminal.
   2. Mount the remote storage to a local mount point.
      For example:
      sudo mount -t nfs 192.168.10.204:/DataVolume/Public /mnt/Public
2. In the path for the import location, enter the mount point.
   For example: /mnt/Public/<filename>.rdb
As of version 6.2.12, Redis Enterprise reads files directly from the mount point using a symbolic link (symlink) instead of copying them to a temporary directory on the node.
Select Add path to add another import file path.
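Before starting the import, it can help to confirm that the file is actually present and readable at the mount point. A minimal sketch of such a pre-flight check (the path is an example; in production, run it on every node, ideally as the redislabs user):

```shell
# Report whether an RDB file at the given path exists and is readable
# by the current user. /mnt/Public is an example mount point.
check_rdb() {
  if [ -r "$1" ]; then
    echo "readable: $1"
  else
    echo "missing or unreadable: $1"
  fi
}

check_rdb /mnt/Public/dump.rdb
```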
SFTP server
Before importing data from an SFTP server, make sure that:
- Your Redis Enterprise cluster can connect and authenticate to the SFTP server.
- The user that you specify in the SFTP server location has permission to read files from the server.
- The SSH private keys are specified correctly. You can use the key generated by the cluster or specify a custom key.
To use the cluster auto-generated key:
1. Go to Cluster > Security > Certificates.
2. Expand Cluster SSH Public Key.
3. Download or copy the cluster SSH public key to the appropriate location on the SFTP server.
   Use the server documentation to determine the appropriate location for the SSH public key.
To import data from an SFTP server, enter the SFTP server location in the format:
[protocol]://[username]:[password]@[host]:[port]/[path]/[filename].rdb
Where:
- protocol: the server's protocol: sftp.
- username: your username, if needed.
- password: your password, if needed.
- hostname: the hostname or IP address of the server.
- port: the port number of the server, if needed.
- path: the file's location path.
- filename: the name of the file.
Example: sftp://username:password@10.1.1.1/home/backups/[filename].rdb
Select Add path to add another import file path.
AWS Simple Storage Service
Before you choose to import data from an Amazon Web Services (AWS) Simple Storage Service (S3) bucket, make sure you have:
- The path to the file in your bucket in the format:
s3://[bucketname]/[path]/[filename].rdb
- Access key ID and Secret access key for an IAM user with permission to read files from the bucket.
In the Redis Enterprise Software Cluster Manager UI, when you enter the import location details:
1. Select AWS S3.
2. In the RDB file path/s field, enter the path of your bucket. Select Add path to add another import file path.
3. In the Access key ID field, enter the access key ID.
4. In the Secret access key field, enter the secret access key.
You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate. To connect to an S3-compatible storage location, run rladmin cluster config:

rladmin cluster config s3_url <URL>

Replace <URL> with the hostname or IP address of the S3-compatible storage location.
Google Cloud Storage
Before you import data from a Google Cloud storage bucket, make sure you have:
- Storage location path in the format:
/bucket_name/[path]/[filename].rdb
- A JSON service account key for your account
- A principal for your bucket with the client_email from the service account key and a role with permissions to get files from the bucket (such as the Storage Legacy Object Reader role, which grants storage.objects.get permissions)
In the Redis Enterprise Software Cluster Manager UI, when you enter the import location details:
1. Select Google Cloud Storage.
2. In the RDB file path/s field, enter the path of your file. Select Add path to add another import file path.
3. In the Client ID field, enter the client_id from the service account key.
4. In the Client email field, enter the client_email from the service account key.
5. In the Private key id field, enter the private_key_id from the service account key.
6. In the Private key field, enter the private_key from the service account key. Replace \n with new lines.
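The private_key value in the JSON service account key file contains literal \n escape sequences, while the UI field expects real line breaks. One way to convert them, sketched in shell with the key body shortened to a placeholder:

```shell
# private_key as it appears in the JSON service account key file,
# with literal backslash-n sequences (key body is a placeholder).
key='-----BEGIN PRIVATE KEY-----\nMIIEvQ...\n-----END PRIVATE KEY-----\n'

# printf '%b' expands the \n escapes into real newlines.
printf '%b' "$key"
```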
Azure Blob Storage
Before you import from Azure Blob Storage, make sure that you have:
- Storage location path in the format: /container_name/[path/]/<filename>.rdb
- Account name
- An authentication token, either an account key or an Azure shared access signature (SAS).
To find the account name and account key, see Manage storage account access keys.
Azure SAS support requires Redis Software version 6.0.20. To learn more about Azure SAS, see Grant limited access to Azure Storage resources using shared access signatures.
In the Redis Enterprise Software Cluster Manager UI, when you enter the import location details:
1. Select Azure Blob Storage.
2. In the RDB file path/s field, enter the path of your file. Select Add path to add another import file path.
3. In the Azure Account Name field, enter your storage account name.
4. In the Azure Account Key field, enter the storage account key.
Importing into an Active-Active database
When importing data into an Active-Active database, there are two options:
- Flush all data from the Active-Active database, then import the data into the database.
- Import data but merge it into the existing database.
Because Active-Active databases have a numeric counter data type, when you merge imported data into the existing data, Redis Enterprise Software increments each counter by the value in the imported data. The import through the Cluster Manager UI handles these data types for you.
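The counter merge semantics amount to simple addition: the merged counter is the existing value plus the imported value. A small illustration (the numbers are purely illustrative):

```shell
# Illustration of Active-Active counter merging on import:
# the existing counter is incremented by the imported value.
existing=10   # counter value already in the database
imported=4    # counter value for the same key in the imported RDB
merged=$((existing + imported))
echo "merged counter: $merged"   # merged counter: 14
```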
You can import data into an Active-Active database from the Cluster Manager UI.
When you import data into an Active-Active database, there is a special prompt warning that the imported data will be merged into the existing database.
Continue learning with Redis University
See the Import data into a database on Redis Software course to learn more.