Import data into a database
You can import an existing dataset into your Redis Cloud database, either from an existing Redis server or from an RDB file.
Prerequisites
Before you begin, make sure the Redis version of the source database is compatible with the database you will import the data into.
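If you want to confirm compatibility before starting, you can check the source server's version first. The following is a minimal sketch using the redis-py client; the hostname, port, and password are placeholders for your own source server.

# Minimal sketch using redis-py: print the source server's Redis version.
# Host, port, and password below are placeholders for your source server.
import redis

source = redis.Redis(host="source.example.com", port=6379,
                     password="source-password", decode_responses=True)
print("Source Redis version:", source.info("server")["redis_version"])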
Import from a Redis server
To import a dataset from any publicly available Redis server:
- Select Databases from the Redis Cloud console menu, then select the target database from the database list.
- In the Danger Zone, select Import.
- Enter the source database details:
- Source type - Select Redis.
- Redis hostname/IP address - Enter the hostname or public IP address of the source Redis server.
- Redis port - Enter the port of the source Redis server, if it is not the default port (6379).
- Password - Enter the password, if required by the source Redis database.
- Select Import.
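Because the import runs from Redis Cloud, the source server must be reachable over the public network. As a quick sanity check before you start the import, a sketch like the following (again redis-py, with placeholder connection details) pings the source with a short timeout.

# Minimal sketch: verify the source server is publicly reachable before importing.
# Connection details are placeholders; a short timeout surfaces network issues quickly.
import redis

source = redis.Redis(host="source.example.com", port=6379,
                     password="source-password", socket_connect_timeout=5)
if source.ping():
    print("Source server is reachable; you can start the import.")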
Restore from an RDB file
If you have an RDB or compressed RDB file from a previous backup, you can restore the data from that file into your Redis Cloud database.
Via FTP or HTTP
To import an RDB file stored on an FTP or HTTP server:
- Select Databases from the Redis Cloud console menu, then select your database from the list.
- In the Danger Zone, select Import.
- Enter the details for the RDB file:
- Source type - Select FTP or HTTP.
- Source path - Enter the URL for the RDB file:
<protocol>://[username][:password]@hostname[:port]/[path/]filename.rdb[.gz]
Where:
- protocol - Server protocol: ftp, ftps, http, or https
- username - Your username, if needed
- password - Your password, if needed
- hostname - Hostname or IP address of the server
- port - Port number of the server, if not the default
- path - Path to the file, if needed
- filename - Filename of the RDB file, including the .gz suffix if the file is compressed
Note: If your FTP username or password contains special characters such as @, \, or :, you must URL encode (also known as percent encode) those characters; one way to do this is shown in the sketch after these steps. Otherwise, your database may become stuck.
- For sharded databases with multiple RDB files, select Add source to add another RDB file.
- Select Import.
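One way to percent encode credentials and assemble the source path is sketched below, using Python's standard urllib.parse module. The server, credentials, and file path are placeholders.

# Minimal sketch: percent encode FTP credentials and build the source path URL.
# All values below are placeholders for your own server and file.
from urllib.parse import quote

username = quote("backup@user", safe="")   # "@" must be percent encoded
password = quote("p:ss\\word", safe="")    # ":" and "\" must be encoded as well
source_path = f"ftp://{username}:{password}@ftp.example.com:21/backups/dump.rdb.gz"
print(source_path)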
Via AWS S3
To use the Redis Cloud console to import your data, you must first share the file from the Amazon Web Services (AWS) Management Console.
To share and import an RDB file stored in an AWS Simple Storage Service (S3) bucket:
- In the AWS Management Console, configure the bucket policy of your bucket to grant access to Redis Cloud (see the sketch after these steps for a scripted alternative):
- Use the Services menu to locate and select Storage > S3. This takes you to the Amazon S3 admin panel.
- Use the Buckets list to locate and select your bucket. When the settings appear, select the Permissions tab, locate the Bucket policy section, and then select Edit.
- If there is no existing bucket policy, add the following JSON bucket policy. Replace UNIQUE-BUCKET-NAME with the name of your bucket.
{
  "Version": "2012-10-17",
  "Id": "MyBucketPolicy",
  "Statement": [
    {
      "Sid": "RedisCloudBackupsAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::168085023892:root"
      },
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::UNIQUE-BUCKET-NAME/*"
    }
  ]
}
- If a bucket policy already exists, add the following JSON policy statement to the list of statements. Replace UNIQUE-BUCKET-NAME with the name of your bucket.
{
  "Sid": "RedisCloudBackupsAccess",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::168085023892:root"
  },
  "Action": [
    "s3:PutObject",
    "s3:GetObject",
    "s3:DeleteObject"
  ],
  "Resource": "arn:aws:s3:::UNIQUE-BUCKET-NAME/*"
}
- Save your changes.
- If the bucket is encrypted using SSE-KMS, add the following statement to your key policy. If you do not have a key policy, see Creating a key policy. Replace UNIQUE-BUCKET-NAME with the name of your bucket and CUSTOM-KEY-ARN with your key's Amazon Resource Name (ARN).
{
  "Sid": "Allow use of the key",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::168085023892:root"
  },
  "Action": [
    "kms:Encrypt",
    "kms:Decrypt",
    "kms:ReEncrypt*",
    "kms:GenerateDataKey*",
    "kms:DescribeKey"
  ],
  "Resource": [
    "arn:aws:s3:::UNIQUE-BUCKET-NAME/*",
    "CUSTOM-KEY-ARN"
  ]
}
- In the Redis Cloud console, select the target database from the database list.
- In the Danger Zone, select Import.
- Enter the details for the RDB file:
- Source type - Select AWS S3.
- Source path - Enter the URL for the RDB file:
s3://bucketname/[path/]filename.rdb[.gz]
Where:
- bucketname - Name of the S3 bucket
- path - Path to the file, if necessary
- filename - Filename of the RDB file, including the .gz suffix if the file is compressed
- For sharded databases with multiple RDB files, select Add source to add another RDB file.
- Select Import.
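If you manage your bucket programmatically, the same bucket policy can be applied with the AWS SDK instead of the console. The following is a minimal sketch using boto3; the bucket name and backup path are placeholders, and the policy mirrors the JSON shown in the steps above.

# Minimal sketch using boto3: apply the Redis Cloud bucket policy shown above.
# UNIQUE-BUCKET-NAME and the backup path are placeholders.
import json
import boto3

bucket = "UNIQUE-BUCKET-NAME"
policy = {
    "Version": "2012-10-17",
    "Id": "MyBucketPolicy",
    "Statement": [{
        "Sid": "RedisCloudBackupsAccess",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::168085023892:root"},
        "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))

# The source path to enter in the Redis Cloud console would then look like:
# s3://UNIQUE-BUCKET-NAME/backups/dump.rdb.gz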
Via Google Cloud Storage
To use the Redis Cloud console to import your data, you must first share the file from the Google Cloud console.
To share and import an RDB file that is stored in a Google Cloud Storage bucket:
- In the Google Cloud Storage bucket, edit the file's Access Control List to give read access to Redis Cloud (see the sketch after these steps for a scripted alternative):
- Select Edit access in the RDB file menu.
- Select Add item.
- Enter the user details and access:
- In the Entity field of the new item, select User.
- In the Name field of the new item, enter:
service@redislabs-prod-clusters.iam.gserviceaccount.com
- In the Access field of the new item, select Reader.
- Select Save.
For more info, see Set ACLs.
- In the Redis Cloud console, select the target database from the database list.
- In the Danger Zone, select Import.
- Enter the details for the RDB file:
- Source type - Select Google Cloud Storage.
- Source path - Enter the URL for the RDB file:
gs://bucketname/[path/]filename.rdb[.gz]
Where:
- bucketname - Name of the GCS bucket
- path - Path to the file
- filename - Filename of the RDB file, including the .gz suffix if the file is compressed
- For sharded databases with multiple RDB files, select Add source to add another RDB file.
- Select Import.
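If you prefer to grant access from code instead of the Google Cloud console, the following sketch uses the google-cloud-storage client library to add the same read permission to the object's ACL. The bucket name and object path are placeholders; the service account email is the one listed in the steps above.

# Minimal sketch using google-cloud-storage: grant the Redis Cloud service
# account read access to the RDB object. Bucket and object names are placeholders.
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-backup-bucket").blob("backups/dump.rdb.gz")

acl = blob.acl
acl.reload()   # load the object's current ACL before modifying it
acl.user("service@redislabs-prod-clusters.iam.gserviceaccount.com").grant_read()
acl.save()

# The source path to enter in the Redis Cloud console would then look like:
# gs://my-backup-bucket/backups/dump.rdb.gz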
Via Azure Blob Storage container
To import an RDB file stored in a Microsoft Azure Blob storage container:
- In the Redis Cloud console, select the target database from the database list.
- In the Danger Zone, select Import.
- Enter the details for the RDB file:
- Source type - Select Azure Blob Storage.
- Source path - Enter the URL for the RDB file:
abs://:storage_account_access_key@storage_account_name/[container/]filename.rdb[.gz]
Where:
- storage_account_access_key - Primary access key to the storage account
- storage_account_name - Name of the storage account
- container - Name of the container, if necessary
- filename - Filename of the RDB file, including the .gz suffix if the file is compressed
- For sharded databases with multiple RDB files, select Add source to add another RDB file.
- Select Import.
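Because the access key is embedded directly in the source path, it can be convenient to assemble the path programmatically. The following is a plain string-formatting sketch with placeholder values; replace them with your own storage account details.

# Minimal sketch: assemble the abs:// source path from placeholder values.
# Replace the values below with your storage account details and file location.
account_name = "mystorageaccount"
access_key = "STORAGE-ACCOUNT-ACCESS-KEY"
container = "backups"
filename = "dump.rdb.gz"

source_path = f"abs://:{access_key}@{account_name}/{container}/{filename}"
print(source_path)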