Export data from a database

You can export data to import it into a new database or to create a backup. This article shows you how to do so.

Redis Enterprise Software

You can export data from a specific database at any time. The following destinations are supported:

  • FTP server
  • SFTP server
  • Amazon AWS S3
  • Local mount point
  • Azure Blob Storage
  • Google Cloud Storage

If you export a database configured for database clustering, export files are created for each shard.

Storage space requirements

Before exporting data, verify that you have enough available space on the storage destination and on the local storage associated with the node hosting the database.

The export process has two steps: a temporary copy of the data is saved to the node's local storage and then copied to the storage destination. (The temporary file is deleted after the copy operation completes.)

The export fails when there is not enough space for either step.
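Because of the two-step copy, free space has to be available in two places at once. A rough pre-flight sketch using Python's standard library (the path and export size are placeholders, not Redis Enterprise defaults):

```python
# Rough pre-flight check: an export needs free space on the node's local
# storage (for the temporary copy) and on the destination. The path and
# size below are placeholders, not Redis Enterprise defaults.
import shutil

def has_enough_space(path: str, required_bytes: int) -> bool:
    """True if the filesystem containing `path` has at least `required_bytes` free."""
    return shutil.disk_usage(path).free >= required_bytes

export_size = 2 * 1024**3  # assume a 2 GiB export
# Run the same check against the node-local staging path and against the
# mounted destination path:
print(has_enough_space("/tmp", export_size))
```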

Export database data

To export data from a database using the Cluster Manager UI:

  1. On the Databases screen, select the database from the list, then select Configuration.

  2. Click the Toggle actions button to open a list of additional actions.

  3. Select Export.

  4. Select the tab that corresponds to your storage location type and enter the location details.

    For more information about each storage location type, see Supported storage locations.

  5. Select Export.

Supported storage locations

Data can be exported to a local mount point, transferred to a URI using FTP/SFTP, or stored on cloud provider storage.

When saved to a local mount point or to a cloud provider, the export location needs to be available to the group and user running Redis Enterprise Software, redislabs:redislabs by default.

Redis Enterprise Software needs to be able to view permissions and update objects in the storage location. Implementation details vary according to the provider and your configuration. To learn more, consult the provider's documentation.

The following sections provide general guidelines. Because provider features change frequently, use your provider's documentation for the latest information.

FTP server

Before exporting data to an FTP server, verify that:

  • Your Redis Enterprise cluster can connect and authenticate to the FTP server.
  • The user specified in the FTP server location has permission to read and write files to the server.

To export data to an FTP server, set Path using the following syntax:

[protocol]://[username]:[password]@[host]:[port]/[path]/

Where:

  • protocol: the server's protocol, either ftp or ftps.
  • username: your username, if needed.
  • password: your password, if needed.
  • host: the hostname or IP address of the server.
  • port: the port number of the server, if needed.
  • path: the export destination path, if needed.

Example: ftp://username:password@10.1.1.1/home/exports/
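The path can also be assembled programmatically. A minimal sketch using the placeholder credentials from the example above; percent-encoding guards against special characters in the username or password:

```python
# Sketch: assemble an FTP export path in the
# [protocol]://[username]:[password]@[host]:[port]/[path]/ form shown above.
# The credentials are the placeholder values from the example.
from urllib.parse import quote

def build_ftp_path(host, path, username=None, password=None, port=None, protocol="ftp"):
    assert protocol in ("ftp", "ftps")
    auth = ""
    if username:
        auth = quote(username, safe="")
        if password is not None:
            auth += ":" + quote(password, safe="")
        auth += "@"
    netloc = f"{auth}{host}" + (f":{port}" if port else "")
    return f"{protocol}://{netloc}/{path.strip('/')}/"

print(build_ftp_path("10.1.1.1", "home/exports", "username", "password"))
# → ftp://username:password@10.1.1.1/home/exports/
```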

Local mount point

Before exporting data to a local mount point, verify that:

  • The node can connect to the server hosting the mount point.
  • The redislabs:redislabs user has permission to read and write files to the local mount point and to the destination server.
  • The export location has enough disk space for your exported data.

To export to a local mount point:

  1. On each node in the cluster, create the mount point:

    1. Connect to the node's terminal.

    2. Mount the remote storage to a local mount point.

      For example:

      sudo mount -t nfs 192.168.10.204:/DataVolume/Public /mnt/Public
      
  2. In the path for the export location, enter the mount point.

    For example: /mnt/Public
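Before entering the mount point, it can help to confirm that the export user can actually write there. A minimal sketch, assuming the example mount point above; run it as the redislabs user (e.g. via sudo -u redislabs):

```python
# Sketch: confirm the export user can write to the mount point before
# starting an export. Run as the redislabs user, for example with
# `sudo -u redislabs python3 check_mount.py`. The path is the example above.
import os
import uuid

def can_write(mount_point: str) -> bool:
    """Create and remove a small probe file under `mount_point`."""
    probe = os.path.join(mount_point, f".export-probe-{uuid.uuid4().hex}")
    try:
        with open(probe, "w") as f:
            f.write("probe")
        os.remove(probe)
        return True
    except OSError:
        return False

print(can_write("/mnt/Public"))
```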

SFTP server

Before exporting data to an SFTP server, make sure that:

  • Your Redis Enterprise cluster can connect and authenticate to the SFTP server.

  • The user specified in the SFTP server location has permission to read and write files to the server.

  • The SSH private keys are specified correctly. You can use the key generated by the cluster or specify a custom key.

    To use the cluster's auto-generated key:

    1. Go to Cluster > Security > Certificates.

    2. Expand Cluster SSH Public Key.

    3. Download or copy the cluster SSH public key to the appropriate location on the SFTP server.

      Use the server documentation to determine the appropriate location for the SSH public key.

To export data to an SFTP server, enter the SFTP server location in the format:

sftp://[username]:[password]@[host]:[port]/[path]/

Where:

  • username: your username, if needed.
  • password: your password, if needed.
  • host: the hostname or IP address of the server.
  • port: the port number of the server, if needed.
  • path: the export destination path, if needed.

For example: sftp://username:password@10.1.1.1/home/exports/

AWS Simple Storage Service

To export data to an Amazon Web Services (AWS) Simple Storage Service (S3) bucket:

  1. Sign in to the AWS console.

  2. Create an S3 bucket if you do not already have one.

  3. Create an IAM User with permission to add objects to the bucket.

  4. Create an access key for that user if you do not already have one.

  5. In the Redis Enterprise Software Cluster Manager UI, when you enter the export location details:

    • Select AWS S3.

    • In the Path field, enter the path of your bucket.

    • In the Access key ID field, enter the access key ID.

    • In the Secret access key field, enter the secret access key.

You can also connect to a storage service that uses the S3 protocol but is not hosted by Amazon AWS. The storage service must have a valid SSL certificate. To connect to an S3-compatible storage location, run rladmin cluster config:

rladmin cluster config s3_url <URL>

Replace <URL> with the hostname or IP address of the S3-compatible storage location.
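Because the storage service must present a valid SSL certificate, a quick TLS pre-check can save a failed export. A sketch using Python's standard library; the hostname is an illustrative placeholder, not a real endpoint:

```python
# Sketch: confirm that an S3-compatible endpoint presents a valid SSL
# certificate before pointing `rladmin cluster config s3_url` at it.
# The hostname below is an illustrative placeholder.
import socket
import ssl

def has_valid_certificate(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Open a TLS connection, letting Python verify the certificate chain
    and the hostname; any SSL or connection failure returns False."""
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

print(has_valid_certificate("s3.example.internal"))
```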

Google Cloud Storage

To export to a Google Cloud storage bucket:

  1. Sign in to the Google Cloud console.

  2. Create a JSON service account key if you do not already have one.

  3. Create a bucket if you do not already have one.

  4. Add a principal to your bucket:

    • In the New principals field, add the client_email from the service account key.

    • Select "Storage Legacy Bucket Writer" from the Role list.

  5. In the Redis Enterprise Software Cluster Manager UI, when you enter the export location details:

    • Select Google Cloud Storage.

    • In the Path field, enter the path of your bucket.

    • In the Client ID field, enter the client_id from the service account key.

    • In the Client Email field, enter the client_email from the service account key.

    • In the Private Key ID field, enter the private_key_id from the service account key.

    • In the Private key field, enter the private_key from the service account key. Replace \n with new lines.
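The \n replacement in the last step can be scripted. A small sketch that converts the literal \n sequences, as they appear when the private_key value is copied out of the JSON key file, into real newlines (the key material shown is a placeholder):

```python
# Sketch: convert the private_key value as copied from the downloaded
# JSON key file (with literal "\n" sequences) into the multi-line form
# the Private key field expects. The key material below is a placeholder.
def to_multiline(private_key_raw: str) -> str:
    """Replace literal backslash-n sequences with real newlines."""
    return private_key_raw.replace("\\n", "\n")

raw = "-----BEGIN PRIVATE KEY-----\\nMIIE...\\n-----END PRIVATE KEY-----\\n"
print(to_multiline(raw))
```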

Azure Blob Storage

To export to Microsoft Azure Blob Storage, sign in to the Azure portal and then:

  1. Create an Azure Storage account if you do not already have one.

  2. Create a container if you do not already have one.

  3. Manage storage account access keys to find the storage account name and account keys.

  4. In the Redis Enterprise Software Cluster Manager UI, when you enter the export location details:

    • Select Azure Blob Storage.

    • In the Path field, enter the path of your container.

    • In the Account name field, enter your storage account name.

    • In the Account key field, enter the storage account key.

To learn more, see Authorizing access to data in Azure Storage.
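Azure storage account keys are base64-encoded strings, so a quick decode check can catch copy/paste errors before you submit the form. A sketch of a purely local sanity check (this is not an Azure API call, and a successful decode does not prove the key is the right one):

```python
# Sketch: Azure storage account keys are base64-encoded, so a failed
# decode usually indicates a copy/paste error. This is a local sanity
# check only, not an Azure API call.
import base64
import binascii

def looks_like_account_key(key: str) -> bool:
    try:
        base64.b64decode(key, validate=True)
        return True
    except (binascii.Error, ValueError):
        return False

print(looks_like_account_key("QUJDRA=="))  # a valid base64 string
```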
