The capacity of a TiKV cluster can be increased or decreased without interrupting online services.

This document describes how to scale a TiKV or PD cluster using TiUP.

For example, assume that the topology of the cluster is as follows:

| Host IP | Service |
| -------- | ------- |
| 10.0.1.1 | Monitor |
| 10.0.1.2 | PD |
| 10.0.1.3 | PD |
| 10.0.1.4 | PD |
| 10.0.1.5 | TiKV |
| 10.0.1.6 | TiKV |
| 10.0.1.7 | TiKV |

Scale out a TiKV cluster

If you want to add a TiKV node to the 10.0.1.8 host, take the following steps.

  1. Configure the scale-out topology

    Put the following contents in the scale-out-tikv.yaml file:

    ```yaml
    tikv_servers:
      - host: 10.0.1.8
        ssh_port: 22
        port: 20160
        status_port: 20180
        deploy_dir: /data/deploy/install/deploy/tikv-20160
        data_dir: /data/deploy/install/data/tikv-20160
        log_dir: /data/deploy/install/log/tikv-20160
    ```

    To view the configuration of the current cluster, run `tiup cluster edit-config <cluster-name>`. The parameters under `global` and `server_configs` are inherited from the current cluster configuration, so they also take effect in scale-out-tikv.yaml.
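    Before scaling out, you can optionally have TiUP verify that the new host meets the deployment requirements. A sketch, assuming your TiUP version provides `tiup cluster check` with the `--cluster` and `--apply` flags:

    ```shell
    # Check whether 10.0.1.8 passes TiUP's environment checks (CPU, disk,
    # sysctl settings, and so on) before actually scaling out.
    # --cluster reuses the existing cluster's SSH settings.
    tiup cluster check <cluster-name> scale-out-tikv.yaml --cluster

    # Optionally let TiUP try to fix the failed check items automatically.
    tiup cluster check <cluster-name> scale-out-tikv.yaml --cluster --apply
    ```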

  2. Run the scale-out command

    ```shell
    tiup cluster scale-out <cluster-name> scale-out-tikv.yaml
    ```

    If you see the message “Scaled cluster out successfully”, the scale-out operation has completed successfully.

  3. Check the cluster status

    ```shell
    tiup cluster display <cluster-name>
    ```

    Access the monitoring platform at http://10.0.1.1:3000 using your browser to monitor the status of the cluster and the new node.
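    Besides the `display` output and the monitoring platform, you can also query PD's HTTP API to confirm that the new store has registered. A minimal sketch, assuming PD's standard `/pd/api/v1/stores` endpoint; the sample response below is hypothetical and abridged:

    ```shell
    # Print each TiKV store's address and state from PD's /pd/api/v1/stores
    # response. On a live cluster, replace the here-document with:
    #   curl -s http://10.0.1.2:2379/pd/api/v1/stores
    python3 -c '
    import json, sys
    for s in json.load(sys.stdin)["stores"]:
        print(s["store"]["address"], s["store"]["state_name"])
    ' <<'EOF'
    {"count": 2, "stores": [
      {"store": {"id": 1, "address": "10.0.1.5:20160", "state_name": "Up"}},
      {"store": {"id": 4, "address": "10.0.1.8:20160", "state_name": "Up"}}
    ]}
    EOF
    ```

    A newly added store should appear with state Up once it has joined the cluster.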

After the scale-out, the cluster topology is as follows:

| Host IP | Service |
| -------- | ------- |
| 10.0.1.1 | Monitor |
| 10.0.1.2 | PD |
| 10.0.1.3 | PD |
| 10.0.1.4 | PD |
| 10.0.1.5 | TiKV |
| 10.0.1.6 | TiKV |
| 10.0.1.7 | TiKV |
| 10.0.1.8 | TiKV |

Scale out a PD cluster

If you want to add a PD node to the 10.0.1.9 host, take the following steps.

  1. Configure the scale-out topology

    Put the following contents in the scale-out-pd.yaml file:

    ```yaml
    pd_servers:
      - host: 10.0.1.9
        ssh_port: 22
        client_port: 2379
        peer_port: 2380
        deploy_dir: /data/deploy/install/deploy/pd-2379
        data_dir: /data/deploy/install/data/pd-2379
        log_dir: /data/deploy/install/log/pd-2379
    ```

    To view the configuration of the current cluster, run `tiup cluster edit-config <cluster-name>`. The parameters under `global` and `server_configs` are inherited from the current cluster configuration, so they also take effect in scale-out-pd.yaml.

  2. Run the scale-out command

    ```shell
    tiup cluster scale-out <cluster-name> scale-out-pd.yaml
    ```

    If you see the message “Scaled cluster out successfully”, the scale-out operation has completed successfully.

  3. Check the cluster status

    ```shell
    tiup cluster display <cluster-name>
    ```

    Access the monitoring platform at http://10.0.1.1:3000 using your browser to monitor the status of the cluster and the new node.
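    To double-check that the new PD node has joined the cluster membership, you can also query PD's HTTP API directly. A sketch, assuming the standard `/pd/api/v1/members` endpoint:

    ```shell
    # List PD cluster members; the new node 10.0.1.9 should appear in the output.
    # Any PD client endpoint in the cluster can serve this request.
    curl -s http://10.0.1.2:2379/pd/api/v1/members
    ```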

After the scale-out, the cluster topology is as follows:

| Host IP | Service |
| -------- | ------- |
| 10.0.1.1 | Monitor |
| 10.0.1.2 | PD |
| 10.0.1.3 | PD |
| 10.0.1.4 | PD |
| 10.0.1.5 | TiKV |
| 10.0.1.6 | TiKV |
| 10.0.1.7 | TiKV |
| 10.0.1.8 | TiKV |
| 10.0.1.9 | PD |

Scale in a TiKV cluster

If you want to remove a TiKV node from the 10.0.1.5 host, take the following steps.

You can take similar steps to remove a PD node; specify the PD node ID (for example, 10.0.1.9:2379) instead.

  1. View the node ID information:

    ```shell
    tiup cluster display <cluster-name>
    ```
  2. Run the scale-in command:

    ```shell
    tiup cluster scale-in <cluster-name> --node 10.0.1.5:20160
    ```

    The value of `--node` is the ID of the node to be taken offline, in the form host:port as shown in the `display` output.

    If you see the message “Scaled cluster in successfully”, the scale-in operation has completed successfully.

    Because the scale-in process takes some time, you can also confirm completion by checking the node status: when the status of the node being taken offline becomes Tombstone, the scale-in operation is complete.
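    The wait for the Tombstone status can be scripted by polling PD's store list until the store being removed disappears or reports Tombstone. A sketch, assuming PD's standard `/pd/api/v1/stores` endpoint and this document's topology:

    ```shell
    # Poll PD every 30 seconds until the store on 10.0.1.5 is gone or Tombstone.
    while true; do
      state=$(curl -s http://10.0.1.2:2379/pd/api/v1/stores |
        python3 -c '
    import json, sys
    for s in json.load(sys.stdin)["stores"]:
        if s["store"]["address"] == "10.0.1.5:20160":
            print(s["store"]["state_name"])
    ')
      echo "10.0.1.5:20160 state: ${state:-gone}"
      if [ -z "$state" ] || [ "$state" = "Tombstone" ]; then
        break
      fi
      sleep 30
    done
    ```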

  3. Check the cluster status:

    To check the scale-in status, run the following command:

    ```shell
    tiup cluster display <cluster-name>
    ```

    Access the monitoring platform at http://10.0.1.1:3000 using your browser, and view the status of the cluster.

After the scale-in, the current topology is as follows:

| Host IP | Service |
| -------- | ------- |
| 10.0.1.1 | Monitor |
| 10.0.1.2 | PD |
| 10.0.1.3 | PD |
| 10.0.1.4 | PD |
| 10.0.1.6 | TiKV |
| 10.0.1.7 | TiKV |
| 10.0.1.8 | TiKV |
| 10.0.1.9 | PD |