Practical data backup and restore for an Elasticsearch cluster using the Elasticdump tool

By Zhu Jiqian

  • Contents
    1. Introduction to the Elasticdump tool
    2. Installing the Elasticdump tool
    3. Using the Elasticdump tool

Recently I did some work on exporting and importing Elasticsearch mapping structures and data. Since even a good memory is no match for written notes, I am recording the process as a blog post so it is not forgotten later.

Anyone who works with Elasticsearch will sooner or later run into the question of how to quickly migrate the mapping structure of an index and how to quickly back up and restore the corresponding data.

This is where Elasticdump, Elasticsearch's import/export tool, comes in: it can back up and restore index structures and data across different Elasticsearch clusters.


1. Introduction to the Elasticdump tool


The npm page for Elasticdump gives an English introduction to the tool. Its logo is amusing: a tool cart that can carry (migrate) things (data), which hints at exactly what Elasticdump does, namely migrating, backing up, and restoring data.

One special requirement when using Elasticdump: if you install it directly with npm install elasticdump -g, your Node.js version must be v10.0.0 or higher, otherwise running the command will fail.

Elasticdump works by sending data from an input to an output, and its standard invocation is

elasticdump --input SOURCE --output DESTINATION [OPTIONS]
  • --input SOURCE specifies the data source SOURCE to read from.
  • --output DESTINATION specifies the destination DESTINATION that the data is written to.
  • SOURCE and DESTINATION can each be an Elasticsearch URL or a file. If it is an Elasticsearch URL, for example http://127.0.0.1/index, index-related data is imported from or exported to the ES instance at http://127.0.0.1.
  • [OPTIONS] are additional options; --type and --limit are the most commonly used, and the others are not covered here.

--type is the type of ES data to export or import. The Elasticdump tool supports importing and exporting the following data types:

  • mapping: the index mapping structure of ES
  • data: the document data in ES
  • settings: the default index settings of ES
  • analyzer: the ES analyzers
  • template: the ES index templates
  • alias: the ES index aliases
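
As an illustration of the type option, here is a minimal sketch of exporting only the index settings to a file (the host, the index name test_event, and the output path are the same placeholders used in the examples later in this post):

elasticdump --input=http://127.0.0.1:9200/test_event --output=/opt/test_event_settings.json --type=settings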

--limit is the number of objects moved per batch from SOURCE to DESTINATION; it defaults to 100 and can be customized.
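
For instance, a hedged sketch that combines --type and --limit to export data in batches of 1000 documents per request (the value 1000 is only illustrative):

elasticdump --input=http://127.0.0.1:9200/test_event --output=/opt/data.json --type=data --limit=1000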


2. Installing the Elasticdump tool

1. Installing the Elasticdump tool online requires Node.js, so first install Node v10.0.0 or above.

[root@zhu opt]# wget https://nodejs.org/dist/v12.18.3/node-v12.18.3-linux-x64.tar.xz
[root@zhu opt]# tar xvf  node-v12.18.3-linux-x64.tar.xz -C /usr/local/
[root@zhu opt]# mv /usr/local/node-v12.18.3-linux-x64 /usr/local/nodejs
[root@zhu opt]# echo 'export NODEJS_HOME=/usr/local/nodejs' >> /etc/profile
[root@zhu opt]# echo 'export PATH=$PATH:$NODEJS_HOME/bin' >> /etc/profile
[root@zhu opt]# echo 'export NODEJS_PATH=$NODEJS_HOME/lib/node_modules' >> /etc/profile
[root@zhu opt]# source /etc/profile
[root@zhu opt]# ln -s /usr/local/nodejs/bin/node /usr/local/bin/node
[root@zhu opt]# ln -s /usr/local/nodejs/bin/npm /usr/local/bin/npm
[root@zhu opt]# npm -v
6.14.6
[root@zhu opt]# node -v
v12.18.3

2. Install elasticdump via npm

[root@zhu opt]# npm install elasticdump -g

After a successful installation, change into the directory where it was installed:

[root@zhu opt]# cd /usr/local/nodejs/lib/node_modules/elasticdump/bin

You can see two commands, elasticdump to back up a single index and multielasticdump to back up multiple indexes in parallel:

[root@zhu bin]# ll
total 20
-rwxr-xr-x. 1 1001 1001  4026 Sep  4 14:38 elasticdump
-rwxr-xr-x. 1 1001 1001 14598 Oct 26  1985 multielasticdump
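
If the two commands are not yet visible on the PATH, they can be symlinked into /usr/local/bin in the same way node and npm were above; this sketch assumes npm placed its global bin links under /usr/local/nodejs/bin for this prefix:

[root@zhu bin]# ln -s /usr/local/nodejs/bin/elasticdump /usr/local/bin/elasticdump
[root@zhu bin]# ln -s /usr/local/nodejs/bin/multielasticdump /usr/local/bin/multielasticdump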

3. Using the Elasticdump tool

Use elasticdump for single-index backup operations:


- Export the mapping structure of index test_event:

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event  --output=/opt/test_event_mapping.json --type=mapping 

Check the current directory and you will find the mapping has been backed up to a JSON file:

[root@zhu opt]# ll
total 14368
-rw-r--r--. 1 root root     6200 Sep  4 11:30 test_event_mapping.json

You can also import the mapping directly into another ES cluster:

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event   --output=http://127.0.0.2:9200/test_event --type=mapping
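
If the target cluster requires authentication, basic-auth credentials can be embedded in the URL; here is a sketch with a placeholder user and password:

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event --output=http://user:password@127.0.0.2:9200/test_event --type=mapping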

- Export the data of index test_event:

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event  --output=/opt/data.json --type=data

Similarly, you can import the backed-up data directly into another ES cluster:

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event   --output=http://127.0.0.2:9200/test_event --type=data
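
If only part of an index needs to be exported, the --searchBody option accepts an Elasticsearch query body; a hedged sketch (the field name status and its value are placeholders):

[root@zhu opt]# elasticdump --input=http://127.0.0.1:9200/test_event --output=/opt/data_subset.json --type=data --searchBody='{"query":{"term":{"status":"active"}}}'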

Use elasticdump for single-index restore operations:

- Mapping structure restore:

[root@zhu opt]# elasticdump --input=/opt/test_event_mapping.json --output=http://127.0.0.1:9200/ --type=mapping

- Data restore:

[root@zhu opt]# elasticdump --input=/opt/data.json    --output=http://127.0.0.1:9200/test_event    --type=data
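
Since the destination index is taken from the --output URL, the same dump file can also be loaded into an index with a different name, for example to create a test copy (test_event_copy is a placeholder name):

[root@zhu opt]# elasticdump --input=/opt/data.json --output=http://127.0.0.1:9200/test_event_copy --type=data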

Use multielasticdump for multi-index backup operations:

# Back up all indexes in ES, with all their types, into the es_backup folder
multielasticdump --direction=dump --match='^.*$' --input=http://127.0.0.1:9200 --output=/tmp/es_backup
# Back up only the ES indexes whose names end with "-index" (matched by the regular expression), and only their data; all other types are ignored.
# Note: the analyzer and alias types are ignored by default.
multielasticdump --direction=dump --match='^.*-index$' --input=http://127.0.0.1:9200 --ignoreType='mapping,settings,template' --output=/tmp/es_backup

Use multielasticdump for multi-index restore operations:

multielasticdump --direction=load --input=/tmp/es_backup --output=http://127.0.0.1:9200
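
As noted below, --match also works in the load direction, so a sketch like the following would restore only the indexes whose names match the regular expression (the pattern is a placeholder):

multielasticdump --direction=load --match='^test_.*$' --input=/tmp/es_backup --output=http://127.0.0.1:9200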

According to the English introduction on the elasticdump npm page, a few points about multielasticdump's parameters, in particular --direction and --ignoreType, are worth noting:
  • When backing up, --direction=dump is the default. --input must then be the base URL of the Elasticsearch server (that is, http://localhost:9200) and --output must be a directory. A data, mapping, and analyzer file is created for each matching index.

  • When restoring, to load files dumped by multielasticdump, --direction must be set to load, --input must be the directory of the multielasticdump dump, and --output must be the Elasticsearch server URL.

  • --match is used to filter which indexes should be dumped/loaded (a regular expression).

  • --ignoreType allows types to be excluded from the dump/load. Six options are supported: data, mapping, analyzer, alias, settings, template. Multiple types can be given, separated by commas. In addition, --interval controls the time interval between the dump/load of successive indexes.

  • --includeType allows types to be included in the dump/load. The same six options are supported: data, mapping, analyzer, alias, settings, template.
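
For example, here is a sketch of a dump that keeps only the data and mapping types for indexes ending in -index, the mirror image of the --ignoreType example above:

multielasticdump --direction=dump --match='^.*-index$' --includeType='data,mapping' --input=http://127.0.0.1:9200 --output=/tmp/es_backup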
