Practice content
Please record and share the following:
- 1 (required) Use mysqldump to synchronize the table structure and data of MySQL to a MySQL tenant of OceanBase.
- 2 (required) Use DataX to configure offline synchronization of at least one table from MySQL to a MySQL tenant of OceanBase.
1. Basic environment
MySQL, stand-alone: 192.168.237.137:22
OceanBase, zone1 single replica: 192.168.237.132
2. Create test table
mysql> CREATE DATABASE IF NOT EXISTS RUNOOB DEFAULT CHARSET utf8 COLLATE utf8_general_ci;
mysql> create table Student(Sno char(9) primary key, Sname char(20) not null, Ssex char(2), Sage smallint, Sdept char(20));
Add data to the Student table:
mysql> insert into Student(Sno,Sname,Ssex,Sage,Sdept) values('201215121','Li Yong','male','20','CS');
mysql> insert into Student(Sno,Sname,Ssex,Sage,Sdept) values('201215122','Liu Chen','female','19','CS');
mysql> insert into Student(Sno,Sname,Ssex,Sage,Sdept) values('201215123','Wang Min','female','18','MA');
mysql> insert into Student(Sno,Sname,Ssex,Sage,Sdept) values('201215125','Zhang Li','male','19','IS');
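To confirm the rows landed, a quick count in the same mysql session (it should return 4, matching the four inserts above):

```sql
mysql> select count(*) from Student;
```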
3. Export the file (required for exercise 1)
mysqldump -h 127.1 -uroot -P3306 -p -d mxdata > mx_ddl.sql
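Note that `-d` dumps table structure only. Exercise 1 also requires the data, so, as a sketch reusing the same connection options (file names here are examples), a second dump without `-d` covers the rows as well:

```
mysqldump -h 127.1 -uroot -P3306 -p mxdata > mx_full.sql    # structure + data
```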
4. Import the data (I would rather skip this step because of the many errors, but the official exercise requires it)
MySQL [mxdata] > source mx_ddl.sql
The exported script has several characteristics:
- The view definitions are also included, wrapped in /*! ... */ comments. We are not concerned with views here; this part can be deleted.
- There is some special syntax that the OceanBase MySQL mode does not support. Most of it is harmless, but some of it needs to be replaced, e.g. the SQL_NOTES variable and DEFINER clauses.
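As a minimal cleanup sketch (the DEFINER and SQL_NOTES patterns are assumptions based on the notes above, and a small demo file stands in for the real dump), sed can strip both before the import:

```shell
# Demo input standing in for mx_ddl.sql: a view with a DEFINER clause
# and a SQL_NOTES settings line, plus one ordinary table definition.
printf '%s\n' \
  'CREATE ALGORITHM=UNDEFINED DEFINER=`root`@`localhost` VIEW v1 AS SELECT 1;' \
  '/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */;' \
  'CREATE TABLE t1 (id INT);' > mx_ddl.sql

# Remove DEFINER=`user`@`host` clauses and drop lines touching SQL_NOTES.
sed -i \
  -e 's/DEFINER=`[^`]*`@`[^`]*` *//g' \
  -e '/SQL_NOTES/d' \
  mx_ddl.sql

cat mx_ddl.sql
```

On the real dump, review the result before sourcing it into the OceanBase tenant; these two patterns may not cover every unsupported statement.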
5. Install DataX (pay attention to pom.xml; if a module fails to compile, comment it out)
5.1 system requirements:
Linux
JDK (1.8 or above; 1.8 recommended)
Python (2.6.x recommended)
Apache Maven 3.x (Compile DataX)
The official requirements are basically fine. However, CentOS 6.9 is recommended for Linux, because the Python version it ships with is 2.6.6. Maven just needs configuring: Maven 3.2.5 is recommended; 3.8.1, which I used at first, produced too many errors.
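The version list above can be checked up front. This sketch only reports which of the required tools are on PATH and never fails, so it is safe to run on any host:

```shell
# Check the DataX build toolchain; missing tools are reported, not fatal.
for tool in java python mvn git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: NOT FOUND"
  fi
done | tee toolchain_check.txt
```

Run `java -version`, `python -V`, and `mvn -v` by hand afterwards to compare the exact versions against the list above.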
5.2. Download DataX source code:
$ git clone git@github.com:alibaba/DataX.git
Use git to clone the source code; see other documents for details. BTW: access to GitHub can be unstable, so just retry on failure.
5.3. Package with Maven:
$ cd {DataX_source_code_home}
$ mvn -U clean package assembly:assembly -Dmaven.test.skip=true
5.3.1. Remember to configure a domestic mirror by modifying Maven's settings.xml:
<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>central</mirrorOf>
  <name>Nexus aliyun</name>
  <url>https://maven.aliyun.com/repository/central</url>
</mirror>
5.3.2. Common error (a dependency cannot be found):
[ERROR] Failed to execute goal on project hdfsreader: Could not resolve dependencies for project com.alibaba.datax:hdfsreader:jar:0.0.1-SNAPSHOT: Could not find artifact org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in nexus-aliyun (https://maven.aliyun.com/repository/central) -> [Help 1]
From the message, a dependency of hdfsreader is missing, and the matching hdfswriter will not build either. Since I have no use for these two components, I found them in pom.xml and commented them out.
There is also oscarwriter, which likewise needs to be commented out in pom.xml:
[ERROR] Failed to execute goal on project oscarwriter: Could not resolve dependencies for project com.alibaba.datax:oscarwriter:jar:0.0.1-SNAPSHOT: Could not find artifact com.oscar:oscar:jar:7.0.8 at specified path /opt/DataX/oscarwriter/src/main/lib/oscarJDBC.jar -> [Help 1]
5.4. After that, the build succeeds:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:43 min
[INFO] Finished at: 2021-05-28T00:42:50+08:00
[INFO] Final Memory: 305M/492M
[INFO] ------------------------------------------------------------------------
5.5 Configure the DataX job JSON file connecting MySQL and OceanBase
{
  "job": {
    "setting": {
      "speed": { "channel": 4 },
      "errorLimit": { "record": 0, "percentage": 0.1 }
    },
    "content": [
      {
        "reader": {
          "name": "mysqlreader",
          "parameter": {
            "username": "datax",
            "password": "root123",
            "column": ["*"],
            "connection": [
              {
                "table": ["lcxtest1"],
                "jdbcUrl": ["jdbc:mysql://192.168.32.130:3306/lcxtest?useUnicode=true&characterEncoding=utf8&useSSL=false"]
              }
            ]
          }
        },
        "writer": {
          "name": "oceanbasev10writer",
          "parameter": {
            "obWriteMode": "insert",
            "column": ["*"],
            "preSql": ["truncate table lcxtest1"],
            "connection": [
              {
                "jdbcUrl": "||_dsc_ob10_dsc_||obcluster:test_tenant||_dsc_ob10_dsc_||jdbc:oceanbase://192.168.32.111:2883/lcxtest?useLocalSessionState=true&allowBatch=true&allowMultiQueries=true&rewriteBatchedStatements=true",
                "table": ["lcxtest1"]
              }
            ],
            "username": "root",
            "password": "root123",
            "writerThreadCount": 10,
            "batchSize": 1000,
            "memstoreThreshold": "0.9"
          }
        }
      }
    ]
  }
}
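DataX aborts immediately on malformed job JSON, so it is worth validating the file before launching. This self-contained sketch writes a tiny job skeleton as a demo; on the real host, point the check at your actual job file instead:

```shell
# Write a minimal demo skeleton (stands in for the real job file).
cat > demo_job.json <<'EOF'
{ "job": { "setting": { "speed": { "channel": 4 } }, "content": [] } }
EOF

# json.tool exits non-zero on a parse error, so this only prints
# "valid JSON" when the file parses cleanly.
python3 -m json.tool demo_job.json > /dev/null && echo "valid JSON" | tee check.txt
```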
Note: the following are required here:
- On MySQL, add a synchronization account (the "datax" user can be named as you like).
- On OceanBase, set the tenant account's password.
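A sketch of those two account steps, with names and passwords taken from the job file above (adjust them to your own values):

```sql
-- On the source MySQL: create the sync account used by mysqlreader
CREATE USER 'datax'@'%' IDENTIFIED BY 'root123';
GRANT SELECT ON lcxtest.* TO 'datax'@'%';
FLUSH PRIVILEGES;

-- On the OceanBase MySQL tenant: set the password used by oceanbasev10writer
ALTER USER root IDENTIFIED BY 'root123';
```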
Start DataX for synchronization:
./datax.py ../job/mysql2ob.json (the job file name is your own choice)
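Once the job reports success, a rough correctness check is to compare row counts on both sides (database and table names are taken from the job file above):

```sql
-- Run on the source MySQL and again on the OceanBase tenant;
-- the two counts should match.
select count(*) from lcxtest.lcxtest1;
```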