Building the Hadoop Runtime Environment

1.1 Preparing the template virtual machine environment

0) Install the template virtual machine with IP address 192.168.10.100, hostname hadoop100, 4 GB of memory, and a 50 GB hard disk

1) The hadoop100 virtual machine should be configured as follows (this article uses CentOS-7.5-x86-1804 as the example Linux system)
(1) Installing packages with yum requires that the virtual machine can access the Internet. Test network connectivity before running yum:

[root@hadoop100 ~]# ping www.baidu.com
PING www.baidu.com (14.215.177.39) 56(84) bytes of data.
64 bytes from 14.215.177.39 (14.215.177.39): icmp_seq=1 ttl=128 time=8.60 ms
64 bytes from 14.215.177.39 (14.215.177.39): icmp_seq=2 ttl=128 time=7.72 ms

(2) Install epel-release
Note: Extra Packages for Enterprise Linux (EPEL) is an additional package repository for Red Hat-family operating systems, applicable to RHEL, CentOS, and Scientific Linux. It provides RPM packages that are not found in the official repositories.

[root@hadoop100 ~]# yum install -y epel-release
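
To confirm that the EPEL repository is now available, you can list the configured repositories (a quick sanity check; the repository id is normally epel):

[root@hadoop100 ~]# yum repolist | grep -i epel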

(3) Note: if you installed the minimal version of the Linux system, you also need to install the following tools; if you installed the standard desktop edition, you can skip this step.
- net-tools: a collection of networking utilities, including ifconfig and other commands

[root@hadoop100 ~]# yum install -y net-tools 

- vim: text editor

[root@hadoop100 ~]# yum install -y vim

2) Turn off the firewall and disable it from starting at boot

[root@hadoop100 ~]# systemctl stop firewalld
[root@hadoop100 ~]# systemctl disable firewalld.service
Note: in enterprise development, the firewall on individual servers is usually turned off; the company secures the network as a whole with a dedicated firewall.
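
To double-check that the firewall is stopped and will not start again after a reboot, query its state:

[root@hadoop100 ~]# systemctl status firewalld
[root@hadoop100 ~]# systemctl is-enabled firewalld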

3) Create the atguigu user and set its password

[root@hadoop100 ~]# useradd atguigu
[root@hadoop100 ~]# passwd atguigu
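
You can confirm that the user was created with the id command:

[root@hadoop100 ~]# id atguigu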

4) Give the atguigu user root privileges so that it can later run commands with sudo

[root@hadoop100 ~]# vim /etc/sudoers

Edit the /etc/sudoers file and add a line below the %wheel line, as follows:

## Allow root to run any commands anywhere
root    ALL=(ALL)     ALL

## Allows people in group wheel to run all commands
%wheel  ALL=(ALL)       ALL
atguigu   ALL=(ALL)     NOPASSWD:ALL

Note: do not place the atguigu line directly under the root line. sudoers rules are applied in order and the last matching rule wins, so if the password-free rule for atguigu came before the %wheel line, the %wheel rule (which also covers atguigu) would override it and a password would be required again. Therefore the atguigu line must go below the %wheel line.
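
To check that the password-free rule works, switch to the atguigu user and run a command that needs root privileges; no password prompt should appear:

[root@hadoop100 ~]# su - atguigu
[atguigu@hadoop100 ~]$ sudo ls /root
[atguigu@hadoop100 ~]$ exit
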
5) Create folders in the /opt directory and change their owner and group
(1) Create the module and software folders in the /opt directory

[root@hadoop100 ~]# mkdir /opt/module
[root@hadoop100 ~]# mkdir /opt/software

(2) Change the owner and group of the module and software folders to the atguigu user

[root@hadoop100 ~]# chown atguigu:atguigu /opt/module 
[root@hadoop100 ~]# chown atguigu:atguigu /opt/software

(3) View the owner and group of the module and software folders

[root@hadoop100 ~]# cd /opt/
[root@hadoop100 opt]# ll
total 12
drwxr-xr-x. 2 atguigu atguigu 4096 May 28 17:18 module
drwxr-xr-x. 2 root    root    4096 Sep  7 2017 rh
drwxr-xr-x. 2 atguigu atguigu 4096 May 28 17:18 software

6) Uninstall the JDK that ships with the virtual machine
Note: if your virtual machine is a minimal installation, you can skip this step.

[root@hadoop100 ~]# rpm -qa | grep -i java | xargs -n1 rpm -e --nodeps 

- rpm -qa: query all installed RPM packages
- grep -i: ignore case when matching
- xargs -n1: pass one argument at a time to the following command
- rpm -e --nodeps: remove the packages without checking dependencies
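
After the removal, you can confirm that no Java packages remain (the command should print nothing):

[root@hadoop100 ~]# rpm -qa | grep -i java
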
7) Restart the virtual machine

[root@hadoop100 ~]# reboot

1.2 Cloning the virtual machines

1) Using the template machine hadoop100, clone three virtual machines: hadoop102, hadoop103, and hadoop104
Note: shut down hadoop100 before cloning
2) Modify the IP address of each clone; hadoop102 is used as the example below
(1) Modify the static IP of the cloned virtual machine

[root@hadoop100 ~]# vim /etc/sysconfig/network-scripts/ifcfg-ens33
Change it to:
DEVICE=ens33
TYPE=Ethernet
ONBOOT=yes
BOOTPROTO=static
NAME="ens33"
IPADDR=192.168.10.102
PREFIX=24
GATEWAY=192.168.10.2
DNS1=192.168.10.2

(2) Check the VMware virtual network editor: Edit -> Virtual Network Editor -> VMnet8

(3) Check the IP address of the Windows adapter VMware Network Adapter VMnet8

(4) Make sure that the IP address in the Linux ifcfg-ens33 file, the address in the virtual network editor, and the Windows VMnet8 adapter address are all on the same network segment (192.168.10.x).
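
If you want the new address to take effect without waiting for the reboot in step 4, restarting the legacy network service on CentOS 7 also reloads the ifcfg file, and you can then inspect the interface:

[root@hadoop100 ~]# systemctl restart network
[root@hadoop100 ~]# ifconfig ens33
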
3) Modify the hostname of the clone; hadoop102 is used as the example below
(1) Modify host name

[root@hadoop100 ~]# vim /etc/hostname
hadoop102
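
Equivalently, on CentOS 7 you can set the hostname with hostnamectl instead of editing the file by hand:

[root@hadoop100 ~]# hostnamectl set-hostname hadoop102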

(2) Open the Linux hosts configuration file /etc/hosts

[root@hadoop100 ~]# vim /etc/hosts
Add the following content:
192.168.10.100 hadoop100
192.168.10.101 hadoop101
192.168.10.102 hadoop102
192.168.10.103 hadoop103
192.168.10.104 hadoop104
192.168.10.105 hadoop105
192.168.10.106 hadoop106
192.168.10.107 hadoop107
192.168.10.108 hadoop108
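
Once the hosts file is in place and the other clones have been configured and started, you can test name resolution with ping, for example:

[root@hadoop100 ~]# ping -c 2 hadoop103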

4) Restart the clone machine hadoop102

[root@hadoop100 ~]# reboot
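
After the reboot, log back in and confirm that both the new hostname and the static IP have taken effect:

[root@hadoop102 ~]# hostname
[root@hadoop102 ~]# ifconfig ens33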

5) Modify the Windows host mapping file (hosts file)
(1) If the operating system is Windows 7, you can edit the file directly
① Go to the path C:\Windows\System32\drivers\etc
② Open the hosts file, add the following content, and save

192.168.10.100 hadoop100
192.168.10.101 hadoop101
192.168.10.102 hadoop102
192.168.10.103 hadoop103
192.168.10.104 hadoop104
192.168.10.105 hadoop105
192.168.10.106 hadoop106
192.168.10.107 hadoop107
192.168.10.108 hadoop108

(2) If the operating system is Windows 10, copy the file out first, modify and save it, and then copy it back
① Go to the path C:\Windows\System32\drivers\etc
② Copy the hosts file to the desktop
③ Open the desktop hosts file and add the following content

192.168.10.100 hadoop100
192.168.10.101 hadoop101
192.168.10.102 hadoop102
192.168.10.103 hadoop103
192.168.10.104 hadoop104
192.168.10.105 hadoop105
192.168.10.106 hadoop106
192.168.10.107 hadoop107
192.168.10.108 hadoop108

④ Copy the desktop hosts file back to the C:\Windows\System32\drivers\etc path, overwriting the original hosts file
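
With the mapping in place and the virtual machine running (its firewall was turned off earlier), you can verify the entry from a Windows command prompt:

C:\> ping hadoop102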
