[Hadoop] Building a Hadoop 3.X pseudo-distributed setup on a Mac

Contents: I. Homebrew installation; II. Passwordless SSH login configuration; III. Hadoop installation; IV. Pseudo-distributed configuration: a. hadoop-env.sh, b. core-site.xml, c. hdfs-site.xml, d. mapred-site.xml, e. yarn-site.xml; V. Startup and operation; VI. Testing WordCo ...
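As a taste of what the configuration chapters (IV.b and IV.c) cover, here is a minimal sketch of core-site.xml and hdfs-site.xml for a single-node pseudo-distributed setup; the localhost NameNode address and a replication factor of 1 are the standard values for this mode, while everything else in the article stays as it is.

    <!-- core-site.xml: point the default filesystem at a single local NameNode -->
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>

    <!-- hdfs-site.xml: only one DataNode, so keep a single replica per block -->
    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
    </configuration>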

Added by cqinzx on Tue, 11 Jan 2022 06:40:01 +0200

2. Command-line operations, API operations, and the Zookeeper election mechanism for a Zookeeper cluster

3. Zookeeper cluster operation 3.1 Cluster operation 3.1.1 Installing Zookeeper on the cluster We covered installing Zookeeper locally earlier; the basic steps are similar, but after installing it on one host of the cluster we still need to distribute the installation files to the other hosts, and the number of the configuration ...
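For the distribution step the snippet alludes to, a minimal sketch of what every host ends up with: an identical zoo.cfg listing all servers, plus a per-host myid file holding that host's own number. The hostnames and data directory below are illustrative assumptions, not values from the article.

    # zoo.cfg -- identical on every host; server.N maps an id to a host
    tickTime=2000
    initLimit=10
    syncLimit=5
    dataDir=/opt/zookeeper/data
    clientPort=2181
    server.1=node1:2888:3888
    server.2=node2:2888:3888
    server.3=node3:2888:3888

    # on each host, write that host's own id into dataDir/myid
    echo 1 > /opt/zookeeper/data/myid    # 1 on node1, 2 on node2, 3 on node3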

Added by ericburnard on Mon, 10 Jan 2022 22:11:40 +0200

Implementing cross-network-segment access to CDSW with Nginx

90.1 Demonstration environment Implementation process: CM and CDH version: 5.13.1; CDSW and CDH cluster operating system: RedHat 7.2; Nginx server: RedHat 6.4; Livy version: 0.4; Nginx version: 1.8.6 90.2 Operation demonstration 1. Install the DNS service and configure wildcard (pan) domain name resolution The DNS service is mainly used to resolve the ...
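CDSW serves each user session on its own subdomain, which is why the article pairs Nginx with wildcard (pan) DNS resolution. A rough Nginx sketch of that idea; the domain name and the CDSW master address are made-up placeholders, not values from the article.

    # proxy every *.cdsw.example.com request across segments to the CDSW master
    server {
        listen       80;
        server_name  cdsw.example.com *.cdsw.example.com;
        location / {
            proxy_pass http://192.168.0.100:80;
            # keep the original Host header so CDSW can route per-session subdomains
            proxy_set_header Host $host;
        }
    }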

Added by ambivalent on Mon, 10 Jan 2022 04:25:48 +0200

The road to Hadoop mastery -- 06 -- Flume log collection

Flume log collection 1. Introduction to Flume The logo depicts a log of wood (the data) being carried from one place (the data source) to another (the data destination) along a river channel (the channel). Official documentation 2. Installation and configuration of Flume 1. Download and upload to the server 2. Decompress 3. Configure: copy fl ...
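To make the source/channel/sink picture concrete, here is a minimal single-agent Flume configuration sketch; the agent name, tailed file, and HDFS path are illustrative assumptions rather than the article's values.

    # agent a1: tail a log file, buffer events in memory, deliver them to HDFS
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1
    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -F /var/log/app.log
    a1.channels.c1.type = memory
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs://localhost:9000/flume/events
    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
    # start with: flume-ng agent --conf conf --conf-file flume.conf --name a1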

Added by future_man on Sun, 09 Jan 2022 10:17:28 +0200

Zookeeper high-availability cluster & the distributed message queue Kafka | Cloud computing

1. Set up a Zookeeper cluster 1.1 Problem This case requires: set up a Zookeeper cluster with 1 leader, 2 followers, and 1 observer. 1.2 Steps To implement this case, follow these steps. Step 1: install Zookeeper 1) Edit /etc/hosts so that all cluster hosts can ping each other (configured on hadoop1 and synchronized to node-0001, ...
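The observer differs from the voting members only in how it is marked in the configuration; a sketch of the relevant zoo.cfg lines (hostnames follow the node-0001 naming above but are otherwise assumptions).

    # voting members: the leader is elected among these
    server.1=node-0001:2888:3888
    server.2=node-0002:2888:3888
    server.3=node-0003:2888:3888
    # observer: receives updates and serves reads, but does not vote in elections
    server.4=node-0004:2888:3888:observer
    # and in zoo.cfg on the observer host itself:
    peerType=observer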

Added by biohazardep on Wed, 05 Jan 2022 11:08:14 +0200

Building a Hadoop high-availability (HA) cluster: semester summary (please correct any errors)

(1) Preparation: Linux: CentOS-7 x86_64, official download address: https://archive.kernel.org/centos-vault/7.6.1810/isos/x86_64/ Virtual machine: VMware Workstation 16 Pro (everything you need is on that site) Hadoop installation package version: hadoop-2.7.4, official download address: https://archive.apache.org/dist/hadoop/co ...

Added by matty on Wed, 05 Jan 2022 08:54:31 +0200

Hadoop cluster construction

1. Hadoop cluster construction 1. Install the virtual machine 1. Install vm-tools: hadoop@ubuntu:~$ sudo apt-get install open-vm-tools-desktop -y 2. Install the vim editor: hadoop@ubuntu:~$ sudo apt install vim 2. Install the JDK 1. Unzip the installation package: hadoop@ubuntu:~$ sudo tar -zxvf jdk-8u171-linux-x64.tar.gz -C /usr/local 2. Modify e ...
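The step cut off at "Modify e ..." is presumably the usual environment-variable edit that follows unzipping the JDK; a sketch under that assumption (the directory name matches what the jdk-8u171 archive typically unpacks to).

    # append to /etc/profile (or ~/.bashrc), then re-source the file
    export JAVA_HOME=/usr/local/jdk1.8.0_171
    export PATH=$PATH:$JAVA_HOME/bin
    # verify the JDK is picked up
    source /etc/profile
    java -version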

Added by nano on Wed, 05 Jan 2022 03:46:35 +0200

The Linux command-line power tool: lsof

Brief introduction lsof (list open files) is a tool that lists the files currently open on the system. In the Linux environment, everything exists in the form of a file; through files you can access not only conventional data but also network connections and hardware. Therefore, for example, Transmission Control Protocol (TCP) and User Datagram ...
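A few common invocations that illustrate the "everything is a file" point; all of these are standard lsof options, with the file, port, user, and PID chosen purely as examples.

    lsof /var/log/syslog    # processes that have this regular file open
    lsof -i :22             # process listening on TCP/UDP port 22
    lsof -i tcp             # all open TCP connections
    lsof -u hadoop          # files opened by the user hadoop
    lsof -p 1234            # files opened by process id 1234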

Added by nubby on Tue, 04 Jan 2022 10:40:20 +0200

[Hadoop assignment] Calling MapReduce to count the number of occurrences of each word in a file

1. Environment introduction Install an Ubuntu virtual machine using VirtualBox. Install Hadoop and the Eclipse compiler in Ubuntu. Download and install the Java environment: download the JDK and complete Hadoop's pseudo-distributed environment configuration. Import all the JAR packages the compiler requires into Eclipse. Start Had ...
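Independent of the Eclipse setup, the same word count can be reproduced from the command line with Hadoop's bundled example job; a sketch assuming a running pseudo-distributed HDFS, with illustrative input and output paths.

    # upload the input file, run the stock WordCount job, read the result
    hdfs dfs -mkdir -p /user/hadoop/input
    hdfs dfs -put wordfile.txt /user/hadoop/input
    hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \
        wordcount /user/hadoop/input /user/hadoop/output
    hdfs dfs -cat /user/hadoop/output/part-r-00000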

Added by IRON FART on Tue, 04 Jan 2022 09:13:29 +0200

A first look at the three installation modes of Hadoop

A first look at the three installation modes of Hadoop Features: high reliability (data is not easily lost), high efficiency (fast processing), high fault tolerance ps: the Hadoop version used below is Hadoop 2.8.5; although Hadoop has already been updated to 3.X, we always adhere to the view of "using the old instead of th ...

Added by chaser7016 on Tue, 04 Jan 2022 03:14:13 +0200