Linux Learning 28: Kill Processes by Name with a Single Command (killall, kill, pkill)
Preface
A common interview question: how do you use a Linux command to find a process by name and kill it? Three commands are commonly used to kill processes: killall, kill, and pkill.
Several ways to kill a process
The killall command is used to kill processes by name.
killall [parameter] [process name]
The kill command kills the p ...
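For completeness, the same thing can also be done from application code; below is a minimal Java sketch that simply shells out to pkill -f. The class and method names (KillByName, killByName) are illustrative assumptions, not anything from the original article.

import java.io.IOException;

public class KillByName {
    // Illustrative sketch: delegate to the pkill command discussed in the article.
    static int killByName(String name) throws IOException, InterruptedException {
        // pkill -f matches against the full command line of every running process.
        Process p = new ProcessBuilder("pkill", "-f", name).inheritIO().start();
        return p.waitFor();   // pkill exits 0 when at least one process was signalled
    }

    public static void main(String[] args) throws Exception {
        System.out.println("pkill exit code: " + killByName(args[0]));
    }
}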
Added by dcace on Thu, 30 Apr 2020 03:14:06 +0300
Word vector representation: word2vec and word embedding
In NLP tasks, the training data is usually a sentence (Chinese or English), and each step of the input sequence is a single letter. We need to preprocess the data: first one-hot encode these letters, then feed them into the RNN. For example, the letter a is represented as (1, 0, 0, ..., 0) and the letter b as (0, 1, 0, ..., 0). If only the ...
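To make the preprocessing step concrete, here is a minimal Java sketch of one-hot encoding over a 26-letter alphabet; the class and method names (OneHotDemo, oneHot) and the lowercase-only alphabet are assumptions for illustration.

import java.util.Arrays;

public class OneHotDemo {
    // Map a lowercase letter to a one-hot vector over a 26-letter alphabet.
    static int[] oneHot(char letter) {
        int[] vec = new int[26];
        vec[letter - 'a'] = 1;   // only this letter's position is set to 1
        return vec;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(oneHot('a')));  // 1 at index 0, all other entries 0
        System.out.println(Arrays.toString(oneHot('b')));  // 1 at index 1, all other entries 0
    }
}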
Added by Garth Farley on Sun, 26 Apr 2020 07:51:24 +0300
Tutorial on using the SpringBoot series log framework
Following the previous post, An introduction to the log framework of the SpringBoot series and its principles, this post describes the concrete usage of the log framework in more detail and can serve as a usage manual.
1. SpringBoot Log Level
1) Introduction to Log Level
Briefly introd ...
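To make the discussion of levels concrete, here is a minimal sketch of exercising each level through SLF4J from a Spring Boot component (the class name LevelDemo is illustrative); the threshold itself is usually set in application.properties, e.g. logging.level.root=info.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LevelDemo {
    private static final Logger log = LoggerFactory.getLogger(LevelDemo.class);

    public void printLevels() {
        // From lowest to highest priority; which ones appear depends on the configured level.
        log.trace("trace message");
        log.debug("debug message");
        log.info("info message");
        log.warn("warn message");
        log.error("error message");
    }
}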
Added by busnut on Tue, 21 Apr 2020 20:02:24 +0300
Use of log4j
1. Simple use
Only the log4j jar needs to be imported; the log4j.properties configuration file under the src directory is loaded by default.
1. Printing logs to the console
log4j.properties configuration file content
#Set the log level to info and output to the console
log4j.rootLogger=info,console
log4j.appender.console = org ...
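With a configuration like the one above on the classpath, using the logger from Java code is straightforward; a minimal log4j 1.x sketch (the class name Log4jDemo is illustrative):

import org.apache.log4j.Logger;

public class Log4jDemo {
    // log4j 1.x reads log4j.properties from the classpath automatically.
    private static final Logger logger = Logger.getLogger(Log4jDemo.class);

    public static void main(String[] args) {
        logger.info("printed to the console at info level");
        logger.debug("suppressed, because the root level is info");
    }
}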
Added by jay_bo on Mon, 20 Apr 2020 18:53:38 +0300
Hands-on | Writing to Hudi with Spark Streaming
1. Project Background
The traditional data warehouse architecture is designed around the OLAP (Online Analytical Processing) needs of offline data. The common way to import data is to use Sqoop or scheduled Spark jobs to load business database tables into the warehouse one by one. With the increasing demand for real-time data analysis, hou ...
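As context for the pattern the article covers, here is a rough Java sketch of writing each micro-batch of a streaming Dataset to Hudi via foreachBatch; the source, table name, paths, field names, and the Hudi option keys shown are assumptions for illustration, not the article's exact code.

import org.apache.spark.api.java.function.VoidFunction2;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class StreamToHudi {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().appName("stream-to-hudi").getOrCreate();

        // Illustrative source: a rate stream; the article's source would be business data (e.g. from Kafka).
        Dataset<Row> stream = spark.readStream().format("rate").load();

        // Write each micro-batch to a Hudi table through the Hudi Spark datasource.
        VoidFunction2<Dataset<Row>, Long> writeHudi = (batch, batchId) -> batch.write()
                .format("hudi")                                        // datasource short name (assumed available in the Hudi version used)
                .option("hoodie.table.name", "demo_table")             // illustrative table name
                .option("hoodie.datasource.write.recordkey.field", "timestamp")
                .option("hoodie.datasource.write.precombine.field", "timestamp")
                .mode(SaveMode.Append)
                .save("/tmp/hudi/demo_table");                         // illustrative base path

        stream.writeStream()
                .foreachBatch(writeHudi)
                .option("checkpointLocation", "/tmp/hudi/checkpoints")
                .start()
                .awaitTermination();
    }
}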
Added by robh76 on Sun, 19 Apr 2020 03:03:03 +0300
Accumulator monitoring of Flink source code analysis
Let's explore how accumulator monitoring is obtained, following the screenshots. We then find the node.js rendering module, the index.js file, and discover that the data processing and writing of the Flink web front end is quite complicated. Are the developers deliberately writing it so complex that we can't understand it?
3. Know ...
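For context, the values exposed by this monitoring endpoint come from accumulators registered in user code; here is a minimal sketch of registering one in a Flink job (the accumulator name lines-processed and the class name are illustrative):

import org.apache.flink.api.common.accumulators.IntCounter;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

public class CountingMapper extends RichMapFunction<String, String> {
    private final IntCounter counter = new IntCounter();

    @Override
    public void open(Configuration parameters) {
        // Register the accumulator; its value shows up in the web UI / REST monitoring API.
        getRuntimeContext().addAccumulator("lines-processed", counter);
    }

    @Override
    public String map(String value) {
        counter.add(1);   // incremented per record; partial results are merged by the JobManager
        return value;
    }
}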
Added by THW on Fri, 17 Apr 2020 18:19:46 +0300
Using a Kafka message queue in a SpringBoot project
1. Kafka message producer
1) Create the springboot Kafka producer demo project and import the jar package.
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.4.5.RELEASE</version>
</dependency>
2) Add Kafka producer configuration it ...
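Once the dependency above is in place, sending messages typically goes through KafkaTemplate; a minimal sketch of a producer service follows, where the topic name demo-topic and the class name are illustrative, and the template is assumed to be auto-configured from spring.kafka.* properties.

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {
    private final KafkaTemplate<String, String> kafkaTemplate;

    // spring-kafka auto-configures the template from spring.kafka.* properties.
    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String message) {
        kafkaTemplate.send("demo-topic", message);   // illustrative topic name
    }
}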
Added by Patrick3002 on Tue, 14 Apr 2020 18:03:10 +0300
A summary of JMeter's RSA encryption and decryption
First, the encryption script:
import org.apache.commons.codec.binary.Base64;
import java.io.ByteArrayOutputStream;
import java.security.Key;
import java.security.KeyFactory;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;
i ...
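The script is truncated above; for reference, the core of RSA encryption with the JDK looks roughly like the sketch below, where the generated 2048-bit key pair and the Base64 handling are assumptions rather than the article's exact script (JMeter scripts usually load existing keys instead of generating them):

import org.apache.commons.codec.binary.Base64;
import javax.crypto.Cipher;
import java.security.KeyPair;
import java.security.KeyPairGenerator;

public class RsaSketch {
    public static void main(String[] args) throws Exception {
        // Generate an illustrative 2048-bit key pair.
        KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
        generator.initialize(2048);
        KeyPair pair = generator.generateKeyPair();

        // Encrypt with the public key and print the result as Base64 text.
        Cipher cipher = Cipher.getInstance("RSA");
        cipher.init(Cipher.ENCRYPT_MODE, pair.getPublic());
        byte[] encrypted = cipher.doFinal("hello".getBytes("UTF-8"));

        System.out.println(Base64.encodeBase64String(encrypted));
    }
}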
Added by kind on Thu, 09 Apr 2020 17:46:55 +0300
Locating and Solving a Large Backlog of replication WAL znodes in HBase 2.0
Phenomenon
There are two production clusters, A and B, configured for two-way replication with only one active at a time, i.e. the business layer only accesses one cluster at any given moment. Recently a lot of exceptions have been reported in the regionserver log of cluster A, but the monitoring page shows everything is working and functionality is ...
Added by oliverw92 on Wed, 08 Apr 2020 04:40:06 +0300
Simple add, delete, update and search (CRUD) with Lucene 5
I. Preface
Do a simple Lucene CRUD test and record it here.
II. Code and results
1. Index batch generation
Generate several indexes in batch for later testing
Code 1.
package com.cun.test;
import java.nio.file.Paths;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.cn.smart.SmartChineseAnalyzer;
import org.apache.lucene.an ...
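The listing is truncated above; a minimal sketch of what batch index generation typically looks like with the Lucene 5 API is shown below (the index path, field names, and document contents are illustrative):

import java.nio.file.Paths;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.cn.smart.SmartChineseAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class IndexCreate {
    public static void main(String[] args) throws Exception {
        Analyzer analyzer = new SmartChineseAnalyzer();
        Directory dir = FSDirectory.open(Paths.get("/tmp/lucene/index"));   // illustrative path
        IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig(analyzer));

        // Add a few test documents in a batch.
        for (int i = 1; i <= 5; i++) {
            Document doc = new Document();
            doc.add(new TextField("title", "test document " + i, Field.Store.YES));
            writer.addDocument(doc);
        }
        writer.close();   // commits and releases the index lock
    }
}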
Added by louie on Tue, 07 Apr 2020 21:02:16 +0300