Detailed explanation and implementation of the autoencoder model (implemented with TensorFlow 2.x)
Learning latent variables with an autoencoder
Because the high-dimensional input space contains many redundancies, it can be compressed into a few low-dimensional variables. The autoencoder was first introduced by Geoffrey Hinton et al. in the 1980s. Similar to the techniques used to reduce input dimensionality in traditional machine lea ...
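The idea in this excerpt, compressing redundant high-dimensional input into a low-dimensional latent code and reconstructing it, can be sketched even without TensorFlow. Below is a minimal linear autoencoder trained with plain NumPy gradient descent; the dimensions, learning rate and iteration count are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Redundant 4-D data that really lives on a 2-D subspace
Z_true = rng.normal(size=(200, 2))
X = Z_true @ rng.normal(size=(2, 4))          # shape (200, 4)

# Linear autoencoder: encoder W_e (4x2), decoder W_d (2x4)
W_e = rng.normal(scale=0.1, size=(4, 2))
W_d = rng.normal(scale=0.1, size=(2, 4))
lr = 0.01

for _ in range(1000):
    Z = X @ W_e                               # latent code, shape (200, 2)
    X_hat = Z @ W_d                           # reconstruction
    err = X_hat - X
    # Gradients of the squared reconstruction error
    grad_Wd = Z.T @ err / len(X)
    grad_We = X.T @ (err @ W_d.T) / len(X)
    W_d -= lr * grad_Wd
    W_e -= lr * grad_We

mse = np.mean((X @ W_e @ W_d - X) ** 2)       # should be far below the data variance
```

A deep autoencoder simply inserts nonlinear hidden layers into the encoder and decoder; for data that truly lies on a linear subspace, as here, the linear version already reconstructs well.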
Added by tauchai83 on Thu, 10 Feb 2022 08:43:45 +0200
Machine learning - Data Preprocessing
Data preprocessing has its own characteristics for each task, and redundant or invalid data must be screened out according to the data format. Data preprocessing is roughly divided into three steps: data preparation, data conversion and data output. Data processing is not only a basic link of systems engineering, but also an effective means to impr ...
Added by niesom on Thu, 10 Feb 2022 07:14:40 +0200
Exploratory Data Analysis EDA (Exploratory Data Analysis) analysis with python
With great respect to the Python community, for their dedication and wisdom.
Dataset related:
First, the UCI wine dataset:
The UCI datasets are commonly used standard test datasets for machine learning; it is a repository for machine learning proposed by the University of ...
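A first EDA pass over any such dataset usually starts with per-feature summary statistics. A minimal sketch on a toy feature matrix (not the actual UCI wine data):

```python
import numpy as np

# Toy stand-in for a numeric feature matrix (rows = samples, cols = features)
X = np.array([[13.2, 1.78],
              [13.1, 2.10],
              [12.4, 0.90],
              [14.0, 1.50]])

print("mean:", X.mean(axis=0))   # per-feature mean
print("std :", X.std(axis=0))    # per-feature standard deviation
print("min :", X.min(axis=0))
print("max :", X.max(axis=0))
```

With pandas, `DataFrame.describe()` produces the same summary in one call.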
Added by iacataca on Thu, 10 Feb 2022 05:31:49 +0200
Feature Engineering - normalization, standardization, dimensionality reduction 02
Standard scaling: 1. Normalization
from sklearn.preprocessing import MinMaxScaler

def mm():
    """
    Normalization
    :return:
    """
    mm = MinMaxScaler(feature_range=(2, 3))
    data = mm.fit_transform([[90, 2, 10, 40], [60, 4, 15, 45], [75, 3, 13, 46]])
    print(data)
[[3.         2.         2.         2.        ]
 [2.         3.         3.         2.83333333]
 [2.5        2.5        2.6        3.        ]]
Scaled to [2, 3]. Th ...
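The printed matrix can be reproduced by hand: min-max scaling to a range [a, b] applies x' = a + (b − a)·(x − min)/(max − min) column by column. A sketch without sklearn:

```python
import numpy as np

X = np.array([[90, 2, 10, 40],
              [60, 4, 15, 45],
              [75, 3, 13, 46]], dtype=float)

a, b = 2, 3                       # target range, as in feature_range=(2, 3)
col_min = X.min(axis=0)           # per-column minimum
col_max = X.max(axis=0)           # per-column maximum
X_scaled = a + (b - a) * (X - col_min) / (col_max - col_min)
print(X_scaled)
```

Each column's minimum maps to 2 and its maximum to 3, matching the MinMaxScaler output above.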
Added by fatalcure on Thu, 10 Feb 2022 00:06:05 +0200
Teacher Li Hang's "Statistical Learning Methods", second edition, Chapter 15, singular value decomposition: answers to the after-class exercises
1. Find the singular value decomposition of the matrix A = [1 2 ...]
Added by kid_drew on Wed, 09 Feb 2022 23:18:06 +0200
Keypoint detection project code is open source!
Author: Yan Yongqiang, algorithm engineer, Datawhale member. In this article, using a self-built gesture dataset, we use YOLOv5s for detection, then train SqueezeNet on an open-source dataset to predict the keypoints of the hand, and finally determine the specific gesture from the angles between the fingers and display it. The four ...
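The "angle algorithm" mentioned above presumably computes joint angles from keypoint coordinates; a common way to do this is the arccos of the dot product of two bone vectors. The keypoint names and coordinates below are hypothetical, not from the project's dataset:

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two keypoint (bone) vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))  # clip guards rounding

# Hypothetical keypoints in pixel coordinates: wrist, finger base, finger tip
wrist = np.array([0.0, 0.0])
base = np.array([0.0, 1.0])
tip = np.array([1.0, 2.0])

# Bend at the base joint: angle between bone base->wrist and bone base->tip
bend = angle_deg(wrist - base, tip - base)   # -> 135.0 degrees
```

A gesture classifier can then threshold such angles per finger (e.g. "extended" vs "curled") and match the pattern against known gestures.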
Added by devang23 on Wed, 09 Feb 2022 20:37:46 +0200
Debugging C++ programs with the OpenCV and Qt5 libraries in VSCode under Ubuntu
1. Readme
I have been using VSCode for more than three years, mostly writing Python programs and occasionally C++ programs, but they were not complex and did not involve third-party libraries. Even when third-party libraries were involved, I mostly wrote the CMakeLists file first and then compiled and ran with the standard cmake... ...
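For a project like the one described (C++ with OpenCV and Qt5 built by CMake), a minimal CMakeLists.txt might look like the sketch below; the project and target names are assumptions, and exporting compile commands helps VSCode's C/C++ extension resolve the third-party headers:

```cmake
cmake_minimum_required(VERSION 3.10)
project(demo)

set(CMAKE_CXX_STANDARD 17)
set(CMAKE_EXPORT_COMPILE_COMMANDS ON)   # lets VSCode's C++ extension find headers

find_package(OpenCV REQUIRED)
find_package(Qt5 REQUIRED COMPONENTS Widgets)

add_executable(demo main.cpp)
target_link_libraries(demo ${OpenCV_LIBS} Qt5::Widgets)
```

Building with `cmake -DCMAKE_BUILD_TYPE=Debug` then produces a binary that VSCode's gdb launch configuration can step through.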
Added by mlschutz on Wed, 09 Feb 2022 12:15:01 +0200
ML implemented from scratch / KNN / classification / reproduction prohibited
brief introduction
KNN(K Nearest Neighbors)
It can be used for both classification problems and regression problems. Classification and regression each come in a weighted and an unweighted variant.
An example
Introduction
The existing categories are red triangles and blue squares. What type should the newly entered green dot belong t ...
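The voting scheme the example describes, assigning the green dot to whichever class dominates its k nearest neighbors, can be sketched in a few lines of NumPy; the toy coordinates are assumptions chosen to mirror the red-triangle/blue-square picture:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to each sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest samples
    votes = Counter(y_train[nearest])
    return votes.most_common(1)[0][0]

# Toy data: red triangles near (0,0), blue squares near (3,3)
X_train = np.array([[0, 0], [0, 1], [1, 0], [3, 3], [3, 4], [4, 3]])
y_train = np.array(["red", "red", "red", "blue", "blue", "blue"])

print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))   # -> red
```

The weighted variant mentioned above would divide each vote by its distance instead of counting neighbors equally.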
Added by mkosmosports on Tue, 08 Feb 2022 19:29:37 +0200
Machine learning: RFormula for feature selection (RFormula in Spark MLlib)
catalogue
0. Links to related articles
1. General
2. Spark code
0. Links to related articles
Algorithm article summary
1. General
RFormula feature selection in Spark 2.1.0 supports only a subset of R formula operators, including: ~, '.', ':', '+', and '-'.
~ separates target and terms (splits labels from features)
+ con ...
Added by xangelo on Tue, 08 Feb 2022 18:14:21 +0200
Transformer source code explanation (PyTorch) that ordinary people can understand
PyTorch chapter of Transformer source code interpretation
Sections
Word embedding / Positional encoding / Multi-head attention / Building the Transformer
Word embedding
Transformer is essentially an Encoder. Taking the translation task as an example, the original dataset consists of line pairs in two languages. In application, ...
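Before the Encoder sees a sentence, each token id is mapped to an embedding vector and a positional encoding is added. A NumPy sketch of the standard sinusoidal scheme from "Attention Is All You Need" (the vocabulary size and model width below are illustrative, not the article's values):

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = np.arange(max_len)[:, None]                     # (max_len, 1)
    i = np.arange(d_model)[None, :]                       # (1, d_model)
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])
    pe[:, 1::2] = np.cos(angle[:, 1::2])
    return pe

# Toy embedding table: vocabulary of 10 tokens, model width 8
rng = np.random.default_rng(0)
embedding = rng.normal(size=(10, 8))

tokens = np.array([3, 1, 4])                              # a 3-token "sentence"
x = embedding[tokens] + positional_encoding(3, 8)         # what the Encoder consumes
```

In PyTorch the lookup is `nn.Embedding`, and the positional encoding is typically precomputed once and added in the forward pass.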
Added by ChetUbetcha on Tue, 08 Feb 2022 17:36:24 +0200