Visualizing models, data, and training using TensorBoard

Abstract: To understand what was happening during model training, we so far printed out some statistics to check whether training was making progress. However, we can do better: PyTorch integrates with TensorBoard, a tool for visualizing the results of neural network training. This tutorial explains some of its features using the fa ...

Added by p0pb0b on Fri, 17 Sep 2021 06:26:32 +0300
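The tutorial teased above relies on PyTorch's `torch.utils.tensorboard` integration. As a minimal sketch of what such logging looks like, the snippet below records a scalar training curve; the `runs/demo` log directory, the tag name, and the stand-in loss values are all illustrative, not taken from the tutorial:

```python
# Minimal TensorBoard logging sketch (log directory, tag, and loss are illustrative)
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/demo")      # event files land under runs/demo
for step in range(100):
    loss = 1.0 / (step + 1)              # stand-in for a real training loss
    writer.add_scalar("train/loss", loss, step)
writer.close()
# View the curves with: tensorboard --logdir=runs
```

The same `SummaryWriter` also exposes `add_image`, `add_histogram`, and `add_graph` for visualizing data and model structure.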

[Optimization prediction] BP neural network prediction optimized by the firefly algorithm [Matlab 1313]

1. Introduction to the firefly optimization algorithm (FA): There are many species of fireflies, mainly distributed in the tropics. Most fireflies produce rhythmic flashes for a short time. The flash comes from a bioluminescent chemical reaction, and the flash pattern varies from species to species. The firefly algorithm (F ...

Added by abushahin on Thu, 16 Sep 2021 03:11:41 +0300
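The entry above describes the firefly algorithm: each firefly moves toward brighter (better) ones, with an attractiveness that decays with squared distance, plus a small random walk. A minimal Python sketch of that update rule, minimizing the sphere function rather than training a BP network; population size, `beta0`, `gamma`, `alpha`, and the annealing factor are illustrative defaults, not values from the article:

```python
# Firefly algorithm sketch: minimize f(x) = sum(x_i^2) (all parameters illustrative)
import math, random

def firefly_minimize(f, dim=2, n=15, iters=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        light = [f(x) for x in pop]            # lower loss = brighter firefly
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:        # move i toward the brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    light[i] = f(pop[i])
        alpha *= 0.97                          # anneal the random-walk step
    best = min(pop, key=f)
    return best, f(best)

best, val = firefly_minimize(lambda x: sum(v * v for v in x))
```

In the BP-network variant described above, each firefly's position would instead encode the network's initial weights and biases, and `f` would be the training error.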

Second assignment: multi-layer perceptron

1. Linear neural networks. (1) Linear regression: 1. Linear model: the linear model can be regarded as a single-layer neural network. 2. Loss function: the loss function quantifies the difference between the actual value and the predicted value of the target. 3. Analytical solution. 4. Optimization method: mini-batch grad ...

Added by cyberlew15 on Mon, 13 Sep 2021 04:38:22 +0300
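The outline above moves from the linear model and squared loss to mini-batch gradient descent. A self-contained sketch of those steps on synthetic noiseless data y = 2x + 1; the learning rate, batch size, and epoch count are illustrative choices, not the assignment's:

```python
# Mini-batch gradient descent for linear regression (hyperparameters illustrative)
import random

rng = random.Random(42)
data = [(x, 2.0 * x + 1.0) for x in [i / 10 for i in range(-50, 50)]]

w, b, lr, batch = 0.0, 0.0, 0.05, 10
for epoch in range(200):
    rng.shuffle(data)                           # fresh mini-batches each epoch
    for k in range(0, len(data), batch):
        xs, ys = zip(*data[k:k + batch])
        # gradients of the squared loss, averaged over the mini-batch
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad_w
        b -= lr * grad_b
```

Because the data are noiseless, `w` and `b` converge to the true slope 2 and intercept 1; with the analytical (least-squares) solution mentioned in the outline, the same values come out in closed form.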

NIPS'15 - STN: Spatial Transformer Network, a spatial transformation module for neural networks (including code reproduction)

Original address: original text. Paper reading method: the three-pass method. First pass: the CNN approach has been brilliant in the field of computer vision and has replaced traditional methods in many areas. However, the convolutional neural network architecture lacks spatial invariance; even though convolution and max pooling opera ...

Added by hacksurfin on Sun, 12 Sep 2021 23:09:02 +0300
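The STN described above inserts a differentiable spatial transform: a small localization network predicts a 2x3 affine matrix, which `F.affine_grid` turns into a sampling grid that `F.grid_sample` applies to the feature map. A minimal PyTorch sketch under assumed dimensions (28x28 single-channel input, layer sizes illustrative); the last layer is initialized to the identity transform, as the paper recommends:

```python
# Minimal Spatial Transformer sketch (input size and layer widths illustrative)
import torch
import torch.nn as nn
import torch.nn.functional as F

class STN(nn.Module):
    def __init__(self):
        super().__init__()
        # localization network: predicts one 2x3 affine matrix per image
        self.loc = nn.Sequential(
            nn.Flatten(), nn.Linear(28 * 28, 32), nn.ReLU(), nn.Linear(32, 6)
        )
        # start at the identity transform so training begins with an unchanged image
        self.loc[-1].weight.data.zero_()
        self.loc[-1].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x):                       # x: (N, 1, 28, 28)
        theta = self.loc(x).view(-1, 2, 3)
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

x = torch.randn(4, 1, 28, 28)
y = STN()(x)                                    # identity init: y matches x
```

Because the whole sampling pipeline is differentiable, the localization network learns to undo rotations, translations, and scalings from the task loss alone, with no extra supervision.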

NLP star sky intelligent dialogue robot series for natural language processing: an in-depth understanding of the Transformer's multi-head attention architecture - 1

NLP star sky intelligent dialogue robot series for natural language processing: an in-depth understanding of the Transformer's multi-head attention architecture. This paper starts with the architecture of the Transformer's multi-head attention layer, and then uses a Python coding example to help readers understand the mu ...

Added by R_P on Sat, 11 Sep 2021 04:18:23 +0300
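The article above walks through the Transformer's multi-head attention layer. As a from-scratch PyTorch sketch of the mechanics it describes (not the article's own code; `d_model=16` and 4 heads are illustrative), the inputs are projected to queries, keys, and values, split into heads, run through scaled dot-product attention per head, then concatenated and projected back:

```python
# Multi-head self-attention from scratch (d_model and head count illustrative)
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model=16, heads=4):
        super().__init__()
        assert d_model % heads == 0
        self.h, self.dk = heads, d_model // heads
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):                        # x: (batch, seq, d_model)
        b, t, _ = x.shape
        # project, split d_model into (heads, dk), and move heads to dim 1
        q = self.q(x).view(b, t, self.h, self.dk).transpose(1, 2)
        k = self.k(x).view(b, t, self.h, self.dk).transpose(1, 2)
        v = self.v(x).view(b, t, self.h, self.dk).transpose(1, 2)
        # scaled dot-product attention, computed independently per head
        scores = q @ k.transpose(-2, -1) / self.dk ** 0.5   # (b, h, t, t)
        attn = F.softmax(scores, dim=-1)
        # concatenate heads and apply the output projection
        ctx = (attn @ v).transpose(1, 2).reshape(b, t, self.h * self.dk)
        return self.out(ctx)

y = MultiHeadAttention()(torch.randn(2, 5, 16))
```

Splitting `d_model` across heads keeps the total cost close to single-head attention while letting each head attend to a different representation subspace.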