Introduction to BertTiny and its deployment with OpenVINO
1. Introduction to BERT
BERT is a language model widely used in the NLP field, built on the Transformer architecture (specifically its encoder). The purpose of this blog is to help you get started with BERT quickly.
1.1 What problems can BERT solve?
1. Understanding semantics in combination with context, for example extracting all the referential relationshi ...
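As a rough illustration of the deployment half of the title, the sketch below exports a tiny BERT checkpoint to ONNX and runs it with the OpenVINO Runtime. The prajjwal1/bert-tiny checkpoint and the export settings are assumptions made for illustration; the article's actual model and conversion flow may differ.

```python
# Minimal sketch (assumptions: the "prajjwal1/bert-tiny" checkpoint and an
# ONNX export step; the article's exact model and conversion flow may differ).
import torch
from transformers import AutoTokenizer, AutoModel
from openvino.runtime import Core

tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-tiny")
model = AutoModel.from_pretrained("prajjwal1/bert-tiny", return_dict=False)
model.eval()

# 1) Export the PyTorch model to ONNX with dynamic batch/sequence axes.
inputs = tokenizer("BERT is a language model.", return_tensors="pt")
torch.onnx.export(
    model,
    (inputs["input_ids"], inputs["attention_mask"]),
    "bert_tiny.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["last_hidden_state"],
    dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                  "attention_mask": {0: "batch", 1: "seq"}},
    opset_version=13,
)

# 2) Load the ONNX file with the OpenVINO Runtime and run inference on CPU.
core = Core()
compiled = core.compile_model(core.read_model("bert_tiny.onnx"), "CPU")
result = compiled([inputs["input_ids"].numpy(), inputs["attention_mask"].numpy()])
print(list(result.values())[0].shape)  # (1, seq_len, hidden_size)
```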
Added by oscar2 on Wed, 09 Mar 2022 11:15:44 +0200
vision_transformer hands-on summary: a very simple introduction to ViT. Don't miss it
Abstract
This example uses a subset of the plant seedling dataset, which contains 12 categories, and demonstrates how to perform the classification task with the PyTorch version of the ViT image classification model.
Through this article, you can learn:
1. How to build a ViT model?
2. How to generate the dataset? ...
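As a rough sketch of such a training setup, assuming the timm library and an ImageFolder-style directory layout (both assumptions; the original article may construct the ViT model and dataset differently):

```python
# Minimal sketch (assumptions: the timm library and an ImageFolder-style
# directory layout under data/train; the article may build ViT from scratch).
import timm
import torch
from torch import nn, optim
from torchvision import datasets, transforms

# ViT expects fixed-size inputs; 224x224 matches vit_base_patch16_224.
tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5] * 3, std=[0.5] * 3),
])
train_set = datasets.ImageFolder("data/train", transform=tf)  # 12 class folders
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

# Pretrained ViT with its classification head resized to 12 classes.
model = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=12)
criterion = nn.CrossEntropyLoss()
optimizer = optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```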
Added by iffy on Mon, 21 Feb 2022 10:10:23 +0200
A small record of neural network learning 67 -- a detailed explanation of reproducing the Vision Transformer (ViT) model in PyTorch
Preface
Vision Transformers have been very popular recently, so I'll start by learning ViT.
What is the Vision Transformer (ViT)?
The Vision Transformer is the vision counterpart of the Transformer. The Transformer has essentially become the standard architecture in natural language processing, but its application in vision is still limited.
Vision Transformer breaks ...
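One detail that comes up in most PyTorch reproductions of ViT is the patch embedding, usually written as a strided convolution. A minimal sketch (not the blogger's exact code):

```python
# Minimal patch-embedding sketch, as commonly used when reproducing ViT in
# PyTorch (the article's exact implementation is not shown in this excerpt).
import torch
from torch import nn

class PatchEmbed(nn.Module):
    """Split an image into non-overlapping patches and project each to a vector."""
    def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):
        super().__init__()
        self.num_patches = (img_size // patch_size) ** 2
        # A conv with kernel = stride = patch_size is equivalent to
        # "cut into patches, flatten, apply a shared linear projection".
        self.proj = nn.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size)

    def forward(self, x):                      # x: (B, 3, 224, 224)
        x = self.proj(x)                       # (B, 768, 14, 14)
        return x.flatten(2).transpose(1, 2)    # (B, 196, 768) token sequence

tokens = PatchEmbed()(torch.randn(1, 3, 224, 224))
print(tokens.shape)  # torch.Size([1, 196, 768])
```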
Added by asukla on Mon, 24 Jan 2022 19:56:06 +0200
Transformer backbone network -- TNT: a step-by-step analysis
Preface
Paper: arXiv; Code: GitHub; Accepted by: NeurIPS 2021
Series articles
Transformer backbone network -- PVT_V1: a step-by-step analysis; Transformer backbone network -- PVT_V2: a step-by-step analysis; Transformer backbone network -- T2T-ViT: a step-by-step analysis; Transformer backbone network -- TNT: a step-by-step analysis. Continuously updated!
Motivation
The au ...
Added by 14zero on Fri, 21 Jan 2022 08:28:45 +0200
Attached code: Swin Transformer
Swin Transformer: interpretation of the paper "Hierarchical Vision Transformer using Shifted Windows"
Reference link: https://blog.csdn.net/qq_37541097/article/details/121119988?spm=1001.2014.3001.5501 Code link: https://github.com/microsoft/Swin-Transformer Paper link: https://arxiv.org/pdf/2103.14030.pdf
Summary:
This paper presents a ne ...
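The "shifted windows" of the title are commonly implemented as a cyclic shift of the feature map before window partition. A simplified sketch of that step (the attention mask for wrapped-around regions is omitted, and this is not the repository's exact code):

```python
# Minimal sketch of Swin's shifted-window step: cyclically shift the feature
# map, partition it into windows, then reverse the shift afterwards.
# (Simplified; the attention mask for wrapped-around regions is omitted.)
import torch

def window_partition(x, window_size):
    """(B, H, W, C) -> (num_windows*B, window_size, window_size, C)"""
    B, H, W, C = x.shape
    x = x.view(B, H // window_size, window_size, W // window_size, window_size, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, window_size, window_size, C)

x = torch.randn(1, 56, 56, 96)        # feature map of an early Swin-T stage
window_size, shift = 7, 3             # shift = window_size // 2
shifted = torch.roll(x, shifts=(-shift, -shift), dims=(1, 2))
windows = window_partition(shifted, window_size)
print(windows.shape)                  # torch.Size([64, 7, 7, 96])
# ...window attention runs on `windows`, then torch.roll(+shift) undoes the shift
```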
Added by m0rpheu5 on Mon, 03 Jan 2022 09:21:02 +0200
Research on automatic code comment generation technology based on Transformer
Project introduction: research on automatic code comment generation technology based on Transformer
Introduction: this is my graduation project, which implements automatic generation of code comments. In short, given a piece of code, it generates the corresponding comment (a functional description).
The data set we u ...
Added by Blicka on Sat, 25 Dec 2021 22:37:02 +0200
The most complete Vision Transformer (ViT) paper interpretation and code reproduction (based on the Paddle framework)
Preface
The pioneering contribution of the ViT model is its use of a pure Transformer structure, as reflected in the paper's title: AN IMAGE IS WORTH 16X16 WORDS. It embeds image patches into a sequence of tokens and, through multiple encoder blocks and a classification head, achieves results comparable to SOTA CNN models.
Image classification t ...
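The "16x16 words" of the title translate directly into the sequence length fed to the encoder; a short worked example for a 224x224 RGB image (plain Python, independent of the Paddle code in the article):

```python
# How a 224x224 RGB image becomes the "16x16 words" of the paper title.
img_size, patch_size, channels = 224, 16, 3

num_patches = (img_size // patch_size) ** 2      # 14 * 14 = 196 tokens
patch_dim = patch_size * patch_size * channels   # 16 * 16 * 3 = 768 values per patch

print(num_patches, patch_dim)   # 196 768
# Each flattened patch is linearly projected to the embedding dimension,
# a learnable [class] token is prepended (sequence length 197), and the
# sequence passes through stacked Transformer encoders before the head.
```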
Added by T2theC on Sat, 18 Dec 2021 09:54:15 +0200
Chinese-English translation based on Transformer
Machine translation based on Transformer
Machine translation is the process of using computers to convert one natural language (source language) into another natural language (target language).
This project is the PaddlePaddle implementation of the Transformer, the mainstream model for machine translation, covering model training, prediction, and use o ...
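For a framework-neutral picture of the encoder-decoder setup, here is a minimal PyTorch sketch with made-up vocabulary sizes; the project itself is implemented in PaddlePaddle, so this is only an illustration:

```python
# Illustrative sketch only: the project above uses PaddlePaddle, while this
# minimal encoder-decoder setup is written in PyTorch with made-up sizes.
# Positional encodings are omitted for brevity.
import torch
from torch import nn

src_vocab, tgt_vocab, d_model = 32000, 32000, 512
src_embed = nn.Embedding(src_vocab, d_model)
tgt_embed = nn.Embedding(tgt_vocab, d_model)
transformer = nn.Transformer(d_model=d_model, nhead=8,
                             num_encoder_layers=6, num_decoder_layers=6,
                             batch_first=True)
generator = nn.Linear(d_model, tgt_vocab)   # projects decoder states to logits

src = torch.randint(0, src_vocab, (2, 20))  # (batch, source length) source ids
tgt = torch.randint(0, tgt_vocab, (2, 18))  # (batch, target length) target ids

# Causal mask so each target position only attends to earlier positions.
tgt_mask = transformer.generate_square_subsequent_mask(tgt.size(1))
out = transformer(src_embed(src), tgt_embed(tgt), tgt_mask=tgt_mask)
logits = generator(out)                     # (2, 18, tgt_vocab)
print(logits.shape)
```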
Added by vent on Sun, 05 Dec 2021 02:21:57 +0200
Transformer hardware implementation, part 3: a supplement on PyTorch basics
This article supplements the PyTorch knowledge needed before training a Transformer. Thanks to the blogger Mofan (莫烦) for his Python video course https://www.youtube.com/watch?v=lAaCeiqE6CE&feature=emb_title ; his home page is Mofan Python (莫烦Python).
It is recommended to watch the blogger's video tutorials directly to complete the knowledge suppleme ...
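For readers who want a quick taste of the basics being supplemented, here is a generic sketch of tensors, autograd, and a single optimization step (not the blogger's material):

```python
# A few of the PyTorch basics referred to above (tensors, autograd, a tiny
# optimization step); this is a generic sketch, not the blogger's material.
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)  # leaf tensor tracked by autograd
y = (x ** 2).sum()                                # y = x0^2 + x1^2
y.backward()                                      # populate x.grad = dy/dx = 2x
print(x.grad)                                     # tensor([4., 6.])

w = torch.nn.Parameter(torch.randn(3, 3))
opt = torch.optim.SGD([w], lr=0.1)
loss = (w @ torch.ones(3)).pow(2).sum()           # arbitrary scalar loss
opt.zero_grad()
loss.backward()
opt.step()                                        # one gradient-descent update
```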
Added by unidox on Tue, 12 Oct 2021 21:42:29 +0300
NLP Star Sky intelligent dialogue robot series for natural language processing: an in-depth understanding of the Transformer multi-head attention architecture, part 1
This article starts with the architecture of the Transformer's multi-head attention layer, and then uses a Python coding example to help readers understand the mu ...
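As a generic reference for that architecture (not the article's exact code), here is a compact multi-head attention module: split d_model into heads, apply scaled dot-product attention per head, then concatenate the heads and project back.

```python
# Minimal multi-head attention sketch (generic, not the article's exact code).
import torch
from torch import nn

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model=512, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0
        self.h, self.d_k = num_heads, d_model // num_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, q, k, v):
        B, L, _ = q.shape
        # (B, L, d_model) -> (B, heads, L, d_k)
        split = lambda t: t.view(B, -1, self.h, self.d_k).transpose(1, 2)
        q, k, v = split(self.q_proj(q)), split(self.k_proj(k)), split(self.v_proj(v))
        scores = q @ k.transpose(-2, -1) / self.d_k ** 0.5   # (B, heads, L, L)
        ctx = scores.softmax(dim=-1) @ v                     # (B, heads, L, d_k)
        ctx = ctx.transpose(1, 2).reshape(B, L, self.h * self.d_k)
        return self.out_proj(ctx)                            # (B, L, d_model)

x = torch.randn(2, 10, 512)                 # (batch, sequence, d_model)
print(MultiHeadAttention()(x, x, x).shape)  # torch.Size([2, 10, 512])
```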
Added by R_P on Sat, 11 Sep 2021 04:18:23 +0300