Complete walkthrough of a big data Flink e-commerce real-time data warehouse project (final chapter): a simple application of Flink SQL

Flink SQL introduction: In actual development I personally focus on the Stream API and do not write much Flink SQL, so here I mainly follow the original project code and am not very familiar with every detail; I suggest learning Flink SQL on its own. To put it simply, first obtain the required field data from the corresponding ...
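
A minimal sketch (not the project's actual code) of that idea with the Table API: register a source table, then select only the required fields with a Flink SQL query. The table name order_info, its columns, and the datagen connector settings are illustrative assumptions.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class FlinkSqlQuickStart {
    public static void main(String[] args) {
        // Pure Table/SQL job: no DataStream code is needed for this sketch.
        TableEnvironment tableEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Register a demo source table; in a real warehouse project this would be a Kafka-backed table.
        tableEnv.executeSql(
                "CREATE TABLE order_info (" +
                "  id BIGINT," +
                "  province_id INT," +
                "  total_amount DOUBLE" +
                ") WITH (" +
                "  'connector' = 'datagen'," +
                "  'rows-per-second' = '1'" +
                ")");

        // Obtain only the required fields from the source table with Flink SQL.
        tableEnv.executeSql("SELECT id, total_amount FROM order_info").print();
    }
}
```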

Added by dumdumsareyum on Sat, 25 Dec 2021 11:55:52 +0200

Flink: getting started and runtime architecture

Flink (I): introduction to Flink installation and runtime architecture. What is Flink? Apache Flink is an in-memory, stream-based computing engine born in December 2014. Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink is designed to run in all common cluster environments ...

Added by hunna03 on Thu, 23 Dec 2021 12:22:09 +0200

Flink reads Kafka data and sinks it to ClickHouse

Flink reads Kafka data and sinks it to ClickHouse. In real-time streaming data processing, we can usually do real-time OLAP with Flink + ClickHouse; the respective advantages of the two will not be repeated here. This article uses a small case to briefly introduce the overall process. Overall process: import JSON-format data into a specific Kafka topic; write ...
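
A rough sketch of that Kafka-to-ClickHouse flow, assuming a local Kafka broker, a hypothetical topic user_behavior, and the Flink JDBC connector plus a ClickHouse JDBC driver on the classpath; a real job would parse the JSON payload into columns rather than insert the raw string.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaToClickHouse {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "clickhouse-demo");

        // 1. Read JSON strings from the (hypothetical) Kafka topic.
        DataStream<String> source =
                env.addSource(new FlinkKafkaConsumer<>("user_behavior", new SimpleStringSchema(), props));

        // 2. Sink each record into ClickHouse through the JDBC connector.
        //    Table name, URL and driver class are assumptions for this sketch.
        source.addSink(JdbcSink.<String>sink(
                "INSERT INTO raw_events (payload) VALUES (?)",
                (statement, payload) -> statement.setString(1, payload),
                new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                        .withUrl("jdbc:clickhouse://localhost:8123/default")
                        .withDriverName("ru.yandex.clickhouse.ClickHouseDriver")
                        .build()));

        env.execute("kafka-to-clickhouse");
    }
}
```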

Added by dallasx on Thu, 23 Dec 2021 03:37:36 +0200

[Hard big data] Summary of Flink's enterprise applications in real-time computing platforms and real-time data warehouses

Welcome to my blog home page: https://blog.csdn.net/u013411339. You are welcome to like, bookmark, leave comments, and exchange ideas! This article was originally written by [Wang Zhiwu] and first published on the CSDN blog! Reprinting without permission from the platform and the author is strictly prohibited! This article ...

Added by Adam_28 on Sun, 19 Dec 2021 13:25:10 +0200

Build an SQL Client streaming application on Flink 1.11 compiled locally from source

Preface: I wanted to try running Flink SQL. Last year I saw Jark Wu share this topic, but I never actually worked through it; recently I became interested again. Original demo shared by Jark Wu: building streaming applications based on Flink SQL (see the flink-sql-demo repo on JarkWu's GitHub). That demo is built on Flink 1.10, but the rel ...

Added by Smudly on Sat, 18 Dec 2021 13:25:24 +0200

[Flink] Source code notes - analysis and operating principles of Flink SQL

1. Introduction: SqlClient is an SQL command-line interaction tool provided by Flink. After downloading the Flink binary package, there is a sql-client.sh script in its bin directory, and you can enter the interactive session by starting that script. The specific source code implementation of SqlClient can be found under the flink-sql-client submodule of the Fl ...

Added by refined on Tue, 14 Dec 2021 09:46:13 +0200

Detailed explanation of watermarks in Flink

Watermark introduction: In Flink, the watermark is a mechanism proposed by Apache Flink to handle event-time window computation. It is essentially a timestamp, used to deal with out-of-order data in real-time streams, and it is usually realized through the combination of watermarks and windows. From the generation of real-time stream events on the device, to F ...
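
A minimal sketch of the watermark-plus-window combination described above, assuming (sensorId, eventTimeMillis, reading) tuples and a 5-second out-of-orderness bound; the tuple layout and values are illustrative, not from the article.

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class WatermarkDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // (sensorId, eventTimeMillis, reading); the third element arrives out of order.
        DataStream<Tuple3<String, Long, Integer>> events = env.fromElements(
                Tuple3.of("sensor-1", 1_000L, 5),
                Tuple3.of("sensor-1", 9_000L, 3),
                Tuple3.of("sensor-1", 4_000L, 7));

        events
            // Watermark = max event time seen so far minus 5 s: how long Flink waits for late data.
            .assignTimestampsAndWatermarks(
                    WatermarkStrategy.<Tuple3<String, Long, Integer>>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                            .withTimestampAssigner((event, recordTs) -> event.f1))
            .keyBy(event -> event.f0)
            // The event-time window fires once the watermark passes its end timestamp.
            .window(TumblingEventTimeWindows.of(Time.seconds(10)))
            .sum(2)
            .print();

        env.execute("watermark-demo");
    }
}
```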

Added by watthehell on Sat, 11 Dec 2021 06:24:10 +0200

FlinkCEP Pattern API introduction and getting-started example

The code in this article runs normally on Flink 1.13.3. 1. Introduction to the Pattern API: With the FlinkCEP Pattern API, you can define a complex pattern sequence to be extracted from the input stream. Each complex pattern sequence consists of multiple simple patterns. Single pattern: refers to a simple pattern, such as a pattern ...
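
A minimal FlinkCEP sketch in the spirit of that introduction: two simple patterns ("start" followed by "end") combined into a pattern sequence. The String events and their values are illustrative assumptions, not the article's actual example.

```java
import java.util.List;
import java.util.Map;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CepQuickStart {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> input = env.fromElements("login", "click", "logout");

        // Simple pattern "start" followed (relaxed contiguity) by simple pattern "end".
        Pattern<String, String> pattern = Pattern.<String>begin("start")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.equals("login");
                    }
                })
                .followedBy("end")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.equals("logout");
                    }
                });

        CEP.pattern(input, pattern)
           // Use processing time so this demo needs no timestamps or watermarks.
           .inProcessingTime()
           .select(new PatternSelectFunction<String, String>() {
               @Override
               public String select(Map<String, List<String>> match) {
                   return match.get("start").get(0) + " -> " + match.get("end").get(0);
               }
           })
           .print();

        env.execute("cep-quick-start");
    }
}
```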

Added by mikes127 on Wed, 08 Dec 2021 21:23:19 +0200

Flink -- transform (keyed stream transformation operator)

Keyed stream transformation operator keyBy. If you want to aggregate, you must group first, so keyBy is very important. The keyBy operator is special: it is not a processing step and not a real operator in itself; it defines the data transmission relationship between two tasks. keyBy groups records based on the defined keys, and a repartition is performed based on ...
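
A small sketch of keyBy followed by a rolling aggregation, assuming hypothetical (word, count) tuples: keyBy itself does no processing, it only repartitions records by key so that sum() can aggregate per key downstream.

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Tuple2<String, Integer>> words = env.fromElements(
                Tuple2.of("flink", 1),
                Tuple2.of("kafka", 1),
                Tuple2.of("flink", 1));

        words
            // keyBy defines how data is redistributed between the two tasks:
            // records are hash-partitioned by key, no computation happens here.
            .keyBy(value -> value.f0)
            // Rolling sum per key: flink -> 1, kafka -> 1, flink -> 2.
            .sum(1)
            .print();

        env.execute("keyby-demo");
    }
}
```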

Added by 11Tami on Tue, 07 Dec 2021 09:19:32 +0200

Flink practice tutorial - getting started: JAR job development

Author: Tencent Cloud Stream Computing Oceanus team. Introduction to Stream Computing Oceanus: Stream Computing Oceanus is a powerful tool for real-time analysis in the big data product ecosystem. It is an enterprise-level real-time big data analysis platform based on Apache Flink, featuring one-stop development, seamless connection, s ...

Added by josborne on Sun, 05 Dec 2021 02:02:48 +0200