Complete collection of 23 Scala basics for the big data series (the most complete in history, recommended for bookmarking)
Official account: Data and Intelligence, Big Data Club
This series of big data articles is organized around three parts: technical ability, business foundation, and analytical thinking. You will gain:
❖ greater self-confidence, so you can handle interviews calmly and land an internship or offer;
❖ mastery of the basic knowledge of big d ...
Added by Wolphie on Wed, 02 Feb 2022 03:48:28 +0200
Learn Scala_day01_ Chapter 6 notes
Scala process control
Branch control if else
The if else expression in Scala actually has a return value; the value is that of the last line of the branch whose condition is satisfied.
import scala.io.StdIn

object TestIfElse {
  def main(args: Array[String]): Unit = {
    println("input age")
    var age = StdIn.readInt()
    val res: String = if (age < ...
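For reference, here is a minimal, self-contained sketch of if/else used as an expression; the snippet above is truncated, so the threshold and strings here are illustrative, not taken from the article.

import scala.io.StdIn

object TestIfElseSketch {
  def main(args: Array[String]): Unit = {
    println("input age")
    val age = StdIn.readInt()
    // the whole if/else is an expression; its value is the last line of the chosen branch
    val res: String = if (age < 18) "minor" else "adult"
    println(res)
  }
}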
Added by tegwin15 on Thu, 27 Jan 2022 23:30:11 +0200
Scala - JSON parsing optimization
I. Introduction
In a work scenario, using com.alibaba.fastjson ran into some time-consuming cases. Here are those cases and some simple optimization approaches.
II. Storage form of the JSON information
The usage scenario is the most basic key/value String combination. It is loaded in the driver part of the Spark program. The e ...
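As a rough illustration of the key/value parsing described above, here is a minimal sketch using fastjson's JSON.parseObject; the JSON keys and values are made up for the example and are not from the original article.

import com.alibaba.fastjson.JSON

object FastjsonSketch {
  def main(args: Array[String]): Unit = {
    val raw = """{"id":"1001","name":"scala"}"""   // a flat key/value String, as in the described scenario
    val obj = JSON.parseObject(raw)                // parse once and reuse the JSONObject
    println(obj.getString("id"))
    println(obj.getString("name"))
  }
}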
Added by gimzo on Thu, 27 Jan 2022 02:34:03 +0200
Scala series 14: Scala abstract classes and traits
17 abstract class and trait
Abstract class:
Abstract classes are similar to Java's, except that Java has no abstract fields, while Scala allows abstract fields;
Trait:
A trait can be understood as an upgraded version of a Java interface.
In Java, interfaces cannot declare fields without values or methods with implem ...
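A minimal sketch of the difference described above; the class and trait names are illustrative only.

abstract class Animal {
  val name: String                       // abstract field: allowed in Scala, unlike Java
  def speak(): Unit                      // abstract method
}

trait CanRun {
  def run(): Unit = println("running")   // a trait may carry implemented methods
}

class Dog extends Animal with CanRun {
  val name: String = "dog"
  def speak(): Unit = println(s"$name barks")
}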
Added by smilepak on Fri, 21 Jan 2022 02:39:05 +0200
Scala functions | a detailed explanation of the simplification rules for anonymous functions
Scala functions follow a principle of conciseness: whatever can be omitted is omitted. This also leaves programmers who are new to Scala quite confused when reading code, wondering what all those underscores mean. Therefore, the simplification rules for these functio ...
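To make the idea concrete, here is a small sketch (not from the article) showing the same anonymous function written in progressively shorter forms:

// full form: parameter types and braces written out
val add1: (Int, Int) => Int = (x: Int, y: Int) => { x + y }
// parameter types can be inferred from the declared function type
val add2: (Int, Int) => Int = (x, y) => x + y
// if each parameter is used exactly once and in order, underscores can stand in for them
val add3: (Int, Int) => Int = _ + _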
Added by csckid on Mon, 17 Jan 2022 10:04:21 +0200
Experiment 2 - elementary practice in Scala programming - topics
1. Calculating a series
Write a script that computes and outputs the sum Sn of the first n terms of the following series, stopping as soon as Sn is greater than or equal to q, where q is greater than 0 and its value is read from the keyboard.
For example, if the value of q is 50.0, the output should be: Sn=50.416695 ...
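Since the series itself is cut off in the excerpt, here is only a skeleton of the loop the exercise asks for; term(n) is a hypothetical placeholder and does not reproduce the Sn=50.416695 result quoted above.

import scala.io.StdIn

object SeriesSumSketch {
  def main(args: Array[String]): Unit = {
    val q = StdIn.readDouble()
    def term(n: Int): Double = 1.0 / n   // placeholder: substitute the exercise's actual series term
    var sn = 0.0
    var n = 0
    while (sn < q) {        // stop as soon as Sn is greater than or equal to q
      n += 1
      sn += term(n)
    }
    printf("Sn=%f\n", sn)
  }
}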
Added by techrat on Mon, 17 Jan 2022 05:39:21 +0200
Scala process control
Branch control if else
Branch control lets the program execute code selectively. There are three kinds: single branch, double branch, and multi-branch.
1. Single branch
Basic syntax
if (conditional expression) {
  code to execute
}
Description: when the conditional expression is true, the code inside {} is executed (a short sketch follows this excerpt).
2. Double bra ...
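As referenced above, a tiny sketch of single- and double-branch control (the values are illustrative, not from the article):

val age = 20
// single branch: the block runs only when the condition is true
if (age >= 18) {
  println("adult")
}
// double branch: if/else chooses between two branches and yields a value
val label = if (age >= 18) "adult" else "minor"
println(label)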
Added by tomharding on Wed, 12 Jan 2022 09:43:34 +0200
Scala pattern matching
Pattern matching
Pattern matching in Scala is similar to the switch syntax in Java, but Scala adds more capabilities to the syntax, making it more powerful.
Basic syntax
In pattern matching syntax, the match keyword declares the match, and each branch is declared with the case keyword. When a match is needed, it starts from the f ...
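A minimal sketch of the match/case syntax described above (values are illustrative):

val x = 3
val result = x match {
  case 1 => "one"
  case 2 => "two"
  case _ => "other"   // the wildcard branch plays the role of Java's default
}
println(result)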
Added by superman on Mon, 10 Jan 2022 23:30:01 +0200
[Spark] action operator of RDD
The so-called action operators are the methods that trigger job execution.
reduce
Function signature: def reduce(f: (T, T) => T): T
Function description: aggregates all the elements in the RDD, first aggregating data within partitions and then aggregating data across partitions.
@Test
def reduce(): Unit = {
  val rdd = sc.makeRDD(List(1,2,3,4)) ...
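Since the snippet above is truncated, here is a self-contained sketch of reduce written as a plain main program rather than a test method; the local-mode SparkContext setup is an assumption made for the example.

import org.apache.spark.{SparkConf, SparkContext}

object ReduceSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("reduce")
    val sc = new SparkContext(conf)
    val rdd = sc.makeRDD(List(1, 2, 3, 4))
    // aggregates within each partition first, then across partitions
    val sum = rdd.reduce(_ + _)
    println(sum)   // 10
    sc.stop()
  }
}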
Added by nascarjunky on Wed, 05 Jan 2022 02:37:28 +0200
Spark introduction, deployment, principles, and development environment setup
Introduction to Spark
Spark is a fast, general-purpose, and scalable in-memory engine for big data analysis and computation.
It is a general-purpose in-memory parallel computing framework developed by the AMP Lab (Algorithms, Machines, and People Lab) at the U ...
Added by benreisner on Mon, 03 Jan 2022 22:14:19 +0200