I used to think interview question digests were just rote material, but I later found that memorizing them really helped my understanding of the underlying knowledge. No wonder Chinese classes always require recitation.
preface
These interview notes are divided into the following parts. I hope they will be helpful to your work.
Content | Link |
---|---|
Java foundation | |
Java collections | |
Java multithreading | |
Java virtual machine | |
Computer networks | |
Data structures and algorithms | |
Databases | |
JavaWeb | |
Design patterns | |
Spring, MyBatis | |
1 data structure
1.1 tree
1 briefly describe the binary search tree
A binary search tree is defined as follows:
- The values of all nodes in the left subtree are less than or equal to the value of the root node;
- The values of all nodes in the right subtree are greater than or equal to the value of the root node;
- The left and right subtrees of any node are binary search trees;
- It can be an empty tree.
One consequence is that an in-order traversal of a binary search tree produces an ordered sequence. Also, looking up a value in a binary search tree uses the idea of binary search: the maximum number of comparisons needed for a lookup equals the height of the tree.
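As an illustration of that lookup process, here is a minimal sketch, assuming a simple `TreeNode` class with `val`, `left`, and `right` fields (the class and method names are hypothetical):

```java
// Assumed node definition for illustration
class TreeNode {
    int val;
    TreeNode left, right;
    TreeNode(int val) { this.val = val; }
}

class BstSearch {
    // Walk down from the root: go left when the target is smaller,
    // right when it is larger. At most one node per level is visited,
    // so the number of comparisons is bounded by the tree height.
    public static TreeNode search(TreeNode root, int target) {
        TreeNode node = root;
        while (node != null && node.val != target) {
            node = (target < node.val) ? node.left : node.right;
        }
        return node; // null if the target is not present
    }
}
```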
2 briefly describe the balanced binary tree
Balanced binary search tree, usually just called a balanced binary tree. This height-balanced binary tree was proposed in 1962 by the Soviet mathematicians Adelson-Velsky and Landis, so it is also called an AVL tree after their names. It has the following properties:
- It can be an empty tree.
- If it is not an empty tree, the left and right subtrees of every node are themselves balanced binary trees, and the absolute value of the difference between their heights does not exceed 1.
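A minimal sketch of checking this balance condition, reusing the hypothetical `TreeNode` from the sketch above; it only verifies the definition and is not an AVL implementation:

```java
class AvlCheck {
    // Returns the height of the subtree, or -1 if some node violates
    // the AVL condition |height(left) - height(right)| <= 1.
    static int checkedHeight(TreeNode node) {
        if (node == null) return 0;              // an empty tree is balanced
        int lh = checkedHeight(node.left);
        int rh = checkedHeight(node.right);
        if (lh < 0 || rh < 0 || Math.abs(lh - rh) > 1) return -1;
        return Math.max(lh, rh) + 1;
    }

    static boolean isBalanced(TreeNode root) {
        return checkedHeight(root) >= 0;
    }
}
```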
3 briefly describe the red black tree
The red-black tree improves on the binary search tree: it is self-balancing, so it does not degenerate the way an ordinary binary search tree can (for example, into a linked list when the input is already sorted).
In a red-black tree, the longest path from the root to a leaf is at most twice the shortest such path, which keeps queries fast.
A red-black tree maintains balance through recoloring, left rotations, and right rotations.
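As a rough sketch of one of those rebalancing operations, here is a left rotation around a node, again using the hypothetical `TreeNode` from above (a real red-black tree also tracks node colors and parent links, which are omitted here):

```java
class RotationSketch {
    // Left rotation: the right child y is lifted above x,
    // and y's former left subtree T2 becomes x's right subtree.
    //
    //       x                 y
    //      / \               / \
    //     T1  y      =>     x   T3
    //        / \           / \
    //       T2  T3        T1  T2
    static TreeNode rotateLeft(TreeNode x) {
        TreeNode y = x.right;
        x.right = y.left;
        y.left = x;
        return y; // y is the new subtree root; the caller relinks it to x's parent
    }
}
```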
4 briefly describe the B tree
For large-scale data storage, an index built as a binary search tree becomes too deep, so each query causes too many disk I/O operations and is inefficient. Given how external storage works, a basic idea for solving this problem is to use a multi-way tree structure.
For now, think of a B tree as a tree in which each node can hold many keys, so each level stores a large amount of data and the tree stays shallow.
5 briefly describe the full binary tree
A binary tree is a full binary tree if every layer contains the maximum possible number of nodes. That is, if the binary tree has k layers and the total number of nodes is 2^k - 1, then it is a full binary tree.

```
                0
            /       \
           1         2
         /   \     /   \
        3     4   5     6
       / \   / \ / \   / \
      7   8 9 10 11 12 13 14
```
6 briefly describe the complete binary tree
Complete binary tree: the number of nodes in a complete binary tree can be arbitrary. Visually it is a triangle with a piece missing, and the missing part must be a contiguous block at the lower right; that is, only the last layer may be incomplete, and its nodes are packed to the left. For a complete binary tree with k layers, the number of nodes N satisfies 2^(k-1) - 1 < N ≤ 2^k - 1.

```
                0
            /       \
           1         2
         /   \     /   \
        3     4   5     6
       / \   / \ /
      7   8 9 10 11
```
A full binary tree must be a complete binary tree, while a complete binary tree is not necessarily a full binary tree.
7 briefly describe the heap
A heap is a complete binary tree implemented with an array. It comes in two forms: the max-heap and the min-heap. In a max-heap, the value of every parent node is greater than the values of its children, and this property holds for every node in the heap. Because of this property, a max-heap always stores the maximum value at the root of the tree.
However, although the root of the heap stores the largest or smallest element, the order of the other nodes is unknown. For example, in a max-heap the largest element is always at index 0, but the smallest element is not necessarily the last element; the only guarantee is that the smallest element is some leaf node, though we do not know which one.
Using an array to implement a tree-like data structure may seem strange, but it is very efficient in both time and space.
The following example shows how an array implements a max-heap without any extra space.
[ 10, 7, 2, 5, 1 ]
But if we are not allowed to use pointers, how do we know which node is a parent and which nodes are its children? Good question! There is a mapping between a node's index in the array and the indexes of its parent and children. If i is the index of a node, the following formulas give the positions of its parent and children in the array:
```
parent(i) = floor((i - 1) / 2)
left(i)   = 2i + 1
right(i)  = 2i + 2
```
Note that right(i) is simply left(i) + 1, so the left and right children are always in adjacent positions. Also, not every min-heap is a sorted array! To turn a heap into a sorted array, you need heap sort.
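A minimal sketch of these index formulas applied to the example array above; the helper names are arbitrary:

```java
class HeapIndex {
    static int parent(int i) { return (i - 1) / 2; } // integer division acts as floor for i >= 1
    static int left(int i)   { return 2 * i + 1; }
    static int right(int i)  { return 2 * i + 2; }

    public static void main(String[] args) {
        int[] heap = {10, 7, 2, 5, 1};
        // Children of the root 10 (index 0) are 7 and 2
        System.out.println(heap[left(0)] + ", " + heap[right(0)]);  // 7, 2
        // Parent of 5 (index 3) is 7 (index 1)
        System.out.println(heap[parent(3)]);                        // 7
    }
}
```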
2 sorting algorithm
2.1 comparison of sorting algorithms
Sorting algorithm | Average time complexity | Best case | Worst case | Space complexity | Stability |
---|---|---|---|---|---|
Bubble sort | O(N^2) | O(N) | O(N^2) | O(1) | stable |
Quick sort | O(NlogN) | O(NlogN) | O(N^2) | O(logN) | unstable |
Insertion sort | O(N^2) | O(N) | O(N^2) | O(1) | stable |
Shell sort | O(NlogN) | O(NlogN) | O(NlogN) | O(1) | unstable |
Selection sort | O(N^2) | O(N^2) | O(N^2) | O(1) | unstable |
Heap sort | O(NlogN) | O(NlogN) | O(NlogN) | O(1) | unstable |
Merge sort | O(NlogN) | O(NlogN) | O(NlogN) | O(N) | stable |
2.2 bubble sorting
1. Briefly describe the bubble sorting process
- Start from the first element of the array and keep comparing each element with the next one. If the next element is smaller than the current one, swap them; after one pass, the last position of the array holds the maximum value;
- Repeat the same steps to find the second-largest value and put it in the second-to-last position, and so on until the array is sorted;
- You can also add a flag to record whether any elements were swapped during a pass. If not, the array is already sorted and the sort can stop early.
2 handwritten bubble sorting algorithm
```java
// 2021.9.10 new version
class Main {
    public static int[] bubbleSort(int[] nums) {
        int len = nums.length;
        boolean flag = false;
        int assistNum;
        // A total of len - 1 rounds
        for (int i = 0; i < len - 1; i++) {
            // One pass of comparisons in each round
            for (int j = 0; j < len - i - 1; j++) {
                if (nums[j] > nums[j + 1]) {
                    assistNum = nums[j];
                    nums[j] = nums[j + 1];
                    nums[j + 1] = assistNum;
                    flag = true;
                }
            }
            if (!flag) {
                // No swaps in this round: the array is already sorted
                break;
            } else {
                flag = false;
            }
        }
        return nums;
    }
}
```
2.3 quick sort
1. Briefly describe the quick sort process
- First, choose a pivot (benchmark) value. I usually pick the leftmost value of the array as the pivot, and set two pointers pointing to the leftmost and rightmost elements of the array;
- The right pointer moves left looking for a number smaller than the pivot, one position at a time, until it finds one; then the left pointer moves right looking for a number greater than the pivot, one position at a time, until it finds one;
- Swap the numbers at the two pointers, then repeat the previous step;
- When the two pointers meet, end the loop and swap the pivot with the value the pointers point to; at this point every number to the left of that position is less than or equal to the pivot, and every number to the right is greater than or equal to it;
- Then sort the subarrays to the left and right of that position in the same way; the whole array is sorted recursively.
2 why is the worst-case time complexity of quick sort worse than that of heap sort, yet in practice quick sort usually performs better than heap sort?
Because each step of heap sort works against the program's locality of reference: every heap adjustment takes an element from the top of the heap and sifts it down toward the bottom, so comparisons and swaps happen between elements that are far apart rather than adjacent, and data is constantly moved across levels of the memory hierarchy. Quick sort, by contrast, uses divide and conquer: comparisons happen within a bounded segment of the array, so its locality is quite good.
3 handwritten quick sort algorithm
```java
// An array is not a primitive type, so the input can be sorted in place without copying it
public int[] sortArray(int[] nums) {
    quickSort(nums, 0, nums.length - 1);
    return nums;
}

// Quick sort sorts in place without an extra array
public void quickSort(int[] nums, int left, int right) {
    // Recursion needs a termination condition
    if (left >= right) {
        return;
    }
    int l = left;
    int r = right;
    // The pivot (benchmark) defaults to the index of the left pointer
    int benchmark = l;
    // Temporary variable for swapping
    int tempValue;
    while (l < r) {
        // The right pointer must move first, otherwise the result is wrong
        while (l < r && nums[r] >= nums[benchmark]) {
            r -= 1;
        }
        while (l < r && nums[l] <= nums[benchmark]) {
            l += 1;
        }
        // Swap the values at the two pointers
        tempValue = nums[l];
        nums[l] = nums[r];
        nums[r] = tempValue;
    }
    // When the loop ends, l equals r (never l > r); swap the pivot into place
    tempValue = nums[l];
    nums[l] = nums[benchmark];
    nums[benchmark] = tempValue;
    // Recurse on both halves, excluding the pivot position
    quickSort(nums, left, l - 1);
    quickSort(nums, r + 1, right);
}
```
2.4 insert sort
1. Briefly describe the process of insertion sorting
- The array is divided into two parts: an ordered part on the left and an unordered part on the right;
- Take the first element of the unordered part, find its position within the ordered part and insert it there; repeat until the array is sorted.
2 handwritten insertion sort algorithm
```java
public int[] insertSort(int[] nums) {
    // The key to insertion sort is maintaining an ordered part and an unordered part
    for (int i = 1; i < nums.length; i++) {
        int curIndex = i;
        // First save the leftmost value of the unordered part in a temporary variable
        int value = nums[i];
        while (curIndex > 0 && value < nums[curIndex - 1]) {
            nums[curIndex] = nums[curIndex - 1];
            curIndex -= 1;
        }
        nums[curIndex] = value;
    }
    return nums;
}
```
2.5 Shell sort
1. Describe the sorting process
Shell sort is essentially insertion sort performed with a gap (increment): the array is sorted in groups of elements separated by the gap, and the gap keeps shrinking, so most of the array becomes nearly ordered; a final pass with gap 1 completes the sort.
Code reference: Portal
Theoretical knowledge reference: Portal
2 handwritten Shell sort algorithm
```java
// Third revision, September 10, 2021
public int[] shellSort(int[] nums) {
    int increase = nums.length;
    // Straight insertion sort is just Shell sort with increase = 1
    while (increase > 1) {
        // Halve the gap each round
        increase = increase / 2;
        // Each round performs an insertion sort on each of the `increase` groups
        for (int i = 0; i < increase; i++) {
            // Insertion sort for one group: the first element of the group is trivially
            // ordered, so start from the second element of the group
            for (int j = i + increase; j < nums.length; j += increase) {
                int curIndex = j;
                int value = nums[curIndex];
                while (curIndex > i && value < nums[curIndex - increase]) {
                    nums[curIndex] = nums[curIndex - increase];
                    curIndex -= increase;
                }
                nums[curIndex] = value;
            }
        }
    }
    return nums;
}
```
2.6 selection sort
1. Briefly describe the process of selection sort
Traverse the array once and swap the maximum value with the last element; then traverse the remaining elements (excluding the last one) and swap their maximum with the second-to-last element; repeat until the array is sorted.
2 handwritten selection sort algorithm
```java
// Selection sort is even less efficient than bubble sort in practice
public int[] selectSort(int[] nums) {
    // The outer loop counts the rounds (one swap per round)
    int assist;
    for (int i = 0; i < nums.length - 1; i++) {
        int maxIndex = 0;
        // The inner loop finds the maximum of the current unordered part
        for (int j = 1; j < nums.length - i; j++) {
            if (nums[j] > nums[maxIndex]) {
                maxIndex = j;
            }
        }
        // Swap the maximum with the last element of the unordered part
        assist = nums[nums.length - i - 1];
        nums[nums.length - i - 1] = nums[maxIndex];
        nums[maxIndex] = assist;
    }
    return nums;
}
```
2.7 heap sorting
1. Briefly describe the process of heap sorting
Heap sort is actually a kind of selection sort. Each round it obtains the maximum of the remaining elements by maintaining a max-heap, swaps that value with the last element of the unsorted part, and repeats until the array is sorted.
2 concept of the heap
See the description of the heap in section 1.1, question 7 above: a complete binary tree implemented with an array, in max-heap and min-heap forms, where parent(i) = floor((i - 1) / 2), left(i) = 2i + 1, and right(i) = 2i + 2.
3 handwritten heap sort algorithm
```java
// Heap sort, second summary on September 10, 2021
public void heapSort(int[] nums) {
    // Build the max-heap
    createMaxHeap(nums);
    int size = nums.length;
    while (size > 1) {
        // Move the current maximum (the root) to the end of the unsorted part
        swapPosition(nums, 0, size - 1);
        size -= 1;
        adjustMaxHeap(nums, 0, size);
    }
}

// Build the max-heap (sift each newly inserted node up toward the root).
// The invariant of a max-heap is that every child is no larger than its parent.
public void createMaxHeap(int[] nums) {
    for (int i = 0; i < nums.length; i++) {
        // Index of the node being inserted
        int currentNode = i;
        // Index of its parent
        int fatherNode = (currentNode - 1) / 2;
        // If the inserted value is greater than its parent, swap them and move the index
        // up to the parent; repeat until the value is no longer greater than its parent.
        // fatherNode can never go below 0, so no extra bound check is needed.
        while (nums[fatherNode] < nums[currentNode]) {
            swapPosition(nums, fatherNode, currentNode);
            currentNode = fatherNode;
            fatherNode = (currentNode - 1) / 2;
        }
    }
}

// Restore the max-heap property for the remaining `size` elements (sift the root down);
// the tail of the array already holds the sorted values and is excluded by `size`.
public void adjustMaxHeap(int[] nums, int fatherNode, int size) {
    int maxNode;
    int leftNode = 2 * fatherNode + 1;
    int rightNode = 2 * fatherNode + 2;
    while (leftNode < size) {
        // Pick the larger child (check the bound before reading rightNode)
        if (rightNode < size && nums[leftNode] < nums[rightNode]) {
            maxNode = rightNode;
        } else {
            maxNode = leftNode;
        }
        // If the larger child is smaller than the parent, the heap property already holds
        if (nums[maxNode] < nums[fatherNode]) {
            break;
        }
        swapPosition(nums, maxNode, fatherNode);
        // Continue sifting down along the branch of the larger child;
        // the other child's subtree was not touched, so it is still a valid heap
        fatherNode = maxNode;
        leftNode = 2 * fatherNode + 1;
        rightNode = 2 * fatherNode + 2;
    }
}

public void swapPosition(int[] nums, int x, int y) {
    int tempValue = nums[x];
    nums[x] = nums[y];
    nums[y] = tempValue;
}
```
2.8 merge sort
1. Briefly describe the process of merging and sorting
Merge sort repeatedly applies the idea of divide and conquer: split the array in half, sort each half recursively, and then merge the two sorted halves.
reference: Portal
2 handwritten merge sort algorithm
```java
// 2021.8.1, written by myself
class Solution {
    public int[] sortArray(int[] nums) {
        divide(nums, 0, nums.length - 1);
        return nums;
    }

    public void divide(int[] nums, int left, int right) {
        if (left >= right) {
            return;
        }
        int mid = left + (right - left) / 2;
        // The split mirrors the merge; the important part is the merging step
        // These two calls are the "divide" part of divide and conquer
        divide(nums, left, mid);
        divide(nums, mid + 1, right);
        // This is the "conquer" (merge) part
        conquer(nums, left, mid, right);
    }

    public void conquer(int[] nums, int left, int mid, int right) {
        int[] tempArray = new int[right - left + 1];
        // The first half is [left, mid], the second half is [mid + 1, right]
        int index1 = left;
        int index2 = mid + 1;
        int index = 0;
        // The loop ends when one of the two indexes passes its bound;
        // both cannot pass their bounds in the same iteration
        while (index1 <= mid && index2 <= right) {
            // If this condition were < instead of <=, the sort would not be stable
            if (nums[index1] <= nums[index2]) {
                tempArray[index++] = nums[index1++];
            } else {
                tempArray[index++] = nums[index2++];
            }
        }
        // After the loop, either index1 > mid or index2 > right, never both;
        // copy whatever remains of the other half
        if (index1 <= mid) {
            // Arguments: source array, source start, destination array, destination start, length
            System.arraycopy(nums, index1, tempArray, index, mid - index1 + 1);
        }
        if (index2 <= right) {
            System.arraycopy(nums, index2, tempArray, index, right - index2 + 1);
        }
        System.arraycopy(tempArray, 0, nums, left, right - left + 1);
    }
}
```
3 other algorithms
1 how do you understand depth-first traversal
Depth-first traversal explores as deep as possible first. The main idea is to start from an unvisited vertex in the graph, follow one path to its end, backtrack from the end of that path to the previous node, and then follow another path to its end; this process is repeated recursively until all nodes have been traversed.
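A minimal sketch of recursive depth-first traversal, assuming the graph is stored as an adjacency list (`graph.get(v)` returns the neighbors of vertex `v`; the names here are illustrative):

```java
import java.util.List;

class DfsSketch {
    // graph.get(v) is the list of neighbors of vertex v
    static void dfs(List<List<Integer>> graph, int v, boolean[] visited) {
        visited[v] = true;
        System.out.println("visit " + v);
        for (int next : graph.get(v)) {
            if (!visited[next]) {
                dfs(graph, next, visited); // go as deep as possible, then backtrack
            }
        }
    }
}
```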
2 how do you understand breadth first traversal
Breadth-first traversal visits nodes layer by layer. In implementation, it needs a queue to maintain the order in which nodes are visited, so that the neighbors of those nodes are then visited in the same order.
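A minimal sketch of breadth-first traversal with a queue, using the same assumed adjacency-list representation as the DFS sketch above:

```java
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

class BfsSketch {
    // Visit vertices layer by layer starting from `start`, using a queue
    static void bfs(List<List<Integer>> graph, int start) {
        boolean[] visited = new boolean[graph.size()];
        Queue<Integer> queue = new ArrayDeque<>();
        visited[start] = true;
        queue.offer(start);
        while (!queue.isEmpty()) {
            int v = queue.poll();
            System.out.println("visit " + v);
            for (int next : graph.get(v)) {
                if (!visited[next]) {
                    visited[next] = true;  // mark when enqueued to avoid duplicates
                    queue.offer(next);
                }
            }
        }
    }
}
```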
3 talk about the idea of dynamic programming
- When a problem asks for an optimal (maximum or minimum) value, think of dynamic programming;
- When you think of dynamic programming, first think of the brute-force solution;
- When you think of the brute-force solution, think of enumerating all the substructures;
- Once all the substructures are listed, notice that they overlap with one another;
- Once the overlap is found, realize that the brute-force solution is too expensive and ask whether each substructure can be computed just once; then look for a recurrence relation between the substructures and identify the base case (a small example is sketched below).
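As a small example of this checklist, consider the classic climbing-stairs problem: the number of ways to reach step n is ways(n - 1) + ways(n - 2), the substructures overlap heavily, and each one only needs to be computed once (the class and method names are illustrative):

```java
class DpSketch {
    // Number of distinct ways to climb n stairs taking 1 or 2 steps at a time.
    // Recurrence (overlapping substructures): ways(n) = ways(n - 1) + ways(n - 2)
    // Base cases: ways(1) = 1, ways(2) = 2
    static int climbStairs(int n) {
        if (n <= 2) {
            return n;
        }
        int prev = 1;  // ways(1)
        int cur = 2;   // ways(2)
        for (int i = 3; i <= n; i++) {
            int next = prev + cur;  // each substructure is computed exactly once
            prev = cur;
            cur = next;
        }
        return cur;
    }
}
```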