What is an algorithm?
What is an algorithm? Simply put, an algorithm describes a method for solving a problem. The generally accepted definition today is: an algorithm is a description of the steps for solving a particular problem, represented in a computer as a finite sequence of instructions, where each instruction performs one or more operations.
Characteristics of an algorithm
An algorithm has five basic characteristics, namely:
- Input
- Output
- Finiteness
- Determinacy
- Feasibility
Input & output: input and output are essential to an algorithm. An algorithm has zero or more inputs, but it must have at least one output. In other words, an algorithm may have no input, but it must produce output; the output can be something printed, or one or more returned values.
Finiteness: finiteness means that an algorithm terminates by itself after executing a finite number of steps, without falling into an infinite loop, and that each step completes in a finite amount of time.
Determinacy: determinacy means that every step of the algorithm must have a definite meaning, with no ambiguity. As a simple analogy, it is like a person who can follow only one path to the destination, never meeting a fork in the road. In other words, under the same conditions an algorithm has only one execution path, and the same input always produces the same, unique output.
Feasibility: feasibility means that every step of the algorithm must be achievable; that is, each step can be completed in a finite number of executions.
Algorithm design requirements
The same problem can be solved with different algorithms. As the saying goes, all roads lead to Rome: for the same destination, we can get there in different ways.
That said, when designing an algorithm we should still follow certain requirements.
- Correctness: an algorithm must be guaranteed to achieve its goal; otherwise there is nothing else to discuss. Correctness means that the algorithm's input, output, and processing are unambiguous, and that it produces the right answer to the problem.
Correctness comes in degrees; from shallow to deep it can roughly be divided into the following levels:
- The algorithm's program contains no syntax errors.
- The program produces the required output for legal input.
- The program produces output that meets the specification for illegal input.
- The program produces satisfactory output even for deliberately tricky test data.
Readability: readability means the algorithm should be designed so that it is easy to read, understand, and communicate. High readability helps us better understand the algorithm and makes it easier to debug and modify.
Robustness: an algorithm must not only handle reasonable input; it must also respond appropriately to unreasonable input. Robustness means that when the input data is illegal, the algorithm can still handle it sensibly, rather than producing exceptions or inexplicable results.
High time efficiency: for the same problem there are different algorithms, and some are better than others. Time efficiency refers to the execution time of the algorithm: the longer the execution time, the lower the efficiency; the shorter the execution time, the higher the efficiency.
Low storage requirement: besides time efficiency, storage is also an important indicator. Storage refers to the maximum storage space required while the algorithm executes, mainly the memory or external storage occupied by the running program. For the same problem, the less space an algorithm needs, the better it is; the more space it needs, the worse.
How to measure algorithm efficiency
We have covered the characteristics of algorithms and the requirements for designing them, but not yet a concrete way to measure how good an algorithm is. To measure the quality of an algorithm, the concepts of time complexity and space complexity were introduced.
Time complexity
Definition
If there is a function $f(n)$ such that, as $n$ tends to infinity, the limit of $T(n)/f(n)$ is a nonzero constant, then $f(n)$ is said to be a function of the same order of magnitude as $T(n)$. We write $T(n) = O(f(n))$ and call $O(f(n))$ the asymptotic time complexity of the algorithm, or simply its time complexity. Because it is expressed with the symbol big $O$, this is called big-O notation.
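As a concrete instance of this definition (the running time below is an illustrative example, not taken from the article), suppose

```latex
T(n) = 3n^2 + 2n + 1, \qquad
\lim_{n \to \infty} \frac{T(n)}{n^2}
  = \lim_{n \to \infty}\left(3 + \frac{2}{n} + \frac{1}{n^2}\right) = 3 \neq 0 .
```

Since the limit is a nonzero constant, $f(n) = n^2$ is of the same order of magnitude as $T(n)$, and we write $T(n) = O(n^2)$.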
Rules for deriving time complexity
- If the running time is of constant order, represent it with the constant 1, i.e. $O(1)$;
- Keep only the highest-order term of the time function: e.g. $O(n^2 + 4n)$ becomes $O(n^2)$ after keeping only the highest-order term;
- If a highest-order term exists, omit its coefficient: e.g. $O(4n^2)$ becomes $O(n^2)$ after the coefficient is dropped;
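Applying these rules step by step to a hypothetical running time (the expression is illustrative, not from the article):

```latex
T(n) = 4n^2 + 4n + 5
\;\longrightarrow\; 4n^2 \quad\text{(keep only the highest-order term)}
\;\longrightarrow\; n^2 \quad\text{(drop the coefficient)} ,
```

so the time complexity is $O(n^2)$.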
Methods for analyzing time complexity
To summarize, there are three practical methods for analyzing the time complexity of a piece of code:
- Focus only on the line of code that is executed the most times in a loop;
- Addition principle: the total complexity equals the complexity of the largest piece of code;
- Multiplication principle: the complexity of nested code equals the product of the complexities of the inner and outer code;
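The addition and multiplication principles can be sketched in code. This is a minimal illustration; the class name `ComplexityDemo` and the operation counter are assumptions of this sketch, not part of the original article:

```java
// Counts elementary operations to make the two principles visible.
class ComplexityDemo {
    static long work(int n) {
        long ops = 0;
        // Block 1: a single loop -- O(n).
        for (int i = 0; i < n; i++) {
            ops++;
        }
        // Block 2: nested loops -- O(n) * O(n) = O(n^2) (multiplication principle).
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                ops++;
            }
        }
        // Total: O(n) + O(n^2) = O(n^2); the larger block dominates
        // (addition principle).
        return ops; // exactly n + n^2 operations
    }
}
```

For `n = 10` this performs 10 + 100 = 110 operations; as `n` grows, the $n^2$ term dominates, which is exactly what the addition principle says.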
Common time complexity curves
[Figure: growth curves of the common time complexities, from $O(1)$ up to $O(n!)$]
Common time complexities
$O(1)$
If the number of lines executed does not depend on the input size, the complexity of the code is $O(1)$. In the following code, suppose each line of code takes the same time $t$ to execute; then each of the two `println` lines executes once, i.e. $t + t = 2t$. The execution time is a constant.
```java
void sayHello(String name) {
    System.out.println("Hello, " + name);
    System.out.println("Welcome to my official account: [Village Yuyao]");
}
```
$O(\log n)$
Take the binary search code below. Each pass through the `while` loop halves the search range. Suppose it takes $x$ iterations to exit the loop; then $num \cdot 2 \cdot 2 \cdots 2 = n$, where $num$ is a constant and there are $x$ factors of $2$. That is, $num \cdot 2^x = n$, which gives $x = \log_2(n / num)$, so in big-O notation the time complexity is $O(\log n)$.
```java
int binarySearch(int[] arr, int target) {
    int left = 0;
    int right = arr.length - 1;
    while (left <= right) {
        int middle = left + (right - left) / 2; // overflow-safe midpoint
        if (arr[middle] == target) {
            return middle;
        } else if (arr[middle] > target) {
            right = middle - 1;
        } else {
            left = middle + 1;
        }
    }
    return -1;
}
```
$O(n)$
In the following code, the body of the `for` loop executes `arr.length` times, so the time required is proportional to the length of the array, and the time complexity can be written as $O(n)$. Using the derivation rules and analysis methods above, the lines executed the most times are the loop condition and the loop body; the total execution time is $O(2n)$, and after dropping the coefficient we get the final time complexity $O(n)$.
```java
int sum(int[] arr) {
    int total = 0;
    for (int i = 0; i < arr.length; i++) {
        total += arr[i];
    }
    return total;
}
```
$O(n \log n)$
If we take code whose complexity is $O(\log n)$ and execute it $n$ times, doesn't the complexity then become $O(n \log n)$?
```java
void hello(int n) {
    for (int i = 1; i < n; i++) {
        int m = 1;
        while (m < n) { // this inner loop runs O(log n) times
            m *= 2;
        }
    }
}
```
$O(n^2)$
Suppose we take code whose time complexity is $O(n)$ and execute it $n$ times; the time complexity is then $n \cdot O(n)$, which can be written as $O(n^2)$. This is the familiar double loop.
```java
void selectionSort(int[] arr, int n) {
    for (int i = 0; i < n; i++) {
        int min = i;
        // Find the index of the smallest remaining element.
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[min]) {
                min = j;
            }
        }
        // Swap it into position i.
        int tmp = arr[i];
        arr[i] = arr[min];
        arr[min] = tmp;
    }
}
```
O ( n 3 ) O(n^3) O(n3)
and O ( n 2 ) O(n^2) O(n2), similar , Time complexity is O ( n 2 ) O(n^2) O(n2) Code nested loop once , Now the complexity becomes O ( n 3 ) O(n^3) O(n3), This is the form of triple loop nesting .
```java
void demo(int n) {
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < n; k++) {
                System.out.print("Hello, World");
            }
            System.out.print("------");
        }
        System.out.print("******");
    }
}
```
$O(n!)$
Although algorithms with time complexity $O(n!)$ exist in theory, we rarely encounter them in practice, so they are not expanded on here.
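For the curious, a classic source of $O(n!)$ running time is brute-force enumeration of all orderings of $n$ elements, since there are $n!$ of them. The sketch below goes slightly beyond the article; the class and method names are made up for illustration:

```java
// Counts all permutations of n elements by trying every unused element
// at each position: n * (n-1) * ... * 1 = n! complete recursions.
class FactorialDemo {
    static long countPermutations(boolean[] used, int placed) {
        if (placed == used.length) {
            return 1; // one complete permutation
        }
        long total = 0;
        for (int i = 0; i < used.length; i++) {
            if (!used[i]) {
                used[i] = true;                            // choose element i
                total += countPermutations(used, placed + 1);
                used[i] = false;                           // undo the choice
            }
        }
        return total;
    }
}
```

`countPermutations(new boolean[4], 0)` returns 24, i.e. $4!$; already at $n = 13$ the count exceeds six billion, which is why $O(n!)$ algorithms are impractical.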
Space complexity
Definition
Space complexity measures the amount of storage space an algorithm temporarily occupies while running (that is, excluding the memory occupied by the input itself, only the extra storage the algorithm uses). It reflects the trend of memory usage rather than a specific amount of memory, so it is also called asymptotic space complexity. It represents how the algorithm's storage space grows with the data size and is denoted $S(n)$;
Common space complexities
$O(1)$
If the temporary space the algorithm needs does not depend on the size of a variable `n`, its space complexity is a constant, written $S(n) = O(1)$;
```java
int num1 = 1;
int num2 = 2;
int total = num1 + num2;
```
$O(n)$
The array below occupies memory proportional to `n`, and no further space is allocated afterwards, so the space complexity of the algorithm is $S(n) = O(n)$;
```java
int[] arr = new int[n];
```
$O(n^2)$
For example, a two-dimensional array:
```java
int[][] arr = new int[n][n];
```
Time and space complexity of common sorting algorithms
For the sorting algorithms commonly asked about in interviews, the following table summarizes their time complexity, space complexity, and stability.
Sorting algorithm | Average time complexity | Best time complexity | Worst time complexity | Space complexity | Stability |
---|---|---|---|---|---|
Insertion sort | $O(n^2)$ | $O(n)$ | $O(n^2)$ | $O(1)$ | Stable |
Shell sort | $O(n^{1.3})$ | $O(n)$ | $O(n^2)$ | $O(1)$ | Unstable |
Selection sort | $O(n^2)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Unstable |
Heap sort | $O(n\log_2 n)$ | $O(n\log_2 n)$ | $O(n\log_2 n)$ | $O(1)$ | Unstable |
Bubble sort | $O(n^2)$ | $O(n)$ | $O(n^2)$ | $O(1)$ | Stable |
Quick sort | $O(n\log_2 n)$ | $O(n\log_2 n)$ | $O(n^2)$ | $O(\log_2 n)$ | Unstable |
Merge sort | $O(n\log_2 n)$ | $O(n\log_2 n)$ | $O(n\log_2 n)$ | $O(n)$ | Stable |
Counting sort | $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | Stable |
Bucket sort | $O(n+k)$ | $O(n)$ | $O(n^2)$ | $O(n+k)$ | Stable |
Radix sort | $O(n \cdot k)$ | $O(n \cdot k)$ | $O(n \cdot k)$ | $O(n+k)$ | Stable |
Summary
That is all for today's article. It mainly introduced the definition of an algorithm, the characteristics of algorithms, the requirements for designing algorithms, and how to measure algorithm efficiency. It also covered the definition of time complexity, its derivation rules, and the common time complexities, as well as the definition of space complexity and the common space complexities. Finally, it summarized the time and space complexity of common sorting algorithms.
Author: Village Yuyao
Game programming, an account for game development enthusiasts ~