Writing CRUD code all day really won't raise your coding level, so it's time to strengthen the fundamentals: data structures and algorithms are still well worth studying. I'm opening this column simply to record my own progress learning data structures. The goal is about three months, mastering everything from linked lists all the way to Huffman trees, working through the corresponding LeetCode exercises and writing them up along the way, to speed up my own learning and keep growing!!!
My personal purpose in studying data structures is also very clear, simply to:
◼ write higher-performance programs
◼ learn new technologies quickly
◼ open a new door
◼ keep my brain from rusting by putting it to work; once mastered, it lasts a lifetime
Here I record the first day's content: complexity.
The test code below uses the classic Fibonacci sequence to compare two approaches; it is simple, concise, and clear.
// First approach to the Fibonacci sequence: plain recursion
public static int fib1(int n) {
    if (n <= 1) {
        return n;
    }
    return fib1(n - 1) + fib1(n - 2);
}

// Second approach to the Fibonacci sequence: iteration
public static int fib2(int n) {
    if (n <= 1) {
        return n;
    }
    int first = 0;
    int second = 1;
    for (int i = 0; i < n - 1; i++) {
        int sum = first + second;
        first = second;
        second = sum;
    }
    return second;
}
public static void main(String[] args) {
    int n = 35;
    Times.test("fib1", new Task() {
        public void execute() {
            System.out.println(fib1(n));
        }
    });
    Times.test("fib2", new Task() {
        public void execute() {
            System.out.println(fib2(n));
        }
    });
}
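The Times/Task helpers used in main() above aren't shown in the post, so here is a minimal sketch of what they might look like (the names and signatures are my assumptions, not the original implementation):

```java
// Minimal sketch of a timing helper, assuming a Times.test(title, task)
// shape like the one called above. Not the post's actual implementation.
public class Times {
    public interface Task {
        void execute();
    }

    // Runs the task once and reports elapsed wall-clock time in milliseconds.
    public static long test(String title, Task task) {
        long begin = System.currentTimeMillis();
        task.execute();
        long elapsed = System.currentTimeMillis() - begin;
        System.out.println("[" + title + "] took " + elapsed + " ms");
        return elapsed;
    }
}
```

A single wall-clock measurement like this is enough to see fib1 vs fib2 at n = 35, but keep in mind it is sensitive to whatever else the machine is doing.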
Test Case
From the test we can clearly see that the second algorithm is far more efficient and faster. This is where big O notation comes in; I had never quite understood what big O really means, but today it finally clicked completely!!!
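The gap exists because fib1 recomputes the same subproblems over and over, taking roughly O(2^n) time, while fib2 is O(n). As an addition of my own (not from the original post), a third variant with memoization keeps the recursive shape of fib1 but caches subresults, also bringing it down to O(n):

```java
// My own illustration, not from the post: memoized recursion.
// Each fib(k) is computed once and cached, so the exponential
// recursion of fib1 collapses to O(n) time (and O(n) space).
public class FibMemo {
    public static int fib3(int n) {
        return fib3(n, new int[Math.max(n + 1, 2)]);
    }

    private static int fib3(int n, int[] cache) {
        if (n <= 1) {
            return n;
        }
        if (cache[n] == 0) { // 0 means "not yet computed"; fib(n) > 0 for n >= 1
            cache[n] = fib3(n - 1, cache) + fib3(n - 2, cache);
        }
        return cache[n];
    }

    public static void main(String[] args) {
        System.out.println(fib3(35)); // 9227465, same answer as fib1/fib2
    }
}
```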
If we were to assess execution efficiency alone, we might come up with a scheme like this:
compare how long different algorithms take to process the same set of inputs, an approach also known as post-hoc statistics
◼ the scheme above has obvious drawbacks
execution time depends heavily on the hardware and on all sorts of uncertain runtime environment factors
we must actually write the corresponding test code
it is difficult to choose test data that guarantees a fair comparison
◼ besides correctness, readability, and robustness (the ability to respond to and handle unreasonable input), algorithms are generally evaluated along the following dimensions
time complexity: an estimate of the number of instructions the program executes (execution time)
space complexity: an estimate of the storage space the program needs to occupy
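As a quick illustration of "counting instructions" (my own example, not from the post), here is where an O(n) estimate comes from for a simple loop:

```java
public class CountDemo {
    // Rough instruction count:
    //   1 (sum = 0) + 1 (i = 0) + (n + 1) (i < n checks)
    //   + n (i++) + n (sum += i)  ≈  3n + 3
    // Dropping constants and lower-order terms gives O(n).
    public static int sumUpTo(int n) {
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(sumUpTo(5)); // 0+1+2+3+4 = 10
    }
}
```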
Big O notation (Big O)
◼ ignore constant factors and lower-order terms
9 >> O(1)
2n + 3 >> O(n)
n² + 2n + 6 >> O(n²)
4n³ + 3n² + 22n + 100 >> O(n³)
(in this notation, n² and n³ denote n squared and n cubed)
◼ Note: big O notation is merely a rough analysis model, an estimate that helps us understand an algorithm's efficiency in a short time
◼ O(1) < O(logn) < O(n) < O(nlogn) < O(n²) < O(n³) < O(2^n) < O(n!) < O(n^n)
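To make the ordering above concrete, a small check of my own (not from the post) evaluates each class at n = 10:

```java
// My own illustration: how each complexity class in the ordering
// above grows, evaluated at the small input n = 10.
public class GrowthDemo {
    static long factorial(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) {
            f *= i;
        }
        return f;
    }

    public static void main(String[] args) {
        int n = 10;
        System.out.printf("log n   ~ %.1f%n", Math.log(n) / Math.log(2)); // ~3.3
        System.out.println("n       = " + n);                             // 10
        System.out.printf("n log n ~ %.1f%n", n * Math.log(n) / Math.log(2)); // ~33.2
        System.out.println("n^2     = " + n * n);                         // 100
        System.out.println("n^3     = " + n * n * n);                     // 1000
        System.out.println("2^n     = " + (1 << n));                      // 1024
        System.out.println("n!      = " + factorial(n));                  // 3628800
    }
}
```

Even at n = 10, O(2^n) has already overtaken O(n³), and O(n!) dwarfs everything else; the gaps only widen as n grows.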
I definitely recommend taking a look at that blogger's analysis of time complexity and space complexity; it is very thorough.