Basic concepts and terminology of data structures, time complexity of algorithms

!!! Data structure: a collection of data elements that have one or more specific relationships with each other.

1. Related terms

1. Data:
Data are symbols, but with two prerequisites: they can be input into a computer and they can be processed by computer programs. Examples include integers, reals, sound, and images.
2. Data elements:
It is the basic unit that makes up the data. For example, the data elements of animals include cats and dogs.
3. Data items:
A data element can be composed of multiple data items. For example, a person has data items such as age, height, and weight. **A data item is the smallest unit of data.**
4. Data objects:
It is a collection of data elements of the same nature.


2. Structure


1. Logical structure:
It is the relationship between data elements in the data object.
(1) Set structure:
In a set structure, data elements have no relationship to one another other than belonging to the same set.
E.g:

typedef struct person
{
    char  name[10];   // name
    int   year;       // age
    float tall;       // height
}user;

The name, age, and height members of this structure form a set structure: apart from belonging to the same struct, they have no other relationship to one another.

(2) Linear structure
Data elements are in a one-to-one relationship
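For example, the elements a₁, a₂, …, aₙ of a sequence form a linear structure: every element except the first has exactly one predecessor, and every element except the last has exactly one successor.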


(3) Tree structure
Data elements are in a one-to-many, hierarchical relationship, like the branches of a tree.
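For instance, a binary tree node can be sketched in C as follows (a minimal illustration not taken from the original article; the type and field names are assumptions):

typedef struct treenode
{
    int data;                      // the data element stored in this node
    struct treenode *left;         // pointer to the left child
    struct treenode *right;        // pointer to the right child
}treenode;

A parent node can point to more than one child, which is the one-to-many relationship.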
(4) Graph structure
Data elements are in a many-to-many relationship.
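Such a many-to-many relationship could, for instance, be recorded with an adjacency matrix (a minimal sketch not taken from the original article; MAXVEX and the array name edge are assumptions):

#define MAXVEX 10                  // assumed maximum number of vertices

int edge[MAXVEX][MAXVEX];          // edge[i][j] = 1 means vertex i and vertex j are connected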
2. Physical structure
Simply put, it is how to store data elements in the computer's memory.
(1) Sequential storage structure:
Data elements are stored in storage units with consecutive addresses, such as an array.
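E.g., a small array whose elements sit at consecutive addresses (a minimal sketch; the array name a is just for illustration):

int a[5] = {1, 2, 3, 4, 5};   // a[0] through a[4] occupy consecutive memory addresses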
(2) Chained storage structure:
Data elements can be stored in any storage units, whether consecutive or not; a pointer is used to hold the address of each data element.
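E.g., a node of a singly linked list (a minimal sketch not taken from the original article; the names node and next are assumptions):

typedef struct node
{
    int data;                // the data element itself
    struct node *next;       // pointer holding the address of the next element
}node;

The nodes may lie anywhere in memory; the next pointers chain them together.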




!!! The time complexity of the algorithm

The time complexity of an algorithm is its time measure, written as **T(n) = O(f(n))**, where f(n) is some function of the problem size n.
The method of deriving the big O order:
(1) Replace all additive constants in the running time with the constant 1.
(2) In the modified run-count function, keep only the highest-order term.
(3) If the highest-order term exists and its coefficient is not 1, remove the coefficient.
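For example, if an algorithm's running time is T(n) = 2n² + 3n + 1, keep only the highest-order term 2n² and remove its coefficient 2; the resulting big O order is O(n²).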
1. Constant order
An algorithm whose statements execute a constant number of times, independent of n, has time complexity O(1), called constant order. Example:
int sum = 1, n = 100;   // executed once
sum = (1+n)*n/2;        // executed once
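No matter how large n is, each of these two statements executes exactly once, so the time complexity is O(1).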

2. Linear order
To determine the order of an algorithm, you need to count how many times a particular statement or set of statements runs. Therefore, the key to analyzing an algorithm's complexity is analyzing the operation of its loop structures.

int sum=0;
for(int i=0;i<n;i++)
{
    sum=sum+i;      // executed n times
}

The time complexity of this code is O(n).

3. Logarithmic order:
int count=1;
while(count<n)
{
    count=count*2;
}

count doubles on each pass through the loop, and the loop exits once count is greater than or equal to n. Using x to denote the number of iterations, we get 2^x = n, so x = log₂n, that is f(n) = log₂n, and the time complexity of this algorithm is O(log₂n).

4. Square order:
The time complexity of a loop equals the complexity of the loop body multiplied by the number of times the loop runs.
Example: the inner loop has a time complexity of O(n) and the outer loop runs n times, so the total time complexity is O(n²).
for(int i=0;i<n;i++)
{
    for(int j=0;j<n;j++)
    {
        // ... loop body
    }
}
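The innermost statements execute n × n = n² times in total, which is why the time complexity is O(n²).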



Common time complexities, ordered by the time they consume from least to most:
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)



Origin blog.csdn.net/dfwef24t5/article/details/105228753