The usage of delay in C language

Reposted from: Micro reading   https://www.weidianyuedu.com

  C is a high-level programming language that is widely used in computer software development, including microcontroller programming. The following article explains in detail how delays are written in C; I hope you find it useful.

  Delays in C are typically written as follows:

  Suppose a delay function is as follows:

  void delay()
  {
      unsigned int i;                  /* the original uses "uint", a common typedef for unsigned int */
      for (i = 0; i < 20000; i++);     /* empty busy-wait loop */
  }

  How do we calculate the delay? First convert 20000 to hexadecimal: 0x4E20. The high byte 0x4E is 78 decimal; 78 × 6 = 468, and 468 + 2 = 470. The low byte 0x20 is 32 decimal, and 470 × 32 = 15040. So this delay function executes roughly 15040 machine cycles in total. With a 12 MHz crystal one machine cycle takes 1 µs, so the total delay is about 15.04 ms.

  With this formula, if we want a specific delay length, we can work backwards from it. That is: total machine cycles = [(6 × high byte of the loop limit) + 2] × (low byte of the loop limit).
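  As a quick sketch of that rule of thumb (a hypothetical helper, not part of the original article), the estimate can be written out in C:

  /* Rough estimate of machine cycles for "for (i = 0; i < count; i++);"
     using the empirical formula above. Illustrative helper only. */
  unsigned long estimate_cycles(unsigned int count)
  {
      unsigned char high = count >> 8;       /* high byte of the loop limit */
      unsigned char low  = count & 0xFF;     /* low byte of the loop limit  */
      return (6UL * high + 2UL) * low;       /* [(6 * high) + 2] * low      */
  }

  /* estimate_cycles(20000) = (6 * 78 + 2) * 32 = 15040 cycles, i.e. 15.04 ms at 12 MHz. */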

  For example, suppose we need a delay of 125 ms.

  First fix the low byte at 200 decimal (0xC8). With a 12 MHz crystal, 125 ms is 125000 machine cycles, so from the formula 125000 = 200 × ((high byte × 6) + 2), which means (high byte × 6) + 2 must equal 625. That gives high byte = (625 − 2) / 6 ≈ 103.8, rounded to 104. Converting 104 to hexadecimal gives 0x68; combining the high and low bytes gives the loop limit 0x68C8. Substituting into the function:

  void delay()
  {
      unsigned int i;
      for (i = 0; i < 0x68C8; i++);    /* about 125 ms at 12 MHz */
  }

  If you prefer to write the limit directly in decimal, convert 0x68C8 to decimal, which is 26824, and the function becomes:

  void delay()
  {
      unsigned int i;
      for (i = 0; i < 26824; i++);     /* same delay, limit written in decimal */
  }
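  As a check against the formula above: the high byte 0x68 is 104 decimal and the low byte 0xC8 is 200, so the estimate is (6 × 104 + 2) × 200 = 125200 machine cycles, about 125.2 ms at 12 MHz, close to the 125 ms target.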

  You can also embed an assembly delay routine in a C program; check your compiler manual, it is quite simple.

  C and assembly can be mixed on a microcontroller. Delays written in assembly are more accurate, so you might as well write the delay as an assembly routine and call it from C; it is genuinely hard to program an accurate delay in pure C.
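  For instance, the cycle-exact part can live in a separate assembly module and be called from C. A minimal sketch, assuming a Keil C51 style project; DelayAsm and wait_precise are names chosen here purely for illustration:

  extern void DelayAsm(void);    /* delay routine written in an assembly module,
                                    with its cycle count worked out by hand */

  void wait_precise(void)        /* illustrative C wrapper around the assembly delay */
  {
      DelayAsm();
  }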

  Who says C cannot delay accurately? Use the 51 microcontroller's timer/counter 1 (or timer/counter 2) in working mode 2, the 8-bit auto-reload mode, and the delay is precise, whether 1 ms or 100 µs.
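  A minimal sketch of that idea, assuming a 12 MHz crystal (one machine cycle per microsecond) and Keil-style SFR names from reg51.h; delay_ms is a name chosen here for illustration:

  #include <reg51.h>                      /* Keil C51 SFR definitions */

  /* Timer 1 in mode 2 auto-reloads TH1 into TL1 on every overflow,
     so each overflow here marks 250 machine cycles = 250 us. */
  void delay_ms(unsigned int ms)
  {
      unsigned char ticks;

      TMOD = (TMOD & 0x0F) | 0x20;        /* Timer 1, mode 2 (8-bit auto-reload) */
      TH1  = 256 - 250;                   /* reload value: overflow every 250 us */
      TL1  = TH1;
      TR1  = 1;                           /* start Timer 1 */

      while (ms--)
      {
          for (ticks = 0; ticks < 4; ticks++)   /* 4 x 250 us = 1 ms */
          {
              while (!TF1);               /* wait for the overflow flag */
              TF1 = 0;                    /* clear it and wait for the next one */
          }
      }
      TR1 = 0;                            /* stop the timer */
  }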

  In fact it is very simple: use a timer. Work out the reload value for the required interval and handle it in the microcontroller's timer interrupt routine. This gives precise timing, and the remaining error is determined only by the crystal oscillator.
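  A minimal sketch of the interrupt approach, again assuming a 12 MHz crystal and Keil C51 conventions (reg51.h and the interrupt keyword); the names ms_ticks, timer0_isr and timer0_init are chosen here for illustration:

  #include <reg51.h>                      /* Keil C51 SFR definitions */

  volatile unsigned int ms_ticks = 0;     /* incremented once per millisecond */

  void timer0_isr(void) interrupt 1       /* Timer 0 overflow interrupt */
  {
      TH0 = 0xFC;                         /* reload 65536 - 1000 = 0xFC18,     */
      TL0 = 0x18;                         /* so the next overflow is 1 ms away */
      ms_ticks++;
  }

  void timer0_init(void)
  {
      TMOD = (TMOD & 0xF0) | 0x01;        /* Timer 0, mode 1 (16-bit)   */
      TH0  = 0xFC;                        /* first 1 ms period          */
      TL0  = 0x18;
      ET0  = 1;                           /* enable Timer 0 interrupt   */
      EA   = 1;                           /* enable interrupts globally */
      TR0  = 1;                           /* start the timer            */
  }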

  Yes, the timer is the most accurate option. However, the number of timers is limited, and one may not always be free. For example, temperature measurement (especially when the value changes quickly and by a large amount) may keep one timer busy sampling continuously; if the other timer already has a task assigned, you can only fall back on other delay methods.

  An empty loop will do.

  For example, while(i--); — the length of the delay depends on the starting value of i.

  However, a delay written in C is not very accurate. You have to work out the time from the disassembly, counting the assembly instructions and their instruction cycles.
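  A minimal sketch of that approach; delay_loop is a name chosen here for illustration, and the time per count must be calibrated from the disassembly (or by measurement) for your particular compiler and clock:

  void delay_loop(unsigned int i)
  {
      while (i--);          /* busy-wait: a larger i gives a longer delay */
  }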

  You can also write:

  void mDelay(unsigned int Delay)       /* Delay = 1000 gives a delay of roughly 1 s */
  {
      unsigned int i;
      for (; Delay > 0; Delay--)
      {
          for (i = 0; i < 124; i++)     /* inner loop tuned for about 1 ms per outer pass */
          {
              ;
          }
      }
  }
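  For example, under the calibration stated in the comment (Delay = 1000 giving roughly one second on the original setup), mDelay(500); would pause for about half a second. The inner limit of 124 may need re-tuning for a different crystal frequency or compiler.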


Origin blog.csdn.net/hdxx2022/article/details/129813308