The difference between signed and unsigned in C

  Reference blog: https://blog.csdn.net/wen381951203/article/details/79922220

signed and unsigned are qualifiers for the integer types (including char), and have been supported since the ANSI C89 standard.

signed denotes a signed type, unsigned an unsigned type. The maximum value of a signed type is roughly half that of the corresponding unsigned type, because the highest bit is used to represent the sign.
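As a quick illustration (a minimal sketch in standard C, assuming the near-universal two's complement representation), the same 8-bit pattern 11111111 reads as -1 when interpreted as signed and as 255 when interpreted as unsigned:

    #include <stdio.h>

    int main(void) {
        signed char   s = -1;   /* bit pattern 11111111 (two's complement) */
        unsigned char u = 255;  /* the same bit pattern 11111111 */
        printf("as signed:   %d\n", s);            /* prints -1  */
        printf("as unsigned: %u\n", (unsigned)u);  /* prints 255 */
        return 0;
    }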

By default, int, short, long, and long long are signed: int is equivalent to signed int, short to signed short, long to signed long, and long long to signed long long. Plain char, however, may be either signed char or unsigned char, depending on the implementation (the compiler).
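Whether plain char is signed on a given implementation can be checked with the macros from <limits.h>; a minimal sketch:

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* CHAR_MIN is 0 exactly when plain char is unsigned */
        if (CHAR_MIN < 0)
            printf("plain char is signed here\n");
        else
            printf("plain char is unsigned here\n");
        return 0;
    }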

The ranges are as follows:
signed char: [-2^7, 2^7), i.e. [-128, 128)
unsigned char: [0, 2^8), i.e. [0, 256)
signed n-bit integer: [-2^(n-1), 2^(n-1))
unsigned n-bit integer: [0, 2^n)
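These bounds can be confirmed with the limits exposed by <limits.h> (a sketch; the int values shown depend on the platform's type widths):

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        printf("signed char:   [%d, %d]\n", SCHAR_MIN, SCHAR_MAX); /* [-128, 127] */
        printf("unsigned char: [0, %u]\n", (unsigned)UCHAR_MAX);   /* [0, 255]    */
        printf("int:           [%d, %d]\n", INT_MIN, INT_MAX);
        printf("unsigned int:  [0, %u]\n", UINT_MAX);
        return 0;
    }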
Note that the standard does not fix the storage size of each integer type; it only guarantees sizeof(short) <= sizeof(int) <= sizeof(long). On a typical 32-bit platform, int and long are 32 bits, short is 16 bits, and long long is 64 bits.
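The actual widths on a given platform are easy to inspect with sizeof (a sketch; the printed values vary by platform and compiler):

    #include <stdio.h>

    int main(void) {
        /* %zu is the C99 format specifier for size_t */
        printf("sizeof(short)     = %zu\n", sizeof(short));
        printf("sizeof(int)       = %zu\n", sizeof(int));
        printf("sizeof(long)      = %zu\n", sizeof(long));
        printf("sizeof(long long) = %zu\n", sizeof(long long));
        return 0;
    }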

