Java rookie supply station --- the difference between VARCHAR and CHAR

Commonality

In a database, both varchar and char are used to store string data, and when creating a table you must explicitly specify a length for these two types!
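As a minimal sketch (assuming MySQL; the table and column names are made up for illustration), both types are declared with an explicit length at table-creation time:

```sql
-- Hypothetical demo table: both string columns declare a maximum length.
CREATE TABLE user_demo (
    id       INT PRIMARY KEY,
    username VARCHAR(8),  -- variable-length, holds up to 8 characters
    gender   CHAR(1)      -- fixed-length, always exactly 1 character
);
```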

The difference

Varchar is variable-length, that is: the length of the stored string varies with the value. For example, with a column defined as varchar(8), actually storing Frank occupies only 5 characters, not the declared 8! To make this work, the database by default uses one extra byte to record how many characters are actually stored; taking the stored value Frank as an example, that extra byte records the value 5. Since 1 byte has only 8 binary digits, the largest value it can represent is 1111 1111, which is 255 in decimal. Therefore, with a single length byte, varchar can record at most 255 by default. When the column's declared maximum exceeds 255 (in MySQL this threshold is actually measured in bytes, so multi-byte character sets reach it sooner), the database automatically uses 2 bytes to record the actual stored length, and at most 2 bytes are ever used for this record!
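A quick way to see this (a sketch assuming MySQL; varchar_demo is a hypothetical table) is to store Frank in a varchar(8) column and check its length: no padding is added, and the 1- or 2-byte length prefix is managed internally by the storage engine.

```sql
-- Hypothetical demo: VARCHAR stores only the characters actually present.
CREATE TABLE varchar_demo (name VARCHAR(8));
INSERT INTO varchar_demo (name) VALUES ('Frank');

-- Reports 5, not 8: the length prefix is internal and no spaces are added.
SELECT name, CHAR_LENGTH(name) AS stored_chars FROM varchar_demo;
```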

Char is fixed-length, that is: the length of the stored string is always the declared length. For example, with a column defined as char(8), actually storing Frank (5 characters) causes the database to automatically pad the value with 3 trailing spaces, so the stored data is always exactly 8 characters long!
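The sketch below (assuming MySQL; char_demo is a hypothetical table) illustrates the padding. Note that MySQL strips the trailing pad spaces when a CHAR value is read back, so the query reports 5 unless the PAD_CHAR_TO_FULL_LENGTH SQL mode is enabled; the value still occupies 8 characters in the row.

```sql
-- Hypothetical demo: CHAR(8) right-pads 'Frank' with 3 spaces when stored.
CREATE TABLE char_demo (name CHAR(8));
INSERT INTO char_demo (name) VALUES ('Frank');

-- By default trailing pad spaces are stripped on retrieval, so this shows 5;
-- with SET sql_mode = 'PAD_CHAR_TO_FULL_LENGTH'; it would show 8.
SELECT name, CHAR_LENGTH(name) AS retrieved_chars FROM char_demo;
```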

Summary

When the stored data always has the same length, use char, for example ID numbers or postal codes; otherwise, use varchar.
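As an illustration (a hypothetical schema, assuming an 18-character national ID format and a 6-digit postal code), the fixed-length fields use char and the variable-length text uses varchar:

```sql
-- Hypothetical schema applying the guideline above.
CREATE TABLE customer (
    id          INT PRIMARY KEY,
    id_card_no  CHAR(18),     -- ID number: always 18 characters
    postal_code CHAR(6),      -- postal code: always 6 characters
    full_name   VARCHAR(50),  -- name length varies
    address     VARCHAR(200)  -- address length varies widely
);
```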

Origin blog.csdn.net/c202003/article/details/107217218