The relationship between CPU, memory, and cache in a computer

CPU (Central Processing Unit)

Memory (Random Access Memory, RAM)

Cache

The CPU, memory, and cache are closely related; together they form the core of a computer system. Both the CPU and memory have rated speeds and bandwidths, so when configuring a computer, the memory should be matched to the speed and bandwidth of the CPU. This match directly affects the performance of the whole machine: if it is done poorly, the performance of either the CPU or the memory goes to waste.

1. The CPU is the computer's main processor. It executes instructions and performs arithmetic, logic, and control operations; as the "brain" of the computer, it is responsible for processing data and carrying out tasks.

2. Memory is a storage device in a computer, which is used to temporarily store and read data. The CPU reads instructions and data from memory and writes calculation results back to memory. The size of the memory determines the amount of data that the computer can process at the same time.

(1) The memory temporarily holds the data the CPU is working on and the data exchanged with external storage such as the hard disk. It acts as a bridge between external storage and the CPU, and its performance strongly influences the overall speed of the computer.


(2) A buffer is a portion of memory: a certain amount of space is reserved to temporarily hold the data of I/O operations such as input and output, and this reserved space is called the buffer. A buffer has a fixed size. To achieve the best disk efficiency, data that needs to be written to disk can first be collected in the buffer and only actually written out when the buffer is full, which reduces the number of disk I/O operations.
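The write-coalescing idea above can be sketched in a few lines of Python. This is an illustrative toy, not a real file API: the class name, the `disk` list standing in for the disk file, and the `flush_count` counter are all invented for the example.

```python
# Sketch of write buffering: batch small writes in memory and flush only when
# the buffer fills, so many logical writes become few physical disk I/Os.
class BufferedFile:
    def __init__(self, capacity=4):
        self.capacity = capacity   # max items held before a flush
        self.buffer = []           # the reserved in-memory staging area
        self.disk = []             # stands in for the actual disk file
        self.flush_count = 0       # number of simulated disk I/O operations

    def write(self, data):
        self.buffer.append(data)
        if len(self.buffer) >= self.capacity:
            self.flush()

    def flush(self):
        if self.buffer:
            self.disk.extend(self.buffer)  # one disk I/O covers many writes
            self.buffer.clear()
            self.flush_count += 1

f = BufferedFile(capacity=4)
for i in range(10):
    f.write(i)
f.flush()  # flush the remainder, as a real close() would
print(f.flush_count)  # → 3 (three flushes instead of ten individual writes)
```

Ten writes cost only three disk operations here, which is exactly why buffering pays off when each physical write is slow.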

3. The cache is a small, high-speed memory located between the CPU and main memory. Its purpose is to speed up the CPU's access to data: because memory is much slower than the CPU, the cache reduces the time the CPU spends waiting for data. The cache is divided into multiple levels, usually a first-level cache (L1 Cache), a second-level cache (L2 Cache), and a third-level cache (L3 Cache), ordered from the smallest and fastest (L1) to the largest and slowest (L3).

(1) Memory cache (from memory cache): compiled and parsed files are stored directly in the process's own memory, occupying some of the process's memory so that they can be read quickly the next time they are needed. Once the process exits, its memory is released and the cache is gone.

(2) Disk cache (from disk cache): the cache is written to a file on the hard disk. Reading it requires performing I/O on that file and then re-parsing the cached content, so it is more complex to read and slower than a memory cache.

Q: Why introduce a buffer?
Because of the speed mismatch between fast and slow devices, the fast device would otherwise spend time waiting for the slow one, so a buffer is placed between the two.
Q: What is the main difference between cache and buffer?
The core function of a buffer is to smooth out and absorb bursts; the core function of a cache is to speed up access. Simply put, a buffer is mostly about writing, while a cache is mostly about reading.

1. Capacity and speed
Capacity: network storage (cloud storage) > hard disk > memory > cache > register
Speed: register > cache > memory > hard disk > network storage (cloud storage)
In theory, an excessive amount of physical memory can slow things down slightly because it increases addressing overhead, though in practice more memory rarely hurts.
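The two orderings above can be made concrete with rough latency figures. These numbers are illustrative order-of-magnitude ballpark values, not measurements of any particular machine:

```python
# Illustrative order-of-magnitude access latencies for the storage hierarchy.
# Ballpark figures only; real values vary widely by hardware generation.
latency_ns = {
    "register":        0.3,          # within a single CPU cycle
    "cache (L1)":      1,
    "memory (RAM)":    100,
    "hard disk":       10_000_000,   # ~10 ms for a mechanical seek
    "network storage": 100_000_000,  # ~100 ms round trip, highly variable
}

# Printed fastest to slowest, matching:
# register > cache > memory > hard disk > network storage
for name, ns in sorted(latency_ns.items(), key=lambda kv: kv[1]):
    print(f"{name:16s} ~{ns:>13,.1f} ns")
```

Note that each step down the hierarchy is roughly a factor of 100 or more slower, which is why the speed ordering is the exact reverse of the capacity ordering.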
2. The cache exists to bridge the speed gap between the CPU and memory

The CPU accesses data very quickly: a CPU with a 1 GHz clock can process on the order of a billion instructions per second, while memory is far slower, with even fast memory of the time transferring only tens of megabytes per second. The speed gap between the two is enormous.
The data and instructions the CPU accesses most frequently are copied from memory into the CPU's cache, so the CPU usually only needs to fetch them from the cache, which is much faster than memory.
It should be pointed out here that:
1. Because the cache holds only a copy of a small part of the data in memory, the CPU will sometimes not find the data it wants in the cache (it has not yet been copied there from memory). In that case the CPU still goes to memory for the data, which slows the system down, but the CPU also copies that data into the cache so that the next access does not have to go to memory.
2. The set of most frequently accessed data changes over time: data that was rarely used a moment ago may now be needed frequently, and data that was the most frequently used may no longer be. The data in the cache must therefore be replaced regularly according to some algorithm, so that what remains in the cache is what is accessed most often.
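One common choice for the "certain algorithm" mentioned above is LRU (least recently used) replacement: when the cache is full, evict the entry that has gone unused the longest. A minimal sketch, using Python's `OrderedDict` to track recency:

```python
# Minimal LRU cache: on a hit, mark the entry most-recently-used; on a miss,
# fetch from "memory", insert, and evict the least-recently-used entry if full.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()
        self.hits = self.misses = 0

    def get(self, key, load_from_memory):
        if key in self.data:
            self.hits += 1
            self.data.move_to_end(key)     # mark as most recently used
            return self.data[key]
        self.misses += 1
        value = load_from_memory(key)      # slow path: fetch from "memory"
        self.data[key] = value             # copy into the cache for next time
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
        return value

memory = {k: k * 10 for k in range(100)}
cache = LRUCache(capacity=2)
for k in [1, 2, 1, 3, 1, 2]:              # access pattern shifts over time
    cache.get(k, memory.__getitem__)
print(cache.hits, cache.misses)           # → 2 4
```

Even this tiny trace shows the effect described above: key 1 stays hot and keeps hitting, while keys 2 and 3 displace each other as the working set shifts.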

3. About the first-level and second-level caches

Comparing RAM and ROM: RAM is volatile storage, meaning its contents disappear when power is removed, while ROM retains its contents without power.
There are two types of RAM: static RAM (SRAM) and dynamic RAM (DRAM). SRAM is much faster than DRAM, and the main memory we use today is generally DRAM. Caches are usually built from SRAM, which is very fast, but SRAM has low density (storing the same data, SRAM takes roughly six times the space of DRAM) and is expensive (SRAM of the same capacity costs roughly four times as much as DRAM). To improve system performance, however, the cache still has to be enlarged, so a compromise emerged: instead of enlarging the original SRAM cache, add some high-speed DRAM as an additional cache. This high-speed DRAM is faster than ordinary DRAM but slower than the original SRAM cache. The original SRAM cache is called the first-level cache, and the DRAM added later is called the second-level cache.
The contents of the first-level and second-level caches are copies (mappings) of frequently accessed data in memory; they exist to reduce the fast CPU's accesses to slow memory.
The CPU usually looks for data or instructions in this order: first in the first-level cache, then in the second-level cache if not found there. Even more cache levels may appear in the future as the number of cores per CPU grows.
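The lookup order just described can be sketched directly. This is a simplified model (real caches work on fixed-size lines and use hardware tag matching, not dictionaries), and the promotion-on-miss policy shown is one common choice, not the only one:

```python
# Sketch of a two-level cache lookup: check L1, then L2, then main memory.
# A miss fills the faster levels on the way back so the next access is fast.
def lookup(address, l1, l2, memory):
    if address in l1:
        return l1[address], "L1 hit"
    if address in l2:
        l1[address] = l2[address]          # promote the entry into L1
        return l1[address], "L2 hit"
    value = memory[address]                # slowest path: main memory
    l2[address] = value                    # fill both cache levels
    l1[address] = value
    return value, "miss"

memory = {0x10: "data"}
l1, l2 = {}, {}
print(lookup(0x10, l1, l2, memory))  # first access goes all the way to memory
print(lookup(0x10, l1, l2, memory))  # second access is served from L1
```

The first call prints a miss and the second an L1 hit, mirroring the copy-on-miss behavior described in point 1 above.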

When the CPU needs to read data, it first checks whether the required data exists in the cache. If the data is in the cache (hits the cache), the CPU can access it immediately, which can greatly increase the speed of reading data. If the data is not in the cache (cache miss), the CPU has to read the data from memory, which causes a long delay.

The existence of the cache enables the CPU to use memory data more efficiently and reduce frequent access to memory. A larger cache can hold more data, increasing the hit rate, which further improves the computer's performance.
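The link between hit rate and performance described above is usually quantified as the average memory access time, AMAT = hit time + miss rate × miss penalty. A quick sketch with illustrative figures (1 ns for a cache hit, 100 ns for a memory access; real numbers vary by machine):

```python
# Average memory access time: what the CPU pays per access, on average,
# given the cache hit time, the hit rate, and the cost of going to memory.
def amat(hit_time_ns, hit_rate, miss_penalty_ns):
    return hit_time_ns + (1 - hit_rate) * miss_penalty_ns

print(amat(1, 0.90, 100))  # 90% hit rate → 11.0 ns on average
print(amat(1, 0.99, 100))  # 99% hit rate → about 2 ns on average
```

Note how going from a 90% to a 99% hit rate cuts the average access time by more than five times, which is why even a modest increase in cache size or hit rate pays off disproportionately.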

To sum up, the relationship between CPU, memory, and cache is this: the CPU is the computer's processor, memory is the device used to store data, and the cache is a high-speed memory between the CPU and memory that improves data read speed. Working together, they deliver efficient computer performance.


 


Origin blog.csdn.net/a694704123b/article/details/131403264