Basic concept of caching

  A cache bridges the speed gap between two or more devices that operate at different speeds: it sits between the faster device and the slower one and acts as an accelerator. For example, the CPU's L1 and L2 caches store the data the CPU has accessed most recently and most frequently, memory caches data the CPU frequently reads from the hard disk, hard disks carry caches of various sizes, and even a physical server's RAID card has its own cache. All of these exist to speed up the CPU's access to disk data: the CPU is so fast that the hard disk often cannot deliver the data it needs in time, so the CPU cache, memory, RAID-card cache, and disk cache each satisfy part of the CPU's data needs, and reading from cache lets the CPU work far more efficiently.

Cache classification

System cache: buffer / cache

buffer

  Also known as a write buffer, the buffer is generally used for write operations. Data is written to memory first and then to disk; the write buffer smooths out the speed mismatch between different media by letting data land first in the nearest, fastest place, which improves write speed. The CPU writes data into the buffer in memory and treats the write as complete; the kernel then flushes it to disk at a later time. This is why a sudden power loss on a server can lose the data still sitting in memory.
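The flush-later behavior described above can be sketched in Python: a write first lands in a userspace buffer, `flush()` pushes it into the kernel's page cache, and only `os.fsync()` asks the kernel to commit it to disk. The file path here is just a temporary example.

```python
import os
import tempfile

# Write goes to a buffer first; it is not necessarily on disk yet.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
f = open(path, "w")
f.write("hello")          # data sits in Python's userspace buffer

f.flush()                 # push the userspace buffer into the kernel page cache
os.fsync(f.fileno())      # ask the kernel to commit the page cache to disk
f.close()
```

Until the final `fsync`, a sudden power loss could discard the buffered data, which is exactly the risk noted above.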

cache

  The cache is also known as a read cache and is generally used for read operations. The CPU reads files from memory; if the data is not in memory, it is first loaded from the hard disk into memory and then read by the CPU. Data that needs to be read frequently is placed in the nearest cache area so that subsequent reads are fast.
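The idea of keeping frequently read data in the nearest, fastest place can be sketched with Python's `functools.lru_cache`; `slow_read` here is a hypothetical stand-in for an expensive disk read.

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def slow_read(block_id):
    # Stand-in for an expensive disk read.
    return f"data-{block_id}"

slow_read(1)   # first access: goes to "disk", result cached in memory
slow_read(1)   # second access: served straight from the cache
info = slow_read.cache_info()   # 1 hit, 1 miss
```

The second call never touches the slow path, which is the whole point of a read cache.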

cache properties

  1. Automatic expiry: cached data carries a validity period and is automatically deleted once that time is exceeded
  2. Forced expiry: the cache is expired by force; for example, when the origin site updates an image but the CDN copy is not updated, the cached image must be forcibly expired
  3. Hit rate: the proportion of reads that are served from the cache
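The three properties above can be illustrated together in a toy cache; the class and method names are illustrative, not from any real library.

```python
import time

class TTLCache:
    """Toy cache showing auto-expiry, forced expiry, and hit rate."""

    def __init__(self, ttl):
        self.ttl = ttl            # seconds until an entry auto-expires
        self.data = {}            # key -> (value, stored_at)
        self.hits = self.misses = 0

    def get(self, key):
        entry = self.data.get(key)
        if entry and time.time() - entry[1] < self.ttl:
            self.hits += 1
            return entry[0]
        self.data.pop(key, None)  # property 1: auto-expire stale entries
        self.misses += 1
        return None

    def set(self, key, value):
        self.data[key] = (value, time.time())

    def purge(self, key):
        # Property 2: forced expiry, e.g. the origin updated an image.
        self.data.pop(key, None)

    def hit_rate(self):
        # Property 3: hits as a fraction of all reads.
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

c = TTLCache(ttl=60)
c.set("img", b"v1")
c.get("img")      # hit
c.purge("img")    # forced expiry
c.get("img")      # miss -> hit rate is now 0.5
```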

CDN cache

  A CDN (Content Delivery Network) accelerates content delivery through nodes distributed across the whole network, using a global scheduling system to serve each user from a nearby node, which reduces access latency and improves service availability. First, a CDN reduces the origin data center's bandwidth, because many resources are returned to users directly from the CDN. Second, it solves interconnection problems between different carriers: it lets China Unicom users access a Unicom network and telecom users access a telecom network, accelerating user access. Third, it solves regional access problems by returning resources to users from the nearest node.

CDN service providers

Baidu CDN: https://cloud.baidu.com/product/cdn.html
Ali CDN: https://www.aliyun.com/product/cdn?spm=5176.8269123.416540.50.728y8n
Tencent CDN: https://www.qcloud.com/product/cdn

CDN process user requests

  Static content is pre-cached in advance to avoid a flood of requests going back to the origin, which would saturate the origin's network bandwidth and prevent data from being updated. In addition, CDN data is cached at different tiers depending on how hot it is: the most heavily accessed content lives in the edge nodes' memory, the next tier sits on SSD or SATA disks, and the tier after that is kept in cloud storage, balancing speed against cost.
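The tiered lookup described above can be sketched as a walk from the fastest tier to the slowest; the tier names and file names here are hypothetical placeholders.

```python
# Hypothetical tiers, ordered from fastest/most expensive to slowest/cheapest.
TIERS = [
    ("memory", {"hot.jpg": b"hot-bytes"}),
    ("ssd",    {"warm.jpg": b"warm-bytes"}),
    ("cloud",  {"cold.jpg": b"cold-bytes"}),
]

def lookup(name):
    """Return (tier, data); hotter content lives in the faster tiers."""
    for tier, store in TIERS:
        if name in store:
            return tier, store[name]
    return None, None  # full miss: the request must go back to the origin

lookup("hot.jpg")   # found in memory, the fastest tier
lookup("cold.jpg")  # found in cloud storage, the cheapest tier
```

Hot content is answered from memory, while rarely accessed content costs an extra hop down the tiers, which is the speed/cost trade-off the text describes.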

CDN advantages

  1. Pre-caching: static content is pre-cached at the edge, avoiding a flood of back-to-origin requests, and the hottest data is cached in the fastest place, memory, giving an accurate, high hit rate and fast access
  2. Accurate scheduling: users are scheduled to the nearest edge node
  3. Performance optimization: a dedicated CDN cache gives fast responses
  4. Security: helps resist ***
  5. Bandwidth savings: user requests are answered from the edge nodes, significantly reducing the origin site's bandwidth

The application layer caching

  Web services such as Nginx and PHP can cache responses to user requests to accelerate the application. In addition, interpreted languages such as PHP and Python are not run directly: the source is first compiled into bytecode, which the interpreter then translates into machine code for execution, so the bytecode itself is also cached. This is why, after new code goes live, you sometimes see the symptom that the bytecode has not been updated.
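Python's bytecode cache can be observed directly: the standard-library `py_compile` module writes a `.pyc` file into `__pycache__`. A small sketch, using a throwaway module in a temporary directory:

```python
import os
import py_compile
import tempfile

# Write a tiny module, then compile it to bytecode.
src = os.path.join(tempfile.mkdtemp(), "mod.py")
with open(src, "w") as f:
    f.write("X = 1\n")

pyc = py_compile.compile(src)   # returns the path of the cached bytecode file

# If mod.py later changed but the old .pyc were somehow reused, importers
# would still see X = 1 -- the "code updated but bytecode not" symptom above.
```

In normal operation Python invalidates stale `.pyc` files automatically; the stale-bytecode symptom appears when that invalidation is bypassed, e.g. with an opcode cache that is not flushed after a deploy.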

Other level cache

  The CPU level-1 cache (split into an L1 data cache and an L1 instruction cache), the level-2 cache, and the level-3 cache

Origin blog.51cto.com/12980155/2407560