Computer architecture
Von Neumann architecture.
Background: early computers were built for one fixed function; to add or remove functionality, the circuitry had to be redesigned and rebuilt. By storing the program in memory and using a general-purpose circuit design, a computer's flexibility can be greatly improved.
Concept: a computer architecture design in which both program instructions and data are stored in memory. In this sense, modern computers are essentially von Neumann machines.
Components: a memory, a control unit, an arithmetic unit, input devices, and output devices. The control unit and the arithmetic unit together form the CPU.
Data and programs can be fed into the computer; the memory holds programs, data, intermediate results, and the final results of computation over the long term; the machine provides arithmetic, logic, and data-transfer capabilities; and it can output results to the user as required.
Bottleneck: the irreconcilable speed gap between the CPU and memory. Most of the time the CPU is much faster than memory, so the CPU sits idle waiting for data, wasting compute power.
Structure of modern computers
Modern computer architecture is a refinement of the von Neumann design that mitigates the performance gap between the CPU and storage devices.
In the modern architecture, the CPU consists of the arithmetic unit and the control unit, mainly because more high-speed devices are available, such as main memory and CPU registers.
Levels of computer programming languages
Program translation and interpretation.
Both translation and interpretation convert a higher-level language toward machine language; in the end, it is always a low-level computer language that the machine executes.
Translation (compilation): program logic is written in a higher-level language and then compiled, generating a program in a relatively low-level language; the computer ultimately executes that low-level language (machine language). The compilation process produces a new program. Examples: C/C++, Golang.
Interpretation: program logic is written in a higher-level language; those high-level statements are then fed as input to a program (the interpreter) implemented in a lower-level language L0, which carries them out. No new program is generated: the interpreter reads one statement, and the computer executes that statement, one at a time. An interpreter written in L0 must be used to interpret the high-level program. Examples: Python, JavaScript, PHP.
Translation + interpretation: strictly speaking, Java and C# are compiled-then-interpreted languages. A Java program is first compiled by the compiler into JVM bytecode (.class files), which is then interpreted into machine code for the computer to execute.
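Python itself illustrates the two steps above: source code is first translated (compiled) into bytecode, which the Python virtual machine then interprets. A minimal sketch:

```python
import dis

# Source code as a plain string.
source = "result = (3 + 4) * 2"

# Step 1: translation -- compile() produces a new artifact, a code
# object holding bytecode (analogous to a compiler emitting a program).
bytecode = compile(source, "<example>", "exec")

# Step 2: interpretation -- the Python VM executes the bytecode
# one instruction at a time.
namespace = {}
exec(bytecode, namespace)
print(namespace["result"])  # 14

# dis.dis shows the low-level instructions the VM actually interprets.
dis.dis(bytecode)
```

This mirrors the Java/C# model on a small scale: a visible compile step, then execution of the resulting lower-level code.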
Levels of the computer system.
The computer is divided into seven levels. From bottom to top: the hardware logic level, the microprogram machine level, and the conventional machine level; these three are also called the actual machine. Above them are the operating system level, the assembly language level, the high-level language level, and the application level, together known as the virtual machine levels. Of these four, the first three are system software, and the application level is application software.
Hardware logic level. The domain of electronic engineering, built from logic circuits such as gates and flip-flops.
Microprogram machine level. Programmed in microinstructions; microinstructions compose microprograms, which are sent directly to the hardware for execution.
Conventional machine level. Programmed in the CPU's instruction set (machine instructions); the programming language is tied directly to the hardware, and different CPU architectures use different instruction sets.
A set of microinstructions makes up a microprogram, and one microprogram generally corresponds to a single machine instruction: machine instruction -> microprogram -> microinstructions.
Operating system level. Provides a simple user interface upward and issues instructions to the hardware downward, managing hardware resources; it is the adaptation layer between hardware and software.
Assembly language level. Programs written in assembly language are translated by an assembler into directly executable machine language.
High-level language level. Essentially, writing programs in languages designed for humans that ultimately make the machine run.
Application level. Programs designed for specific purposes, running on the computer.
Units in the computer
Units of capacity
At the physical level, a high voltage level represents 1 and a low level represents 0, so in principle the hardware knows only 0 and 1.
A single 0/1 is called a bit; one byte is 8 bits (1 Byte = 8 bits). Above that, units step up by factors of 1024: B -> KB -> MB -> GB -> TB -> PB -> EB.
1 GB = 1024^3 Bytes = 1024^3 * 8 bits. Note: 1024 = 2^10.
Hard-drive capacities, by contrast, are generally advertised using decimal (base-1000) units.
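The capacity arithmetic above can be checked directly; a quick sketch in Python:

```python
# Capacity units step up by factors of 1024 (2**10):
# B -> KB -> MB -> GB -> TB -> PB -> EB
KB = 1024
MB = 1024 * KB
GB = 1024 * MB

# 1 GB expressed in bytes and in bits (1 Byte = 8 bits).
bytes_in_1gb = GB       # 1024**3 = 1073741824
bits_in_1gb = GB * 8    # 1024**3 * 8

print(bytes_in_1gb)  # 1073741824
print(bits_in_1gb)   # 8589934592
```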
Units of speed
Network speed: the 100M in "100M broadband" does not mean 100 MB but 100 Mbps, i.e. 100 Mbit/s; converted to bytes per second, that is (100/8) MB/s = 12.5 MB/s.
CPU speed: CPU clock frequency is measured in hertz (Hz), the number of cycles per second; a frequency of 15 Hz means 15 cycles per second. 2 GHz = 2 * 1000^3 Hz = 2 billion cycles per second (note that frequency prefixes are decimal, base 1000, unlike byte capacities).
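Both conversions above are simple arithmetic; a quick check in Python:

```python
# Network bandwidth is quoted in bits per second, so divide by 8
# to get bytes per second: 100 Mbps -> 12.5 MB/s.
mbps = 100
mb_per_s = mbps / 8
print(mb_per_s)  # 12.5

# Frequency prefixes are decimal (base 1000), unlike byte capacities:
# 2 GHz = 2 * 1000**3 cycles per second.
ghz = 2
cycles_per_second = ghz * 1000 ** 3
print(cycles_per_second)  # 2000000000
```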
Computer character sets and encodings
ASCII: 7 bits suffice to represent all its characters, 2^7 = 128 characters in total; 95 are printable and 33 are non-printable control characters.
Extended ASCII: extends ASCII from 7 bits to 8 bits, for a total of 256 characters.
GB2312: 6763 Chinese characters and 682 other symbols, 7445 characters in total.
GBK: backward compatible with GB2312 and aligned with international ISO standards; contains 21,003 Chinese characters in total and supports all CJK characters. GBK is the default Chinese encoding on Windows systems.
Unicode: a single universal character set. The UTF-* family (such as UTF-8) defines byte encodings for Unicode. UTF-8 is the recommended encoding when programming.
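A small Python sketch of these points: ASCII characters fit in one byte under UTF-8, while characters outside ASCII (such as Chinese) take more bytes.

```python
# 'A' is within ASCII's 2**7 = 128 code points, so UTF-8 encodes
# it in a single byte.
ascii_text = "A"
print(ord("A"))                           # 65, its Unicode code point
print(len(ascii_text.encode("utf-8")))    # 1 byte

# Chinese characters lie outside ASCII; under UTF-8 each of these
# takes 3 bytes ("中文" is just an example string meaning "Chinese").
chinese_text = "中文"
print(len(chinese_text.encode("utf-8")))  # 6 bytes (3 per character)
```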
Today's content:
After four months of study, an initial learning framework has taken shape (what to learn, how to learn it, when to learn it, and to what degree).
Reviewed some important points of computer organization principles.
2019-10-23