To give software engineering technologies a clearer structure, this document first reviews the key milestones of software engineering from an overall perspective, and then expands on each in turn.
A brief history of the development of software engineering
1.【1930~1946】 The founding era of computer theory
Before the emergence of computers, many mathematicians had already laid a solid theoretical foundation for computer software. The most famous is Alan Turing, who formulated the "Turing machine" in the 1930s, theoretically settling two core problems of computer software: computability and the representation of algorithms.

The universal Turing machine shows us a process: a program and its input are first written onto the storage tape; the machine then executes the program step by step until it produces a result, which is also written back onto the tape.

Turing machine
Turing machine theory is the most important core theory of computing:

- It establishes a general theory of computation, affirming that a computer could actually be built, and suggests the main architecture such a machine should have.
- It introduces the concepts of reading and writing, of algorithms, and of programming languages, a major break from earlier designs of calculating machines.
- It shows that the ultimate computing power of any computer equals that of the universal Turing machine; many problems can therefore be analyzed against this simple model.

In the Turing machine we can already vaguely see the main components of a modern computer (in fact, the main components of the von Neumann architecture): memory (the storage tape), a central processing unit (the controller and its states, whose alphabet can consist of just the two symbols 0 and 1), and an I/O system (the pre-loading of the tape).
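The stored-program process described above can be sketched as a minimal Turing-machine simulator (a toy illustration for this article: the transition-table encoding and the `INCREMENT` machine below are invented for the example, not Turing's original formalism):

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Run a one-tape Turing machine until it halts or max_steps is reached.

    transitions maps (state, symbol) -> (new_state, symbol_to_write, head_move),
    where head_move is -1 (left), +1 (right), or 0. State "halt" stops the run.
    """
    cells = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: binary increment. "start" scans right to the end of the
# number; "carry" then adds 1 moving left (1 -> 0 with carry, 0 or blank -> 1).
INCREMENT = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("halt", "1", 0),
    ("carry", "_"): ("halt", "1", 0),
}

print(run_turing_machine(INCREMENT, "1011"))  # prints 1100 (binary 11 + 1 = 12)
```

The program (the transition table) and its input both sit on the "tape" before the run starts, exactly the stored-program idea described above.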
If Alan Turing laid the theoretical foundation of computers, then John von Neumann turned Turing's theory into actual physical machines, becoming the founder of computer architecture and earning the title "Father of the Computer".

Von Neumann and the von Neumann machine

In the von Neumann machine, an arithmetic logic unit (ALU), a control unit, and temporary registers together form the central processing unit (CPU). The CPU is connected to a memory unit holding all the data it will process and manipulate, and to input and output devices for feeding in data and retrieving the results of running programs.

Von Neumann proposed this architecture in 1945, and to this day it is essentially how most general-purpose computers operate, with little change.
2.【1946~1955】The stage when computers were born without the concept of “software”
On February 14, 1946, the University of Pennsylvania unveiled the world's first general-purpose electronic digital computer: the Electronic Numerical Integrator and Computer, or ENIAC for short. Its inventors, John Mauchly and J. Presper Eckert, were both Americans; the machine was built for the U.S. Army's Aberdeen Proving Ground to meet its need for ballistics calculations.

ENIAC

The invention of ENIAC marked the beginning of the information age.
Soon after, its successor EDVAC adopted binary representation and the von Neumann architecture, becoming a true stored-program electronic computer.

From then on a new profession emerged: the programmer. In the early days, limited by machine performance, programs mainly dealt with scientific computing problems, and early programmers were arguably closer to "computers" than to "programmers" in today's sense. The mainstream programming language of the time was assembly, a low-level, machine-oriented language.
Punched paper tape and assembly language

Machine language is often called the "first-generation language"; the assembly language that followed is the "second-generation language".

Assembly language is essentially machine language: the only difference is that mnemonic symbols replace the raw sequences of 0s and 1s, purely to make programs easier for humans to read and remember.
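The point that assembly is only a mnemonic layer over machine code can be made concrete with a toy assembler (the mnemonics and opcode bytes below are invented for illustration and do not belong to any real instruction set):

```python
# Toy instruction set (opcodes invented for this example): each mnemonic maps
# to one opcode byte, optionally followed by a single operand byte.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    """Translate mnemonic lines like 'LOAD 10' into raw machine bytes."""
    program = bytearray()
    for line in lines:
        mnemonic, *operand = line.split()
        program.append(OPCODES[mnemonic])
        if operand:
            program.append(int(operand[0]))
    return bytes(program)

code = assemble(["LOAD 10", "ADD 32", "STORE 7", "HALT"])
print(code.hex())  # prints 010a02200307ff
```

The human-readable source and the hex dump carry exactly the same program; the assembler's only job is the symbol-to-byte substitution.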
3.【1955~1970】The rise of high-level programming languages and the outbreak of the software crisis
As computer hardware performance improved, high-level programming languages began to rise. Fortran was the world's first high-level programming language. Its appearance changed the traditional way humans interacted with computers, freeing people from tedious low-level labor so they could devote most of their energy to higher-level thinking; it was therefore epoch-making, and many later languages were influenced by it.

High-level language: Fortran

At this point the concept of "software" also began to appear. But programming at the time was done haphazardly, and entire programs looked as disorganized as a bowl of spaghetti.
- In the earliest days, most software engineers were trained mathematicians and electrical engineers; hardware was typically used to run a single program written for one specific purpose.
- Even once general-purpose hardware became common, software remained of limited generality: most of it was developed by the individuals or organizations who used it, and it often carried a strong personal stamp.
- There was no systematic development method to follow; design was an invisible process completed in someone's head, and apart from the source code there were usually no manuals or other documents.
As software systems grew larger and larger, product quality got worse, productivity fell, and maintenance became ever more difficult, culminating in the "software crisis".

The term "software crisis" was first proposed in 1968, at an international academic conference held in the Federal Republic of Germany under NATO sponsorship.
In summary, the software crisis covered two aspects:
- How to develop software to meet growing and increasingly complex needs;
- How to maintain an ever-expanding number of software products.
4.【1970~1990】The birth of software engineering
In 1968, NATO's Science Committee convened nearly 50 first-rate programmers, computer scientists, and industry leaders to discuss countermeasures for escaping the "software crisis". At that meeting the concept of software engineering was first proposed: the discipline of developing and maintaining software using systematic, standardized, quantifiable engineering principles and methods.

Software engineering covers two aspects: software development technology and software project management. Development technology includes development methodology, software tools, and the software engineering environment; project management includes software measurement, project estimation, schedule control, personnel organization, configuration management, project planning, and so on.
Since then, software engineering theory has flourished, and many software development models have been proposed: the waterfall model, incremental development, the spiral model, agile development, and so on.
waterfall model
incremental development
spiral model
Agile development
Software at this stage was stand-alone software developed in a procedure-oriented style.
5.【1990~1999】Object-oriented and the birth of the Web
The birth of object orientation is one of the most important milestones in the history of programming. It proposed a new way of designing and developing software; from then on, object-oriented analysis (OOA), object-oriented design (OOD), and object-oriented programming (OOP) became every software engineer's mantra. New methods and fields followed, such as object-oriented modeling languages (represented by UML), software reuse, and component-based software development. Correspondingly, software process management was proposed from the perspective of enterprise management.
object-oriented analysis
It is also worth noting that on August 6, 1991, the first website went live. Created by Tim Berners-Lee, it described the World Wide Web (W3) project and originally ran on a NeXT computer at CERN, the European Organization for Nuclear Research.
The first static page
The birth of this website opened the Web 1.0 era. At this time a web page was just a document, written and produced by page editors and graphic designers; it required no development by software engineers.
6.【1999~2005】The rise of Web software development (the era of full-stack development and monolithic architecture)
With the popularity of the Internet and improving network speeds, traditional stand-alone software gradually evolved into Web-based applications. Web pages changed from static to dynamic and began to require programmers.

In particular, three mainstream dynamic website technologies emerged: .php => PHP, .jsp => JSP (Java Web), .asp => ASP.

All of them compute and generate HTML pages dynamically on the server side, then send the entire page to the front end.
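The pattern all three technologies share (compute on the server, ship a complete HTML page) can be sketched in Python; this is a generic illustration of server-side rendering, not the actual PHP/JSP/ASP machinery, and the function and field names are invented for the example:

```python
import datetime
import html

def render_page(username, items):
    """Build the ENTIRE HTML document on the server, as dynamic pages of
    this era did; the browser receives finished markup, not data."""
    rows = "\n".join(f"  <li>{html.escape(item)}</li>" for item in items)
    return f"""<html>
<body>
<h1>Welcome, {html.escape(username)}</h1>
<ul>
{rows}
</ul>
<p>Generated at {datetime.datetime.now():%Y-%m-%d %H:%M}</p>
</body>
</html>"""

print(render_page("alice", ["first post", "second post"]))
```

Every request re-runs this logic and returns a whole new page, which is exactly why any change to the data meant a full page reload in this era.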
The main feature of this stage is that static pages gradually gave way to dynamic pages. Some of the better-known frameworks include:

Struts framework
Engineers at this stage were all full-stack development engineers: pages were generated by the server, and back-end service logic could be manipulated directly through scripts embedded in the page.

Architecturally, this stage was dominated by the monolithic architecture: the main calls in the system are in-process calls, and no inter-process communication occurs. Below is a common layered pattern for a monolithic architecture.
Monolithic architecture
layered architecture
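The layering above can be sketched as plain in-process function calls (all class and method names below are invented for illustration):

```python
# Each "layer" is an ordinary module or class; a request crosses layers via
# in-process calls only, never inter-process communication.

class UserRepository:                # data access layer
    def __init__(self):
        self._rows = {1: "alice", 2: "bob"}  # stand-in for a database
    def find(self, user_id):
        return self._rows.get(user_id)

class UserService:                   # business logic layer
    def __init__(self, repo):
        self.repo = repo
    def display_name(self, user_id):
        name = self.repo.find(user_id)
        return name.title() if name else "Unknown"

class UserController:                # presentation layer
    def __init__(self, service):
        self.service = service
    def handle(self, user_id):
        return f"<h1>{self.service.display_name(user_id)}</h1>"

# The whole call chain runs inside one process and deploys as one unit:
app = UserController(UserService(UserRepository()))
print(app.handle(1))  # prints <h1>Alice</h1>
```

The strength and the weakness are the same thing: everything is one deployable unit, so calls are cheap, but the whole application must be built, tested, and released together.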
7.【2005~2012】The Web 2.0 era (the rise of AJAX, front-end/back-end separation, the SOA era)
With the further spread of the Internet and improvements in infrastructure, Web applications shifted from simply browsing and retrieving information to user-generated content, and the Internet entered the Web 2.0 era. The rise of AJAX decoupled the presentation layer of Web software from its business logic layer, ultimately driving the separation of front end and back end. From then on, the Web full-stack engineer split into two professions: front-end engineers, who went deep into the field of visual interaction, and back-end engineers, who, relieved of the burden of the presentation layer, began to rethink server-side architecture. The idea of SOA (Service-Oriented Architecture) began to prevail.
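The separation can be sketched as a back end that returns only JSON and a front end that renders it (a toy model of the AJAX pattern, with names invented for the example; in reality the rendering step runs in the browser in JavaScript):

```python
import json

def api_get_user(user_id):
    """Back end after the split: return data only, no HTML."""
    users = {1: {"name": "alice", "role": "admin"}}  # stand-in for a database
    body = users.get(user_id) or {"error": "not found"}
    return json.dumps(body)

def render_client_side(json_body):
    """Front end: turn the JSON payload into markup, without a page reload."""
    data = json.loads(json_body)
    return f"<h1>{data.get('name', '?')}</h1>"

print(render_client_side(api_get_user(1)))  # prints <h1>alice</h1>
```

Contrast this with the earlier full-page server rendering: here the contract between the two professions is a data format (JSON), not a finished page.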
Main features of Web 1.0 ~ 3.0
Front-end architecture evolution
The front-end architecture has gone through a progression from monolithic, to front-end/back-end separation, to microservices, and finally to today's micro-frontends, as shown in the figure below.

The idea of the micro-frontend is to bring the microservice architecture to the front end. Its core is building end-to-end vertical slices organized around business domains, so that a single team can develop independently with considerable flexibility, and the delivered application can be composed on demand.
SOA (Service-Oriented Architecture) is a method of designing, developing, deploying, and managing discrete units of logic (services) in a computing environment. SOA is not new; it was proposed against the background of repeated construction and poor efficiency in enterprise IT systems. In the SOA model, all functions are defined as independent services, and all services are connected through an enterprise service bus (ESB) or a process manager. This loosely coupled structure lets existing heterogeneous systems be integrated at minimal cost. Of course, because it must adapt to all those heterogeneous systems (the ESB typically performs the protocol and data-format conversion), it introduces considerable complexity of its own.
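The "everything connects through the bus" idea can be modeled as a toy in-memory ESB (a deliberately simplified sketch with invented service names; a real ESB also handles protocol conversion, data-format translation, routing, and monitoring):

```python
class ServiceBus:
    """Toy ESB: services register by name; callers always go through the
    bus, never directly to each other."""
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        self._services[name] = handler

    def call(self, name, payload):
        if name not in self._services:
            raise KeyError(f"no service named {name!r}")
        # A real ESB would translate protocols and data formats here.
        return self._services[name](payload)

bus = ServiceBus()
# Two independent "services", possibly backed by heterogeneous systems:
bus.register("inventory", lambda p: {"sku": p["sku"], "in_stock": True})
bus.register("orders", lambda p: {"order_id": 42, **bus.call("inventory", p)})

print(bus.call("orders", {"sku": "A-100"}))
# prints {'order_id': 42, 'sku': 'A-100', 'in_stock': True}
```

The bus gives loose coupling (callers know only names, not endpoints), but it also shows where the complexity concentrates: every interaction funnels through one central component.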
Monolithic architecture
service-oriented architecture
8.【2012~2018】Mobile Internet Era (B/S→Multi-Terminal/S)
Entering the mobile Internet era, the B/S (browser/server) structure of Web software turned into multi-terminal/server: PC browsers, mobile browsers, Android, iOS, and mini-programs. The formerly one-to-one relationship between front end and back end became many-to-one, the "componentization" of the server side gained importance, and the microservice architecture was proposed and became popular.
SOA is an idea; the ESB is its centralized implementation, and microservices are its decentralized implementation.

The ESB has gradually faded from the mainstream because of its complexity, and microservices have become the norm. Below is an architecture diagram for a typical microservice scenario.
Microservice architecture
Panorama of the development and commercialization process of software engineering