[Cloud Native] Introduction to Docker

What is Docker

Question: why did Docker appear?

Suppose you are developing the Gulimall e-commerce project at Shang Silicon Valley. You develop on a laptop with a specific environment configuration, while other developers work in environments configured differently. The application you are building depends on your current configuration and on certain supporting files. In addition, your company maintains standardized test and production environments, each with its own configuration and set of supporting files. You want to simulate as many of these environments locally as possible without the overhead of recreating each server environment. So the question is:

How do you ensure that your application runs and passes quality checks in all of these environments, without the headaches of versioning, configuration, re-coding, and bug-fixing during deployment?

The answer is containers. The reason Docker has developed so rapidly is that it provides a standardized solution to exactly this problem: smooth system migration via container virtualization technology.

Environment configuration is troublesome: change machines and you must redo it all, which is laborious and time-consuming. Many people have wondered whether the problem can be solved at the root, by shipping the software together with its environment, so that installing the software reproduces the original environment exactly. This is what Docker offers: developers can use it to eliminate the "works on my machine" problem in collaborative development.

Before an application can run on a server, its environment must be set up by installing various software. The Shang Silicon Valley e-commerce project, for example, needs Java, RabbitMQ, MySQL, the JDBC driver package, and so on. Installing and configuring all of this is cumbersome enough, and it is not cross-platform: environments installed on Windows must be reinstalled on Linux. Even within the same operating system, porting an application to another server is still troublesome.

Traditionally, the output of software development and testing is a program, or bytecode that can be compiled and executed (Java is an example). For these programs to run smoothly, the development team must also prepare complete deployment documentation so that the operations team can deploy the application: every configuration file and every piece of software in the environment must be communicated clearly. Even so, deployment failures still occur frequently.

Docker breaks the old notion of "the program is the application". Through images, everything the application needs to run (excluding the operating system kernel) is packaged from the bottom up, so the application runs seamlessly across platforms.

The idea of Docker

Docker is an open source cloud project written in Go.

Docker's main goal is "Build, Ship and Run Any App, Anywhere": by managing the full life cycle of application components (packaging, distribution, deployment, operation), a user's app (whether a web application, a database application, or anything else) together with its runtime environment can achieve "image once, run everywhere".

The emergence of Linux container technology solved exactly this problem, and Docker was developed on top of it. The application is packaged into an image, the image becomes a running instance in a Docker container, and a Docker container behaves consistently on any operating system, achieving portability across platforms and servers. You configure the environment once; deploying on another machine becomes a one-click operation, which greatly simplifies the workflow.
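As a minimal sketch of what such an image definition looks like (the base image, file names, and port here are hypothetical examples, not taken from the original project), a Dockerfile for a Java application might be:

```dockerfile
# Minimal sketch: package a Java app and its runtime into one image
# (image name, jar name, and port are hypothetical)
FROM eclipse-temurin:17-jre
WORKDIR /app
# Copy in the application jar built beforehand
COPY target/demo-app.jar app.jar
EXPOSE 8080
CMD ["java", "-jar", "app.jar"]
```

Once built (for example with `docker build -t demo-app .`), the same image runs unchanged on any machine with a Docker engine: `docker run -p 8080:8080 demo-app`.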

In one sentence

Docker is a container virtualization technology: a software container that solves runtime environment and configuration problems, facilitates continuous integration, and enables releasing the application and its environment as a single unit.

Comparing containers and virtual machines 

A brief history of container development

Traditional virtual machine technology

A virtual machine is the traditional solution for shipping software together with its environment.

It runs one operating system inside another, for example CentOS 7 Linux inside Windows 10. The application is unaware of this, because the virtual machine looks exactly like a real system. To the host system, the virtual machine is just an ordinary file that can be deleted when no longer needed, with no effect on anything else. A virtual machine reproduces another system perfectly, keeping the relationships between the application, the operating system, and the hardware unchanged.

Disadvantages of virtual machines:

1. High resource usage
2. Many redundant steps
3. Slow startup

Container Virtualization Technology

Because of these shortcomings of virtual machines, Linux developed another virtualization technology:

Linux Containers (LXC)

A Linux container is a set of processes isolated from the rest of the system, run from an image that provides all the files needed to support those processes. Because the image contains all of the application's dependencies, it is portable and behaves consistently from development to testing to production.

Instead of emulating a complete operating system, Linux containers isolate processes. With containers, all the resources the software needs to run can be packaged into an isolated container. Unlike a virtual machine, a container does not bundle a full operating system, only the libraries and settings the software requires. The system therefore becomes efficient and lightweight, and software deployed this way runs consistently in any environment.

Comparison

Compare the differences between Docker and traditional virtualization:

* Traditional virtual machine technology virtualizes a set of hardware, runs a complete operating system on it, and then runs the required application processes on that system;

* The application processes in a container run directly on the host's kernel; the container has no kernel of its own and performs no hardware virtualization. Containers are therefore much lighter than traditional virtual machines;

* Containers are isolated from one another: each has its own file system, processes in different containers do not affect each other, and computing resources can be partitioned between them.

What can Docker do

Changes in technical roles

(backend work keeps getting harder)

Development/operations (DevOps): a new generation of engineers

Build Once, Run Anywhere

Faster application delivery and deployment

A traditionally developed application ships with a pile of installers and configuration documentation; after installation, complicated configuration must be performed according to that documentation before it runs normally. After Dockerization, only a small number of container images need to be delivered, and those images can be loaded and run directly in the production environment. Because installation and configuration are already built into the image, deployment, configuration, and test-and-verify time is greatly reduced.

Easier upgrade and expansion

With the rise of microservice architecture and Docker, many applications are structured as microservices, and building an application becomes like assembling Lego bricks: each Docker container is one "brick", and upgrading the application becomes very easy. When the existing containers can no longer keep up with the load, new containers can be started from the image for rapid scale-out, turning the scaling of an application system from a matter of days into minutes or even seconds.
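To sketch that "Lego brick" scaling idea (the service name and image are hypothetical, not from the original text), a Docker Compose definition for one such microservice might look like:

```yaml
# Minimal sketch; service name and image are hypothetical
services:
  web:
    image: demo-app:latest
    ports:
      - "8080"   # publish to a random host port so replicas do not collide
```

Scaling out then becomes a single command, for example `docker compose up -d --scale web=5`, which starts five containers from the same image in seconds.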

Simpler system operation and maintenance

Once an application is containerized, what runs in production can be kept highly consistent with what runs in the development and testing environments. The container fully encapsulates the application's environment and state, so inconsistencies in the underlying infrastructure or operating system no longer affect the application or introduce new bugs. When a program error does occur, it can be located and fixed quickly in an identical container in the test environment.

More efficient use of computing resources

Docker is kernel-level virtualization: unlike traditional virtualization technologies, it requires no additional hypervisor. Many container instances can therefore run on a single physical machine, greatly improving the server's CPU and memory utilization.

Docker application scenarios 

Which companies are using it

Sina

Meituan

Mogujie (Mushroom Street)

Qunar

* Docker official site: http://www.docker.com
* Docker Hub official site: https://hub.docker.com/



Origin blog.csdn.net/m0_62436868/article/details/127135309