[Docker] The road to advancement: (2) Introduction to Docker

What is Docker

Put simply, Docker is an application container engine that lets administrators manage containers conveniently. Docker is written in Go and released under the Apache 2.0 open source license.
Docker can package applications into container images. With Docker, developers can bundle an application and its dependencies into a lightweight, portable container and then run it on any Linux or Windows machine. In this way, Docker unifies the development, testing, and deployment environments and processes, greatly reducing operations and maintenance costs.
Docker relies entirely on a sandbox mechanism: containers are isolated from one another and expose no interfaces to each other.
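This packaging workflow is usually described in a Dockerfile. The sketch below is a minimal, hypothetical example assuming a Python web app with an `app.py` entry point and a `requirements.txt` dependency list:

```dockerfile
# Hypothetical example: package a small Python web app into an image.
FROM python:3.12-slim            # base image providing the language runtime
WORKDIR /app                     # working directory inside the container
COPY requirements.txt .          # copy the dependency list first for better layer caching
RUN pip install --no-cache-dir -r requirements.txt
COPY . .                         # copy the application code
EXPOSE 8000                      # document the port the app listens on
CMD ["python", "app.py"]         # process to run when the container starts
```

Running `docker build -t myapp .` against this file produces an image that can then be started anywhere with `docker run -p 8000:8000 myapp`.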

The origin and development history of Docker

In 2010, a few bearded young engineers founded a PaaS (Platform-as-a-Service) company in San Francisco and named it dotCloud. Although dotCloud raised some funding, it struggled as major vendors, including Microsoft, Google, and Amazon, entered the cloud computing market.
Fortunately, when God closes a door, he opens a window. In early 2013, dotCloud's engineers decided to open-source their core technology, Docker, which packages application code into Linux containers that can be migrated easily between servers.
To everyone's surprise, Docker became popular worldwide after it was open-sourced, so dotCloud renamed itself Docker and devoted itself entirely to Docker's development. In August 2014, Docker announced the sale of its PaaS business, dotCloud, to cloudControl, a PaaS provider based in Berlin, Germany. Since then, dotCloud and Docker have gone their separate ways.

Docker’s architecture and composition

Docker adopts a C/S (client/server) architecture. Administrators interact with the Docker server (the daemon) through the Docker client, and the server is responsible for building, running, and distributing Docker images. The client and server can be deployed on the same machine or on different machines, and the two communicate over a REST API, typically via a local UNIX socket or over the network.
The typical architecture of Docker is shown in the figure.
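To illustrate deployment on different machines: the Docker daemon can be configured to listen on a TCP socket in addition to the default local UNIX socket. A hedged sketch of `/etc/docker/daemon.json` on Linux follows (the exact file location, and how the `hosts` key interacts with systemd startup flags, vary by distribution):

```json
{
  "hosts": ["unix:///var/run/docker.sock", "tcp://0.0.0.0:2375"]
}
```

A client on another machine can then target this server with `DOCKER_HOST=tcp://<server-ip>:2375 docker ps`. Note that plain TCP is unauthenticated; production setups should enable TLS, conventionally on port 2376.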

Docker container ecosystem

Container core technology

Container core technologies are the technologies that enable containers to run on hosts. They include container specifications, container runtimes, container management tools, container definition tools, registries, and container operating systems, which are introduced below.

Container specifications

Docker is not the only container technology; others exist as well, such as CoreOS's rkt. To keep the container ecosystem healthy and ensure compatibility between different container implementations, several companies, including Docker, CoreOS, and Google, jointly founded the Open Container Initiative (OCI) to define open container specifications. OCI has published two specifications: the Runtime Specification (runtime-spec) and the Image Format Specification (image-spec). With these two specifications, containers built by different organizations and vendors can run on different runtimes, ensuring the portability and interoperability of containers.
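To give a feel for the image-spec, here is an abridged, hypothetical OCI image manifest; the digests are elided and the sizes are illustrative:

```json
{
  "schemaVersion": 2,
  "mediaType": "application/vnd.oci.image.manifest.v1+json",
  "config": {
    "mediaType": "application/vnd.oci.image.config.v1+json",
    "digest": "sha256:…",
    "size": 1470
  },
  "layers": [
    {
      "mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
      "digest": "sha256:…",
      "size": 2807406
    }
  ]
}
```

Any tool that understands image-spec can unpack these layers into a filesystem bundle, and any runtime-spec-compliant runtime (such as runc) can start the resulting container.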

Container platform technology

Container core technology enables containers to run on a single host, while container platform technology enables containers to run as a cluster in a distributed environment. Container platform technologies include container orchestration engines, container management platforms, and container-based PaaS, which are introduced below.

Container orchestration engine

Container-based applications generally adopt a microservice architecture. Under this architecture, applications are divided into different components, run in their own containers as services, and provide services to the outside world through APIs. In order to ensure high availability of the application, each component may run multiple identical containers. These containers will form a cluster, and the containers in the cluster will be dynamically created, migrated and destroyed according to business needs.
Such an application system based on microservice architecture is actually a dynamic and scalable system. This puts new requirements on our deployment environment. We need an efficient method to manage container clusters. This is the job of the container orchestration engine.
Orchestration usually covers container management, scheduling, cluster definition, service discovery, and more. Through a container orchestration engine, containers are combined into microservice applications that fulfill business requirements.
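As a concrete sketch, Kubernetes is one widely used orchestration engine. The hypothetical Deployment below asks the orchestrator to keep three identical containers of a service running, recreating them if they die; the names and image tag are illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web                  # hypothetical service name
spec:
  replicas: 3                # the orchestrator keeps 3 identical containers running
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: myorg/web:1.0   # hypothetical image
        ports:
        - containerPort: 8000
```

Applying this with `kubectl apply -f deployment.yaml` delegates the scheduling and recovery of the three replicas to the cluster.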

Why use Docker

Docker application scenarios

Docker provides lightweight virtualization services. Each Docker container can run an independent application. For example, users can run the Java application server Apache Tomcat in one container and the MySQL database server in another container.
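The Tomcat-plus-MySQL example can be sketched as a Docker Compose file; the image tags, password, and volume name below are illustrative, not prescriptive:

```yaml
services:
  web:
    image: tomcat:10            # Java application server in its own container
    ports:
      - "8080:8080"             # expose Tomcat's HTTP port on the host
    depends_on:
      - db
  db:
    image: mysql:8              # MySQL database server in a separate container
    environment:
      MYSQL_ROOT_PASSWORD: example   # illustrative only; use secrets in practice
    volumes:
      - db-data:/var/lib/mysql       # persist database files across restarts
volumes:
  db-data:
```

Running `docker compose up -d` then starts both containers, each isolated in its own sandbox but reachable over a Compose-managed network.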
At present, Docker has a wide range of application scenarios, mainly including the following.

  1. Simplify configuration
    This is the original purpose of Docker. Docker packages application code, runtime environment, and configuration. When deploying, users only need to use the image as a template to create a container. In fact, this achieves decoupling of the application environment and the underlying environment.
  2. Simplify the deployment process
    Docker allows developers to package an application and its dependencies into a portable container and then publish it to any popular Linux machine.
    Docker has changed the traditional approach to virtualization by letting developers put their applications directly into Docker for management. Convenience and speed are among Docker's biggest advantages: tasks that used to take days or even weeks can be completed in minutes with Docker containers.
  3. Save expenses
    The advent of the cloud computing era has also freed developers from having to buy expensive hardware to achieve performance goals. Docker has changed the mindset that high performance must mean high cost. Combining Docker with cloud computing not only solves the hardware management problem but also changes the way virtualization is done.

The widespread application of Docker has greatly reduced the operation and maintenance costs of IT facilities. Specifically, it is mainly reflected in the following aspects.

  • Lightweight virtualization. Compared with traditional server or host virtualization, Docker's virtualization is far more lightweight, which shortens deployment time and reduces the labor cost of application deployment.
  • Standardized application release. Docker containers bundle the runtime environment with the executable program and can be used across platforms and hosts.
  • Faster startup. A traditional virtual machine generally takes minutes to start, while a Docker container starts in seconds.
  • Lower storage costs. A virtual machine requires at least several GB of disk space, whereas Docker container images can be as small as a few MB.

Origin blog.csdn.net/sinat_36528886/article/details/134891241