One article to understand the past and present of Docker

Docker overview

Why did Docker appear?

In the process of developing a product, we need at least two environments: development and production. Nowadays more companies use three: development, testing, and production. Configuring and packaging for each environment is cumbersome, complicated, and highly repetitive: the same work has to be done several times. And I believe every developer has hit the classic problem: "it runs on my computer, so why does it break on everyone else's?" All of these problems are caused by inconsistent environments.

Publishing a project means shipping a jar package along with its whole environment (Redis, MySQL, JDK, Elasticsearch, ...). Traditionally the project cannot be packaged together with that environment: configuring an application environment on a server is very troublesome, and the result does not carry across platforms.

So Docker appeared to solve the above problems.

Traditional: developers build and package the jar, and operations staff configure the environment.
Now: development, packaging, deployment, and going live are completed in one workflow (a minimal sketch follows).
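To make "packaging the environment with the project" concrete, here is a minimal sketch of such a pipeline. The jar name app.jar, the image name myapp:1.0, and port 8080 are all illustrative assumptions, not details from the original article. First, a minimal Dockerfile:

    # Dockerfile: the base image provides the Java runtime, so the
    # environment travels with the application instead of being
    # configured by hand on every server
    FROM eclipse-temurin:17-jre
    WORKDIR /app
    # "target/app.jar" is an assumed build artifact path
    COPY target/app.jar app.jar
    EXPOSE 8080
    ENTRYPOINT ["java", "-jar", "app.jar"]

Then build once and run anywhere Docker is installed:

    docker build -t myapp:1.0 .
    docker run -d -p 8080:8080 myapp:1.0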

Look at the Docker logo: a whale carrying a stack of containers. The idea of Docker comes from shipping containers.
Isolation: the core idea of Docker is packing applications into boxes, with each box isolated from the others.

Through this isolation mechanism, Docker can squeeze a server's performance to the extreme.

The history of Docker

In 2010, a few young IT engineers founded a company in the United States called dotCloud, offering PaaS (Platform as a Service) cloud computing services. They named their containerization technology, built on LXC (Linux Containers), Docker. When Docker was first born, it attracted little attention from the industry!
So these young people decided to open-source the project, that is, to open up the source code.

In 2013, the Docker project was open-sourced. More and more people discovered Docker's advantages, and it naturally took off.
On June 9, 2014, Docker 1.0 was released.

What is Docker?

After talking about Docker's history for so long, let's get to what Docker really is.
Baidu Encyclopedia explains it like this: "Docker is an open-source application container engine that lets developers package their applications and dependencies into a portable image, which can then be published to any machine running a popular Linux or Windows operating system; it can also implement virtualization. Containers use a full sandbox mechanism and have no interfaces with each other."
When learning Docker, two addresses are worth knowing:
  • Official documentation: https://docs.docker.com/ (the content is very detailed)
  • Image repository (Docker Hub): https://hub.docker.com/ (publishing works much like git push; see the sketch below)
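To illustrate that git-push analogy, publishing an image to Docker Hub looks roughly like this (a sketch: "yourname" and "myapp:1.0" are made-up account and image names):

    # Pull a public image from Docker Hub (roughly analogous to git clone)
    docker pull nginx:latest

    # Publish your own image: tag it under your account, log in, push
    docker tag myapp:1.0 yourname/myapp:1.0
    docker login
    docker push yourname/myapp:1.0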

What can Docker do?

Previous virtual machine technology

Shortcomings:

  1. Takes up a lot of resources
  2. Many redundant steps
  3. Starts very slowly

Containerization technology

Container technology does not simulate a complete operating system.

Comparing Docker with virtual machine technology

  • A traditional virtual machine virtualizes a layer of hardware and runs a complete guest operating system, then installs and runs software on that system.
  • Applications in a container run directly on the host. A container has no kernel of its own and does not virtualize the hardware, so it is very lightweight (see the sketch after this list).
  • Containers are isolated from each other: each has its own file system, and they do not affect one another.
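A quick way to see the "no kernel of its own" point for yourself (a sketch; it assumes Docker is installed and can pull the public alpine image):

    # The kernel version reported inside a container matches the host's,
    # because containers share the host kernel instead of booting their own
    uname -r
    docker run --rm alpine uname -r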

Faster delivery and deployment of applications
Traditional: a pile of help documents and installers.
Docker: package an image, publish and test it, and run it with one command.

More convenient upgrades and scaling
After using Docker, deploying an application is like stacking building blocks. Package the project into an image once; to scale out beyond server A, just run the same image directly on server B (see the sketch below).

Simpler system operation and maintenance
After containerization, our development and testing environments are highly consistent.

More efficient use of computing resources
Docker is kernel-level virtualization: many container instances can run on a single physical machine, squeezing the server's performance to the extreme.
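As a sketch of that building-block scaling (the image name myapp:1.0 and the server names are assumptions), an image built on server A can be reused unchanged on server B, with or without a registry:

    # On server A: export the image to an archive
    docker save -o myapp.tar myapp:1.0
    scp myapp.tar serverB:/tmp/   # any file transfer works

    # On server B: load the archive and run the identical environment
    docker load -i /tmp/myapp.tar
    docker run -d -p 8080:8080 myapp:1.0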

Source: blog.csdn.net/zwb568/article/details/129189632