Docker usage scenarios

Below are some Docker usage scenarios I have collected to show how Docker can be used to create consistent environments with low overhead. The content comes from: "Eight real application scenarios of Docker".

1. Simplified configuration

This is the main use case promoted by Docker, Inc. The biggest advantage of virtual machines is that they can run platforms (software, operating systems) with different configurations on the same hardware. Docker provides similar functionality with much lower overhead: it lets you put the runtime environment and configuration into code and then deploy it. The same Docker configuration can be reused across different environments, which decouples hardware requirements from the application environment.

2. Code pipeline management

The previous scenario is very helpful for managing code pipelines. From the developer's machine to final deployment in production, code passes through many intermediate environments, each with its own slight differences. Docker provides a consistent environment for applications from development to release, which makes the code pipeline much simpler.

3. Improved development efficiency

This brings some additional benefits: Docker can make developers more productive. For a more detailed example, see Aater's talks at DevOpsDays Austin 2014 or at DockerCon. In any development environment, we want to do two things well. First, we want the development environment to be as close to production as possible; second, we want to build the development environment quickly. Ideally, achieving the first goal would mean running each service in its own virtual machine so that we can observe how each service behaves in production.
However, we don't want to need a network connection all the time, and connecting remotely after every recompile is particularly tedious. This is where Docker does an especially good job. Development machines usually have relatively little memory, and when we used virtual machines we often had to add memory to them; with Docker, dozens of services can easily be started on a single development machine.

4. Application isolation

There are many reasons you might choose to run different applications on one machine, such as the development-efficiency scenario mentioned above. Two common motivations are consolidating servers to reduce costs, and splitting a monolithic application into loosely coupled individual services (Translator's note: a microservice architecture). To understand why loosely coupled applications matter, see Steve Yegge's essay comparing Google and Amazon.

5. Server consolidation

Just as virtual machines are used to consolidate multiple applications, Docker's ability to isolate applications lets you consolidate multiple servers to reduce costs. Because there is no memory footprint from multiple operating systems, and unused memory can be shared among instances, Docker can provide denser server consolidation than virtual machines.

6. Debugging capabilities

Docker provides many tools that are not necessarily specific to containers but work well with them. These include the ability to checkpoint a container, assign versions, and view the differences between two containers, all of which help when tracking down bugs. You can find an example of this in the article "Docker Saves the World".

7. Multi-tenant environments

Another interesting use case for Docker is multi-tenant applications, where it avoids rewriting critical applications.
One particular example of this scenario from our own work is developing a fast, easy-to-use multi-tenant environment for IoT (Translator's note: Internet of Things) applications. The code base for such multi-tenant systems is very complex and difficult to work with, and re-architecting such an application is time-consuming and expensive. Using Docker, it is both simple and inexpensive to create an isolated environment for each tenant's instance of the application layer, thanks to the speed at which Docker environments start and its efficient diff commands. You can learn more about this scenario here.

8. Rapid deployment

Before virtual machines, introducing new hardware resources could take days; virtualization reduced this time to minutes. Docker reduces it further, to seconds: it just creates a container process without booting an operating system. This is a feature that both Google and Facebook value. You can create and destroy resources in the data center without worrying about the overhead of restarts. Typical data-center resource utilization is only about 30%; by using Docker and allocating resources effectively, utilization can be improved.
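The "runtime environment and configuration in code" idea that runs through scenarios 1, 2, and 8 can be sketched with a minimal Dockerfile. This is an illustrative example, not taken from the original article; the base image, the `requirements.txt` file, and `app.py` are all hypothetical stand-ins for a real application:

```
# Pin the runtime version so every environment is identical
FROM python:3.11-slim

WORKDIR /app

# Dependencies are installed during the build, not on the host machine
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# The container starts as a single process -- no OS boot, so startup takes seconds
CMD ["python", "app.py"]
```

An image built once with `docker build -t myapp .` can then be run unchanged on a developer laptop, a test server, or in production with `docker run myapp`, which is exactly the consistency that scenarios 1 and 2 describe.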
