How to build a secure and fast microservice architecture with Nginx

This article is adapted from a talk Chris Stetson gave at nginx.conf about microservices today and how to use Nginx to build a fast and secure network system. You can watch the talk on YouTube.

Introduction

Chris Stetson: Hi, my name is Chris Stetson, and I lead the professional services department at Nginx and also lead the microservices practice.

Today we are going to talk about microservices and how to build a fast and secure network system with Nginx. At the end of our talk, we will show you a demo of how to build a microservice very quickly and easily using the Fabric pattern with our partners at Zokets.

Before we discuss the Fabric pattern, I want to talk about microservices and what it means from an Nginx perspective.

0:56 - The Big Shift

Microservices have caused a major shift in application architecture.

When I first started building apps, they were all built in roughly the same way. The monolithic architecture shown on the slide symbolizes how those applications were structured.

There is some kind of virtual machine (VM), which for me was usually Java. The functional components of the application live inside that virtual machine as objects, communicating with each other in memory and making method calls back and forth. Occasionally you reach out to other systems, using mechanisms such as notifications, to fetch data or pass information along.

With microservices, the paradigm of how applications are built is completely different. Your functional components move from living in memory on the same host, communicating through the virtual machine, to being deployed in containers and connecting to each other over HTTP via RESTful API calls.

This is very powerful: it gives you functional isolation, finer-grained scalability, and better resilience to failures. Much of this comes from the simple fact that you are using HTTP for calls across the network.

Now, this approach also has some drawbacks.

An anecdote

I have a dark secret: I was a Microsoft employee and developed in .NET for many years. While I was there, I built their video distribution platform, called Showcase.

Showcase was the tool used to publish all the videos Microsoft produced internally to the web, so people could watch them and learn, for example, tips and tricks for using Microsoft Word. It was a very popular platform: we had a lot of users, and many of them commented on the videos we posted.

Showcase started out as a .NET monolith, and as it grew in popularity we decided it should be moved to an SOA architecture. The conversion was relatively easy: Visual Studio provides what is essentially a switch you can flip to turn your DLL calls into RESTful API calls, and with some small refactoring we got our code running fairly well. We also used a smart community service for the comments and in-app community features.

The tight-loop problem

It looked like SOA was doable for us, and everything worked fine in our first tests. It wasn't until we moved the system into our staging environment and started working with production data that we saw some serious issues. The issues appeared on pages with many comments.

It was a very popular platform, and some pages had as many as 2,000 comments. As we dug into the issue, we realized that these pages took a minute to render because the smart community service first had to populate the usernames, and then for each username it made a network call to the user database to fetch that user's details and populate them on the rendered page. This was very inefficient: it took a minute or two to render a page that had taken only 5 or 6 seconds when everything ran in memory.

Mitigation

After going through the process of finding and fixing these problems, we ended up tuning and optimizing the system: we batched requests, cached some of the data, and ultimately optimized the network to really improve performance.

So, what does this have to do with microservices? With microservices, you are basically taking an SOA architecture and putting it into hyperdrive. Whereas in a monolithic architecture all objects are contained and managed within a single virtual machine and communicate with each other in memory, microservices use HTTP to exchange data across the network.

When you get this right, you get great performance and linear scalability.

Nginx works well with microservices

Nginx is one of the best tools you can use to transition to microservices.

Some history on Nginx and microservices: we have been involved in the microservices movement from the beginning. We were one of the first applications available for download on Docker Hub, and our customers and end users, who run some of the largest microservices installations in the world, use Nginx extensively in their infrastructure.

The reason is that Nginx is small, fast, and reliable.

Nginx Microservice Reference Architecture

We have also been working with microservices inside Nginx for some time. This is a stylized view of the Nginx Microservices Reference Architecture we have built, which currently runs on AWS.

We have six core microservices, all running in Docker containers. We decided to build a polyglot application, so each container can run a different language; we currently use Ruby, Python, PHP, Java, and Node.js.

We built the system as a twelve-factor application, with a few modifications to make it work better for microservices and run on a platform like Heroku. Later, we'll show you the application actually running in the demo.

The value of the MRA

Why do we want to build such a reference microservice architecture?

We built this reference architecture because we needed to provide our customers with a blueprint for building microservices. We also wanted to test the capabilities of Nginx and Nginx Plus in a microservices context and figure out how to take better advantage of them. Finally, we wanted to make sure we had a solid understanding of the microservices ecosystem and what it can offer us.

The networking problems

Let's go back to the big shift we discussed.

When you move all the functional components of your application from running in memory, managed by a virtual machine, to working over the network and communicating with each other, you essentially introduce a set of problems you need to solve for the application to work effectively.

First, you need service discovery. Second, you need to load balance across all the different instances in your architecture. And third, you need to worry about performance and security.

For better or worse, these problems are inextricably linked and you have to make tradeoffs, and hopefully we have a solution that addresses all of them.

Let's take a deeper look at each issue.

Service discovery

Let's talk about service discovery. In a monolithic application, the app engine manages all the object relationships. You never have to worry about where one object is relative to another: you simply call a method, the virtual machine connects you to the object instance, and then disposes of it after the call completes.

With microservices, you need to think about where those services are located. Unfortunately, there is no single standard process for this. The various service registries you might use, whether Zookeeper, Consul, etcd, or something else, all work in different ways. In the process, you need to register your services, and you need to be able to read back where those services are so you can connect to them.
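As a rough sketch, DNS-based service discovery with Nginx Plus against a registry such as Consul can look like the following (the registry address and service name here are illustrative assumptions, not part of the reference architecture itself):

```nginx
# Query the registry's DNS interface and re-resolve every 10 seconds.
resolver 10.0.0.2:8600 valid=10s;

upstream user-manager {
    # A shared memory zone is required for dynamic re-resolution (Nginx Plus).
    zone user-manager 64k;

    # 'resolve' makes Nginx Plus asynchronously re-query the name,
    # picking up instances as they register and deregister.
    server user-manager.service.consul:80 resolve;
}
```

With this in place, instances that appear in or disappear from the registry are reflected in the upstream group without reloading the configuration.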

Load balancing

The second problem is load balancing. When you have multiple instances of a service, you want to be able to connect to them easily, distribute your requests among them efficiently, and execute those requests as fast as possible, so load balancing between the different instances is a very important problem.

Unfortunately, load balancing in its simplest form is very inefficient, and it becomes more complex and harder to manage as you adopt different, more sophisticated load-balancing schemes. Ideally, you want your developers to decide which load-balancing scheme to use based on the needs of their application. For example, if you are connecting to a stateful application, you need session persistence, which ensures that session information is preserved.
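To illustrate the idea of choosing a scheme per application, here is a minimal sketch in Nginx configuration (upstream names and addresses are made up for the example; `sticky` is an Nginx Plus feature):

```nginx
# A stateless API: spread requests by number of active connections.
upstream stateless-api {
    least_conn;
    server 10.0.1.10:8080;
    server 10.0.1.11:8080;
}

# A stateful app: session persistence via a sticky cookie,
# so the same client is routed back to the same instance.
upstream stateful-app {
    sticky cookie srv_id expires=1h;
    server 10.0.2.10:8080;
    server 10.0.2.11:8080;
}
```

Each upstream group can use whatever method fits that service, rather than one global scheme for the whole system.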

Secure and fast communication

Perhaps the most daunting areas of microservices are performance and security.

When running in memory, everything is fast. Running over the network is an order of magnitude slower.

Information that used to be safely contained within a single system, typically in binary format, is now transmitted over the network as text. It is now much easier to put a sniffer on the network and listen to all the data your application is moving around.

If you encrypt the data at the transport layer, you introduce significant overhead in connection rate and CPU usage. A full SSL/TLS implementation requires nine steps just to initiate a request. When your system needs to handle thousands, tens of thousands, hundreds of thousands, or millions of requests per day, this becomes a significant performance barrier.

A solution

We think the solutions we've developed at Nginx solve all of these problems, giving you robust service discovery, awesome user-configurable load balancing, and fast, secure encryption.

Network Architecture

Let's talk about the various ways you can install and configure your network architecture.

We propose three networking models. They are not mutually exclusive per se; rather, we see them as a progression. The three are Proxy mode, Router Mesh mode, and Fabric mode, which is the most complex and in many ways turns load balancing on its head.

Proxy mode

The Proxy pattern focuses entirely on the inbound traffic of your microservice application and effectively ignores internal communication.

You get all the HTTP traffic management benefits that Nginx provides. You can have SSL/TLS termination, traffic shaping, and security, and with the latest versions of Nginx Plus and ModSecurity, you can get WAF capabilities.

You can also cache: everything Nginx provides for your monolith can be added to your microservices system. And with Nginx Plus, you can implement service discovery: as your API instances come and go, Nginx Plus can dynamically add and remove them from the load-balancing pool.
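A minimal sketch of the Proxy mode pieces in Nginx configuration might look like this (hostnames, certificate paths, and addresses are illustrative assumptions):

```nginx
# Cache storage for responses from the microservices.
proxy_cache_path /var/cache/nginx keys_zone=api_cache:10m;

upstream backend-services {
    server 10.0.1.10:8080;
    server 10.0.1.11:8080;
}

server {
    # SSL/TLS termination at the front-end proxy.
    listen 443 ssl;
    ssl_certificate     /etc/nginx/ssl/app.crt;
    ssl_certificate_key /etc/nginx/ssl/app.key;

    location /api/ {
        # Cache inbound API responses; internal traffic stays untouched.
        proxy_cache api_cache;
        proxy_pass http://backend-services;
    }
}
```

Only inbound traffic passes through this proxy; service-to-service communication is left alone, which is exactly the Proxy mode trade-off.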

Router Mesh Mode

Router Mesh mode is similar to Proxy mode, in which we have a front-end proxy service to manage incoming traffic, but it also adds centralized load balancing between services.

Each service connects to a centralized Router Mesh, which manages the distribution of connections among the different services. The Router Mesh mode also lets you build in the circuit-breaker pattern, so you can add resiliency to your application and take steps to monitor failed service instances and pull them back.

Unfortunately, because this mode adds an extra hop, if you have to use SSL/TLS encryption it actually exacerbates the performance problem. That is why Fabric mode was introduced.

Fabric mode

Fabric mode is the mode that turns everything on its head.

Like the two modes before it, there is a proxy server out front managing incoming traffic, but what distinguishes it from Router Mesh mode is that you replace the centralized router with Nginx Plus running inside each container.

This Nginx Plus instance acts as a reverse and forward proxy for all HTTP traffic. With this system you get service discovery, robust load balancing and, most importantly, high-performance encrypted networking.

We'll explore how this works and how we approach it. First, let's look at the normal flow of how a service connects and dispatches a request.

The normal process

In this diagram, you can see that the Investment Manager needs to communicate with the User Manager to get information. The Investment Manager creates an HTTP client, which issues a DNS request to the service registry and gets back an IP address. It then initiates an SSL/TLS connection to the User Manager, going through the nine-step negotiation or "handshake" process. Once the data transfer is complete, the virtual machine closes the connection and garbage-collects the HTTP client.

The whole process is like that. This is fairly simple and easy to understand. When you break it down into these steps, you can see how the pattern actually completes the request and response process.

In Fabric mode, we've changed that.

Details of Fabric Patterns

The first thing you'll notice is that Nginx Plus runs inside every service, and the application code talks to Nginx Plus locally. Because these are local connections, you don't need to worry about encryption: they can be HTTP requests from Java or PHP code to the local Nginx Plus instance, all local HTTP requests inside the container.

You'll also notice that Nginx Plus manages the connection to the service registry: we have a resolver that asynchronously queries the registry's DNS instance for all of the User Manager instances and pre-establishes connections, so that when the Java service needs data from the User Manager, it can use a pre-established connection.
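A sketch of what the per-container Nginx Plus configuration might contain for this (the registry address, service name, and ports are illustrative assumptions):

```nginx
# Asynchronously re-resolve service instances from the registry's DNS.
resolver 10.0.0.2:8600 valid=5s;

upstream user-manager {
    zone user-manager 64k;
    # SRV-based resolution (Nginx Plus): instance addresses and ports
    # are picked up from the registry as they change.
    server user-manager.service.consul service=http resolve;
}

server {
    # Local, unencrypted listener for the application code in this container.
    listen 127.0.0.1:8080;

    location / {
        # The hop across the network to the peer service is encrypted.
        proxy_pass https://user-manager;
    }
}
```

The application only ever talks to `127.0.0.1`; Nginx Plus handles discovery and the encrypted network hop on its behalf.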

Persistent SSL/TLS connections

A stateful, persistent, and encrypted connection between microservices is a real benefit.

Remember in the first diagram how a service instance goes through the process of creating an HTTP client, negotiating an SSL/TLS connection, making a request, and closing the connection? Here, Nginx pre-establishes the connections between microservices and uses its keepalive feature to maintain persistent connections across calls, so you don't have to go through SSL/TLS negotiation on every request.

Essentially, we create a mini service-to-service VPN connection. In our initial tests, we saw a 77% increase in connection speed.
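The keepalive mechanism described above can be sketched in configuration like this (upstream name and address are illustrative assumptions):

```nginx
upstream user-manager {
    zone user-manager 64k;
    server user-manager.internal:443;
    # Keep up to 32 idle connections open to this upstream,
    # so the TCP + SSL/TLS handshake is not repeated per request.
    keepalive 32;
}

server {
    location / {
        proxy_pass https://user-manager;
        # Upstream keepalive requires HTTP/1.1 and a cleared
        # Connection header.
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        # Reuse TLS sessions when new connections are needed.
        proxy_ssl_session_reuse on;
    }
}
```

This is what makes the "mini VPN" behavior possible: the expensive handshake happens once per pooled connection rather than once per call.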

Circuit breakers

In Fabric mode, as in Router Mesh mode, you can also benefit from creating and using the circuit-breaker pattern.

Essentially, you define an active health check inside the service and set up a cache to retain data in case the service becomes unavailable, giving you full circuit-breaker functionality.
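The two circuit-breaker pieces mentioned here might be sketched like this (`health_check` is an Nginx Plus directive; names and addresses are illustrative assumptions):

```nginx
proxy_cache_path /var/cache/nginx keys_zone=cb_cache:10m;

upstream user-manager {
    zone user-manager 64k;
    server 10.0.3.10:8080;
    server 10.0.3.11:8080;
}

server {
    location / {
        proxy_cache cb_cache;
        # Serve stale cached data when the service errors out,
        # instead of failing the request outright.
        proxy_cache_use_stale error timeout http_500 http_502 http_503;
        proxy_pass http://user-manager;
        # Actively probe instances; pull failed ones out of rotation.
        health_check interval=5s fails=2 passes=2;
    }
}
```

The health check trips the breaker by removing unhealthy instances, while the stale cache keeps responses flowing until they recover.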

So now I'm sure you think Fabric mode sounds cool and want to try it out in a real environment.

Zokets Demo

We've worked with our partner Zokets, who helped us build a system that makes it easy to visualize, control, and automate the process of building microservices-based applications with the Fabric mode.

I would like to introduce Sehyo Chang, CTO of Zokets, who will help us demonstrate Fabric mode on their platform.

Conclusion

For those interested in learning more about how to build these types of network architectures, I highly recommend our blog series on the Microservices Reference Architecture, which covers each of the modes: Proxy mode, Router Mesh mode, and Fabric mode.

 
