A solution for rapid deployment of traditional applications

Background introduction

We have a web project deployed as a monolithic application (.NET Framework, i.e. dotnet without Core) behind an nginx load balancer. Due to a recent surge in users, the number of load-balanced machines has grown from 2 to 8. This creates a new problem: every time you add a machine, update the software, or roll back a version, the process becomes tedious and painful. You can no longer slack off in peace; your life drains away on repetitive work with no technical substance.

Some veterans may ask: why not build a microservice architecture? Use Kubernetes to show off your skills and effortlessly manage clusters of hundreds or thousands of machines, dispatching problems while chatting and laughing. Wouldn't that be beautiful?

All I can say is: buddy, I too want the pleasure of controlling everything with one hand, but our actual situation does not allow it yet, although we are actively moving in that direction.

Distant water cannot quench a nearby thirst. The current situation urgently calls for a semi-temporary solution to this problem of duplicated work.

Since there is no ready-made solution, I will build one myself!

The overall process

Manual process

Thinking back to when the load was light, a full site update usually included the following steps.

  • To guard against a bad update, back up the currently stable version of the site before updating. Of course, not every file needs to be backed up, which would create storage pressure; only key files are backed up, and which files count as key has to be decided case by case.
  • Package the newly published files.
  • Upload the package to each server and replace the original files.
  • If the update includes server-side files, restart the IIS site (our web business currently runs on IIS, but the approach should extend to Apache, Tomcat, nginx, and other site servers).

Automatic operation process

After sorting out the whole update-and-deploy workflow, I organized it into the following stages:

  • Monitor: the program periodically polls a specific file for changes. Here I use a JSON file that records the version number and the download address of the new package. Whenever a version-number change is detected, the subsequent stages kick off automatically;
  • Download: download the package to the specified location using the address obtained in the monitoring stage;
  • Pause the site (optional): since not every update requires restarting the site, this step is optional;
  • Back up: based on the backup parameters passed in, back up the relevant files and write a backup log;
  • Decompress: once the backup is done, extract the package downloaded in the second stage, replacing the old files with the new ones;
  • Restart the site (optional): if the site was paused, bring it back up after the decompression and swap are complete;
  • Notify: after the whole update finishes, send an email to the relevant development and operations staff with the key details of this automatic update.
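As a sketch, the version file the monitor watches might look like the fragment below. The top-level `Version` and `Items` fields match the `VersionModel` used in the code later; the shape of each item (`Name`, `Url`) is my assumption, and the exact schema is in the source repository.

```json
{
  "Version": "2023.01.12.1",
  "Items": [
    { "Name": "site.zip", "Url": "http://updates.example.com/site_2023.01.12.1.zip" }
  ]
}
```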

After clarifying the above major processes, you can start coding!

The tool is a console application. It relies on several important packages: Coravel (scheduling), Serilog (logging), SharpCompress (archives), Microsoft.Extensions.Hosting, and Spectre.Console (the AnsiConsole output you will see below).

The complete code address is provided at the end of the article.

Monitor

Because this is a console program, to get the long-running hosting behavior of a web application (so scheduled work keeps running), you need to introduce the Microsoft.Extensions.Hosting package.

After introduction, you can combine it with Coravel to do scheduled monitoring.

public static void UseCoravelService(string[] args)
{
    // flag used by the scheduler callback to avoid overlapping runs
    bool intervelFlag = true;
    // Spectre.Console rule, used as a visual separator in the output
    var process = new Rule().RuleStyle("#FDE047");
    var host = new HostBuilder()
         .ConfigureAppConfiguration((hostContext, configApp) =>
         {
             configApp.AddEnvironmentVariables(prefix: "PREFIX_");
             configApp.AddCommandLine(args);
         })
         .ConfigureServices((hostContext, services) =>
         {
             // Add Coravel's scheduling
             services.AddScheduler();
         })
         .Build();

    host.Services.UseScheduler(scheduler =>
        scheduler
        .Schedule(async () =>
        {
            // core business logic
            //...
        })
        .EveryFifteenSeconds()
    );
    host.Run();
}

With the generic host and Coravel wired up, what remains is the business code.

/// <summary>
/// Monitoring
/// </summary>
internal class StepMonitor
{
    public static async Task<bool> getJsonFile(string address)
    {
        try
        {
            if (string.IsNullOrEmpty(address))
                address = "<a default address can go here, or leave it out>";
            HttpClient hc = new HttpClient();
            var content = await hc.GetStringAsync(address);
            var json = JsonHelper.JsonDeserialize<VersionModel>(content);
            string versionFile = "currVersion.txt";
            string currVersion = "0";

            currVersion = await MainScheduling.ReadFile(versionFile);
            if (currVersion.Trim() == json.Version)
            {
                return false; // online version matches the local one; keep waiting
            }

            await MainScheduling.WriteFile("currVersion.txt", json.Version);
            await MainScheduling.WriteFile("DownloadList.json", JsonHelper.JsonSerialize(json.Items));
            return true;
        }
        catch (Exception ex)
        {
            // MarkupLine, not WriteLine, so the [red]...[/] markup renders as color
            AnsiConsole.MarkupLine($"[red]Failed to download the update index file: {ex.Message}[/]");
            throw;
        }
    }
}

After defining the monitoring method, go back to the Schedule callback and call it.

host.Services.UseScheduler(scheduler =>
    scheduler
    .Schedule(async () =>
    {
        string cache = CacheManager.Default.Get<string>("step");
        AnsiConsole.MarkupLine($"[#1E9FFF]Current step: {cache}, {DateTime.Now}[/]");
        if (cache == "waiting")
        {
            intervelFlag = false;
            OutputStep(0, "Monitoring (waiting), executing...");
            string file = GetParamValue(args, "file");
            if (await StepMonitor.getJsonFile(file))
            {
                //CacheManager.Default.Set_SlidingExpire<string>("step", "download", TimeSpan.FromMinutes(10));
                await SetStep("download");
            }
            intervelFlag = true;
        }
    })
    .EveryFifteenSeconds()
);

The final monitoring effect is as follows

(screenshot)

Download

There is not much to say about this part: download the file however you like. Even shelling out to an off-the-shelf tool such as wget is an option, as long as you handle the interaction with the rest of the program properly.

I won't post the download code here; you can view it in the source.
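Since the download code is not posted, here is a minimal sketch of what the download stage could look like. The class name, method name, and streaming approach are my own assumptions, not the repository's code:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

internal static class StepDownload
{
    // one shared client; creating a new HttpClient per request can exhaust sockets
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string> DownloadAsync(string url, string outputDir)
    {
        Directory.CreateDirectory(outputDir);
        // derive the local file name from the last segment of the URL path
        string fileName = Path.GetFileName(new Uri(url).LocalPath);
        string target = Path.Combine(outputDir, fileName);
        using (var response = await Http.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
        {
            response.EnsureSuccessStatusCode();
            // stream to disk so large packages are not buffered in memory
            using (var src = await response.Content.ReadAsStreamAsync())
            using (var dst = File.Create(target))
            {
                await src.CopyToAsync(dst);
            }
        }
        return target;
    }
}
```

The returned path can then be handed to the decompression stage. A screenshot of the real download stage follows.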

(screenshot)

Pause the site (optional)

Here I decide whether to stop the site by checking whether the version string contains the keyword static.

string currVersion = await ReadFile("currVersion.txt");
if (currVersion.Contains("static"))
{
    OutputStep(2, "Static files only; no site restart required");
}
else
{
    OutputStep(2, "Stopping site (stopweb), executing...");
    string appSite = GetParamValue(args, "appSite");
    string appPool = GetParamValue(args, "appPool");
    StepIISManager.Stop(appSite, appPool);
}
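The implementation of StepIISManager is not shown in the article. A plausible sketch using the Microsoft.Web.Administration package might look like this; treat it as my assumption, not necessarily the author's implementation:

```csharp
using Microsoft.Web.Administration;

internal static class StepIISManager
{
    public static void Stop(string siteName, string appPoolName)
    {
        using (var server = new ServerManager())
        {
            var site = server.Sites[siteName];
            if (site != null && site.State == ObjectState.Started)
                site.Stop();
            var pool = server.ApplicationPools[appPoolName];
            if (pool != null && pool.State == ObjectState.Started)
                pool.Stop();
        }
    }

    public static void Start(string siteName, string appPoolName)
    {
        using (var server = new ServerManager())
        {
            // bring the application pool up before the site
            var pool = server.ApplicationPools[appPoolName];
            if (pool != null && pool.State == ObjectState.Stopped)
                pool.Start();
            var site = server.Sites[siteName];
            if (site != null && site.State == ObjectState.Stopped)
                site.Start();
        }
    }
}
```

Note that ServerManager talks to the local IIS configuration, so this only runs on a machine with IIS installed and requires administrator rights.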

Screenshots of the two cases:

(screenshots)

Backup

For backups I defined a dedicated input parameter, "-subpath". If it is not passed, the whole site is backed up; if it is, only the files and directories listed in it are backed up.

// the "output" parameter here is the extraction path
string inputPath = GetParamValue(args, "output");
string outputPath = GetParamValue(args, "backuppath");

string currVersion = await ReadFile("currVersion.txt");
string subPath = GetParamValue(args, "subpath");
OutputStep(3, "Backing up site (backup), executing...");
if (!string.IsNullOrEmpty(subPath))
{
    AnsiConsole.MarkupLine("[#FDE047]Subpath parameter detected; skipping full backup, backing up the listed items in order[/]");
    string[] parts = subPath.Split(',');
    // heuristic: entries containing a dot are files, the rest are directories
    var paths = parts.Where(u => !u.Contains(".")).ToList();
    var files = parts.Where(u => u.Contains(".")).ToList();
    if (files.Any() && !paths.Contains("_temp"))
    {
        paths.Add("_temp");
    }
    foreach (var file in files)
    {
        AnsiConsole.MarkupLine($"[#FDE047]Backing up file {file}...[/]");
        string subInputPath = Path.Combine(inputPath, file);
        if (File.Exists(subInputPath))
        {
            // single files are first copied into _temp, then _temp is zipped like a directory
            string tempPath = Path.Combine(inputPath, "_temp");
            if (!Directory.Exists(tempPath))
            {
                Directory.CreateDirectory(tempPath);
            }
            File.Copy(subInputPath, Path.Combine(tempPath, file), true);
        }
    }
    foreach (var item in paths)
    {
        AnsiConsole.MarkupLine($"[#FDE047]Backing up {item}...[/]");
        string subInputPath = Path.Combine(inputPath, item);
        await StepDeCompress.ZipCompress(subInputPath, outputPath, item);
    }
}
else
{
    await StepDeCompress.ZipCompress(inputPath, outputPath);
}
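Note that the file-vs-directory split relies on a dot heuristic, so a directory whose name contains a dot (say, a hypothetical App.Data folder) would be misclassified as a file. A quick illustration of the split:

```csharp
using System;
using System.Linq;

class SubPathDemo
{
    static void Main()
    {
        // the same split the backup stage performs on the -subpath value
        string subPath = "web.config,bin,Views";
        string[] parts = subPath.Split(',');
        var dirs  = parts.Where(p => !p.Contains(".")).ToArray();
        var files = parts.Where(p => p.Contains(".")).ToArray();
        Console.WriteLine(string.Join("|", dirs));   // bin|Views
        Console.WriteLine(string.Join("|", files));  // web.config
    }
}
```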

(screenshot)

Unzip

Decompression extracts the file downloaded in the second stage into the target directory, leaning on the SharpCompress library.

/// <summary>
/// Decompress a file
/// </summary>
/// <param name="inputFile">path of the archive to extract</param>
/// <param name="outputFile">directory to extract into</param>
public static void Decompression(string inputFile, string outputFile)
{
    try
    {
        // treat paths without a drive separator as relative to the executable
        if (!inputFile.Contains(":"))
        {
            inputFile = Path.Combine(Path.GetDirectoryName(System.Diagnostics.Process.GetCurrentProcess().MainModule.FileName), inputFile);
        }
        SharpCompress.Readers.ReaderOptions options = new SharpCompress.Readers.ReaderOptions();
        options.ArchiveEncoding.Default = Encoding.GetEncoding("utf-8");
        // IArchive is disposable; close the archive handle when done
        using (var archive = ArchiveFactory.Open(inputFile, options))
        {
            AnsiConsole.Status()
                .Start("Extracting files...", ctx =>
                {
                    foreach (var entry in archive.Entries)
                    {
                        if (!entry.IsDirectory)
                        {
                            entry.WriteToDirectory(outputFile, new ExtractionOptions { ExtractFullPath = true, Overwrite = true });
                        }
                    }
                    AnsiConsole.MarkupLine("[#1E9FFF]Extraction complete[/]");
                });
        }
    }
    catch (Exception ex)
    {
        // MarkupLine, not WriteLine, so the [red]...[/] markup renders as color
        AnsiConsole.MarkupLine($"[red]{ex.Message}[/]");
        throw;
    }
}

The result:

(screenshot)

Restart site (optional)

The same steps as for pausing the site, except that the command passed in becomes start.

if (currVersion.Contains("static"))
{
    OutputStep(5, "Static files only; no site restart required");
}
else
{
    OutputStep(5, "Starting site (startweb), executing...");
    string appSite = GetParamValue(args, "appSite");
    string appPool = GetParamValue(args, "appPool");
    StepIISManager.Start(appSite, appPool);
    Console.WriteLine(DateTime.Now.ToString());
}

Effect

(screenshot)

Send notification

This part needs little explanation: send an email to the designated people, with the details of the update in the body.

I won't post the code here; you can see the results below.
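Since the notification code is omitted, here is a minimal sketch with System.Net.Mail. The addresses, SMTP host, port, and credentials are all placeholders, and this is my own sketch rather than the article's code:

```csharp
using System.Net;
using System.Net.Mail;

// build the update summary message (all values below are placeholders)
var msg = new MailMessage
{
    From = new MailAddress("deploy@example.com"),
    Subject = "[AutoDeploy] site updated to version 2023.01.12.1",
    Body = "Version: 2023.01.12.1\nBackup: ok\nExtraction: ok\nSite restarted: yes"
};
msg.To.Add("ops@example.com");

using (var smtp = new SmtpClient("smtp.example.com", 587))
{
    smtp.Credentials = new NetworkCredential("user", "app-password");
    smtp.EnableSsl = true;
    // smtp.Send(msg); // enable with real settings
}
```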

(screenshots)

Summary

Although we are in the cloud-native era, with microservice-oriented frameworks emerging one after another, one fact cannot be ignored: monolithic applications still account for a considerable share of systems today. In the projects of small and micro enterprises, and even of quite a few medium and large ones, monoliths are still the mainstream. Constrained by their own business, some systems may never be suitable candidates for microservices from birth to retirement. That is reality, and also a pity.

My personal attitude toward microservices is basically that of a believer, and I am actively embracing the transformation. Our team lead also supports converting the existing business to a microservices architecture step by step, as circumstances allow. But this is a system already in production, and stability is the top priority, so we move much more slowly: a large amount of business still runs on the monolithic structure, and the main development and maintenance work still follows the actual business. We remain cautiously optimistic about the transformation, which I think is an objective, positive, and correct attitude. Innovation is fascinating precisely because its future is uncertain, but getting carried away, or resorting to extreme measures, creates unpredictable risk. Starting from reality, fitting the business, and proceeding step by step is the right way. I also believe that in 2023 we will finally take the most critical step on the road to transformation.

 

Attachment

Code address: For Yourself_Bring Salt/AutoDeploy · GitCode

 


Origin blog.csdn.net/juanhuge/article/details/128677380