Automated Deployment of a Django Project with Fabric

Author: HelloGitHub - Dream Chaser

The sample code for this article has been updated and synced to the HelloGitHub-Team repository.

In the previous tutorial, we deployed the code to the server manually. The whole process involved dozens of commands and a lot of typing, and every time the local code changes the entire process has to be repeated, which quickly becomes tedious.

Fabric can execute commands on the server for us automatically. Since the deployment process is identical every time, once we write it into a Fabric script, a single script run deploys the code automatically.

First install Fabric locally:

$ pipenv install fabric --dev

Since Fabric is only needed locally, we use the --dev option so that Pipenv records the Fabric dependency under the dev-packages section; the production environment will then not install Fabric.
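After the install, the Pipfile gains a dev-packages entry along these lines (a sketch — the exact version specifier in your Pipfile may differ):

```toml
[dev-packages]
# Recorded here because of --dev; skipped by
# `pipenv install --deploy --ignore-pipfile` on the server.
fabric = "*"

[packages]
# Runtime dependencies stay in this section and are the only
# ones installed in the production environment.
```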

Deployment Process Review

Before writing the Fabric script, let's review the whole process of deploying to the server after the code has been updated in the local development environment.

  1. Connect to the remote server.
  2. Enter the project root directory and pull the latest code from the remote repository.
  3. If the project has new dependencies, run pipenv install --deploy --ignore-pipfile to install them.
  4. If static files were modified or added, run pipenv run python manage.py collectstatic to collect them.
  5. If the database schema changed, run pipenv run python manage.py migrate to migrate the database.
  6. Restart Nginx and Gunicorn so the changes take effect.

The task is simply to translate each step of this process into the corresponding Fabric script code; once that is done, the automated deployment script is complete.

Improving the Project Configuration

Splitting the settings file

For security, DEBUG should be False in the production environment but True in development, and toggling it back and forth by hand is error-prone. In addition, Django's SECRET_KEY is a highly sensitive setting: many of Django's security mechanisms depend on it, and if it leaks the site faces serious security risks. Right now it is written directly in the settings file, so if the code were ever accidentally open-sourced, the SECRET_KEY would leak with it. Good practice is to store the value in an environment variable and read it from there.

One solution to both problems is to split the settings.py file so that each environment gets its own settings module; when Django starts, it reads the DJANGO_SETTINGS_MODULE environment variable and uses the module it names as the effective settings file.

Let's split settings.py. First create a Python package named settings under the blogproject directory, then create common.py in it for shared settings, local.py for development settings, and production.py for production settings:

blogproject\
    settings\
        __init__.py
        common.py
        local.py
        production.py
    settings.py

Copy the entire contents of settings.py into common.py, then move SECRET_KEY, DEBUG and ALLOWED_HOSTS into local.py and production.py (and delete them from common.py).

The development settings in local.py are as follows:

from .common import *

SECRET_KEY = 'development-secret-key'
DEBUG = True
ALLOWED_HOSTS = ['*']

And the production settings in production.py:

import os

from .common import *

SECRET_KEY = os.environ['DJANGO_SECRET_KEY']
DEBUG = False
ALLOWED_HOSTS = ['hellodjango-blog-tutorial.zmrenwu.com']

Note the from .common import * at the top, which pulls in the entire shared configuration from common.py; the environment-specific values below then override it as needed.

Compared with development, the production settings turn DEBUG off for safety, read SECRET_KEY from an environment variable, and restrict ALLOWED_HOSTS to the HTTP Host headers the site will accept (see the Django documentation for the details of the latter).
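Reading the key with os.environ['DJANGO_SECRET_KEY'] (rather than .get) means the process fails immediately if the variable is missing, instead of starting with no key. A minimal sketch of that behavior:

```python
import os

# Simulate a server where the variable was never exported (illustration only):
os.environ.pop('DJANGO_SECRET_KEY', None)

try:
    os.environ['DJANGO_SECRET_KEY']
    missing = False
except KeyError:
    # In production this KeyError would abort startup, surfacing the
    # misconfiguration instead of silently running without a secret key.
    missing = True

print(missing)
```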

Once the above is done, remember to delete the original settings.py.

We now have two sets of settings: local.py and production.py. When the project starts, how does Django know which one to use? The answer is that when commands are run through the manage.py script, Django picks a default for us. Commands executed via python manage.py accept a --settings option that specifies the settings module the project should use; if the option is not given explicitly, Django reads the value from the DJANGO_SETTINGS_MODULE environment variable. See the manage.py source:

def main():
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogproject.settings')
    try:
        from django.core.management import execute_from_command_line
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)

In the first line of main you can see that setdefault sets the DJANGO_SETTINGS_MODULE environment variable for us: if it has no value in the current environment, it is set to blogproject.settings. This is why commands run via python manage.py used the settings.py configuration by default.

So we can point Django at a different settings file simply by setting this environment variable.
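A quick sketch of how setdefault behaves, which is exactly why an explicitly exported DJANGO_SETTINGS_MODULE always wins over the default written in manage.py:

```python
import os

# Variable unset: setdefault fills in the default value.
os.environ.pop('DJANGO_SETTINGS_MODULE', None)
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogproject.settings.local')
first = os.environ['DJANGO_SETTINGS_MODULE']

# Variable already set: setdefault is a no-op, the existing value survives.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogproject.settings.production')
second = os.environ['DJANGO_SETTINGS_MODULE']

print(first, second)  # both are blogproject.settings.local
```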

Since manage.py is normally run in the development environment, change the DJANGO_SETTINGS_MODULE value here to blogproject.settings.local; Django will then load blogproject/settings/local.py when running the development server.

Similarly, open the wsgi.py file: it defines the application object that Gunicorn loads in the production environment, so change the DJANGO_SETTINGS_MODULE there to blogproject.settings.production.

Thus commands run through manage.py load local.py, while running the project with Gunicorn loads production.py.

Modifying the BASE_DIR setting

One more thing needs attention. Look at common.py, the shared settings file, which contains this setting:

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

BASE_DIR points to the root directory of the project, computed by walking up the directory tree from the location of the settings file. The directory structure used to be HelloDjango-blog-tutorial/blogproject/settings.py, so going up 2 levels reached the project root. The structure is now HelloDjango-blog-tutorial/blogproject/settings/common.py, which requires going up 3 levels, so BASE_DIR needs a small modification:

BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

That is, wrap one more os.path.dirname around the expression to go up one extra level and reach the project root.
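To see the three dirname calls at work, here is a small sketch using a hypothetical absolute path for common.py:

```python
import os.path

# Hypothetical location of the settings file on disk (for illustration only):
config_file = '/home/user/HelloDjango-blog-tutorial/blogproject/settings/common.py'

# dirname once:        .../blogproject/settings
# dirname twice:       .../blogproject
# dirname three times: the project root
base_dir = os.path.dirname(os.path.dirname(os.path.dirname(config_file)))
print(base_dir)  # /home/user/HelloDjango-blog-tutorial
```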

Setting environment variables in Supervisor

Also, since the production settings read SECRET_KEY from an environment variable, we need to update the Supervisor configuration to provide it. Open the Supervisor config file ~/etc/supervisor/conf.d/hellodjango-blog-tutorial.ini and add an environment variable declaration:

environment=DJANGO_SECRET_KEY=2pe8eih8oah2_2z1=7f84bzme7^bwuto7y&f(#@rgd9ux9mp-3

Because the code may well pass through a public repository, it is best to switch to a new SECRET_KEY for the production site. This website can generate one automatically: Django Secret Key Generator.
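If you would rather generate the key locally than copy it from a website, Python's standard library secrets module can do it (Django itself also ships django.core.management.utils.get_random_secret_key for the same purpose). A sketch:

```python
import secrets
import string

# 50 random characters, roughly the same shape as Django's default key.
alphabet = string.ascii_lowercase + string.digits + '!@#$%^&*(-_=+)'
secret_key = ''.join(secrets.choice(alphabet) for _ in range(50))
print(secret_key)
```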

Save the configuration, then run the update command to reload it:

$ supervisorctl -c ~/etc/supervisord.conf update

Writing the Fabric Script

With all the preparation in place, we can now write the automated deployment script with Fabric.

Fabric scripts conventionally live in a file named fabfile.py, so first create a fabfile.py file in the project root directory.

The script code is as follows:

from fabric import task
from invoke import Responder
from _credentials import github_username, github_password


def _get_github_auth_responders():
    """
    Return responders that automatically fill in the GitHub username and password.
    """
    username_responder = Responder(
        pattern="Username for 'https://github.com':",
        response='{}\n'.format(github_username)
    )
    password_responder = Responder(
        pattern="Password for 'https://{}@github.com':".format(github_username),
        response='{}\n'.format(github_password)
    )
    return [username_responder, password_responder]


@task()
def deploy(c):
    supervisor_conf_path = '~/etc/'
    supervisor_program_name = 'hellodjango-blog-tutorial'

    project_root_path = '~/apps/HelloDjango-blog-tutorial/'

    # Stop the application first
    with c.cd(supervisor_conf_path):
        cmd = 'supervisorctl stop {}'.format(supervisor_program_name)
        c.run(cmd)

    # Enter the project root directory and pull the latest code from Git
    with c.cd(project_root_path):
        cmd = 'git pull'
        responders = _get_github_auth_responders()
        c.run(cmd, watchers=responders)

    # Install dependencies, migrate the database, collect static files
    with c.cd(project_root_path):
        c.run('pipenv install --deploy --ignore-pipfile')
        c.run('pipenv run python manage.py migrate')
        c.run('pipenv run python manage.py collectstatic --noinput')

    # Restart the application
    with c.cd(supervisor_conf_path):
        cmd = 'supervisorctl start {}'.format(supervisor_program_name)
        c.run(cmd)

Let's walk through the deployment code.

The deploy function is the entry point of the deployment process; the task decorator from fabric marks it as a Fabric task.

It then defines a few project-related variables, mainly the paths of the application code and its configuration on the server.

When deploy is called it is passed a parameter c, whose value is the SSH client instance that Fabric created when connecting to the server; through this instance we can run commands on the server.

The rest executes the deployment commands in sequence: the SSH client instance's cd method enters a directory, and its run method runs a command.

Note that each command the SSH client instance executes is stateless: every new command starts from the initial login directory on the server rather than the directory of the previous command, so to run several commands in the same directory you must use the with c.cd(...) context manager.
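This statelessness is easy to reproduce locally with subprocess, which behaves like Fabric's run() in this respect — each invocation is a fresh shell, so a cd in one command does not carry over to the next:

```python
import subprocess

# A cd inside one command only affects that command's shell...
first = subprocess.run('cd /tmp && pwd', shell=True,
                       capture_output=True, text=True).stdout.strip()
# ...the next command starts from the original working directory again.
second = subprocess.run('pwd', shell=True,
                        capture_output=True, text=True).stdout.strip()

print(first)   # /tmp
print(second)  # the process's original cwd, not /tmp
```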

Finally, if the server has not added the code repository to its list of trusted hosts, running git pull will normally prompt for credentials. Our code is hosted on GitHub, so we write responders for the GitHub username and password: whenever Fabric detects GitHub asking for the username or password, it invokes the matching responder to fill it in automatically.

The responders import this sensitive information from a _credentials.py module, so create a _credentials.py file in the same directory as fabfile.py and write your GitHub username and password into it:

github_username = 'your-github-username'
github_password = 'your-github-password'

Of course, since this file contains sensitive account credentials, be sure to add it to the .gitignore file so it is excluded from version control; an accidental push to a public repository would otherwise leak your GitHub account.
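The .gitignore entry is just one line; with it in place, Git will never track the credentials file:

```
# .gitignore
_credentials.py
```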

Running the Automated Deployment Script

Change into the directory containing fabfile.py and run the script with the fab command (replace server_ip with the IP address of your production server):

fab -H server_ip --prompt-for-login-password -p deploy

Fabric will automatically detect the deploy function in the fabfile.py script and run it. Enter the server login password when prompted, and you will then see a stream of command output on the command line, ending with a message that the deployment finished.

If the script errors while running, check the error messages on the command line, fix the problem, and run the script again. From now on, whenever you finish a feature locally, a single run of this script file deploys the latest code to the server automatically.


"Explain Open Source Projects" series: making people interested in open source projects no longer afraid, and maintainers of open source projects no longer alone. Follow along with our articles and you will discover the fun of programming, and find that using open source projects, and getting involved in them, is this simple. Leave us a message to get in touch and join us, so that more people can fall in love with open source and contribute to it!

Origin www.cnblogs.com/xueweihan/p/11593800.html