This article is day 21 of the "Request! Tips for developing on Azure using Python! [PR]" Microsoft Japan Advent Calendar 2020.
GitHub announced CI/CD support in GitHub Actions, which became generally available at GitHub Universe in November 2019. Since then, many Azure services have released Actions (https://github.com/azure/actions) and integrations to make developer workflows more efficient. The App Service Deployment Center also provides a guide for developers to set up GitHub Actions and deploy web apps.
This article doesn't give an overview of CI/CD itself, so if you're new to the topic and wondering "What is CI/CD, and why is it needed now?", I recommend reading an introductory article first to get the big picture.
A simple CI pipeline for a Python application has three steps:

1. Install packages with pip install
2. Run tests
3. Send the application to the server

This method seems fine, but is it really? It may work for simple applications, but if the application depends on packages with OS-level dependencies (DB drivers, SciPy, scikit-learn, etc.), you may run into problems when the application starts on the server. This is because Python binds to OS libraries by absolute reference, and if the libraries installed on the CI machine differ from those on the server, the application will not work properly.
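To see why packages built on one OS may not run on another, note that compiled (binary) wheels are tagged with the platform they were built for. Here is a minimal illustrative sketch; the platform strings in the example call are hypothetical, not taken from any real CI setup:

```python
import sysconfig

def binary_wheel_compatible(build_platform: str, server_platform: str) -> bool:
    """Compiled wheels generally only run on the platform they were built for."""
    return build_platform == server_platform

# Platform tag of the machine running this script, e.g. "linux-x86_64":
print(sysconfig.get_platform())

# Hypothetical mismatch: a macOS CI agent deploying to a Linux App Service host
print(binary_wheel_compatible("macosx-11.0-arm64", "linux-x86_64"))  # False
```

Pure-Python packages are unaffected, but anything with compiled extensions must be built for (or on) the target platform.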
In such cases, you may think it is better to use Docker. Indeed, Docker lets you build a container image with the Python application's dependencies already installed, deploy the image to a host with Docker installed, and "just run" it. However, this option is not without drawbacks. You need to manage a container registry and configure networking so that CI and production servers can access it securely. Also, since the Dockerfile becomes part of the application repository, you are responsible for keeping the base OS image updated and maintaining the container configuration. Tip: Docker was first presented publicly at PyCon in 2013.
There was a great article on this topic last year. It covers many more deployment techniques for Python applications, so read it to learn about the different options. Here, you will learn how to deploy a Python application to App Service without managing Docker images.
Azure App Service is a PaaS (Platform as a Service) for hosting web and API applications. You can deploy application code or container images and take advantage of Python, .NET, Node, Java, PHP, and Ruby runtimes. This lets developers choose between using containers or simply deploying code and letting the service manage the runtime.
If you configure a CI/CD pipeline for a Python app on App Service without using a container, the build server's OS will most likely not match the runtime on Azure, so you cannot simply install packages in the pipeline and deploy them to App Service (or any other server). To work around this, create an app setting called SCM_DO_BUILD_DURING_DEPLOYMENT in the App Service and set its value to true. This app setting triggers the Oryx (https://github.com/Microsoft/Oryx) build pipeline, which reinstalls packages during deployment. Oryx is Microsoft's open source utility that automatically builds source code; for web apps, it runs on the app's SCM (Kudu) site. With this setting in place, Oryx builds the dependencies on the runtime image, ensuring that packages link against the appropriate OS libraries.
The following section provides an example GitHub Actions workflow for building and deploying a Python app to App Service. The sample uses GitHub Actions, but other CI/CD providers such as Azure DevOps and Jenkins can use the same pattern.
Let's take a look at an example build-and-deploy workflow for a Django app. Fork the repository, then create a secret containing a service principal (see https://github.com/Azure/login#configure-deployment-credentials). Name the secret AZURE_SERVICE_PRINCIPAL.
The workflow begins with checking out the repository on the build VM, setting the desired Python version, and creating a virtual environment.
- uses: actions/checkout@v2
- name: Setup Python version
  uses: actions/setup-python@v2
  with:
    python-version: 3.8
- name: Create and start virtual environment
  run: |
    python3 -m venv venv
    source venv/bin/activate
Once the virtual environment is up, install the dependencies from the requirements.txt file. Then use manage.py to collect static assets and run unit tests.
- name: Install dependencies
  run: pip install -r requirements.txt
- name: Collect static
  run: python manage.py collectstatic
- name: Run tests
  run: python manage.py test
Once all the steps up to this point succeed, upload the files as an artifact for the next job. The virtual environment is excluded because it is not compatible with the runtime OS. Uploading the files at the end of the job also means that if the deployment fails, you can download the artifact from the Actions tab for debugging and content review.
- name: Upload artifact for deployment jobs
  uses: actions/upload-artifact@v2
  with:
    name: python-app
    path: |
      .
      !venv/
The second job starts by downloading the artifact uploaded in the previous job and logging in to the Azure CLI with the service principal you set up earlier.
- uses: actions/download-artifact@v2
  with:
    name: python-app
    path: .
- name: Log in to Azure CLI
  uses: azure/login@v1
  with:
    creds: ${{ secrets.AZURE_SERVICE_PRINCIPAL }}
Once the Azure CLI is authenticated, the job configures the SCM_DO_BUILD_DURING_DEPLOYMENT setting described above. It also disables static collection (since it was done in the previous job), sets a command to run database migrations, and sets the Django environment to "production". POST_BUILD_COMMAND is a hook for executing commands after the runtime build; in this case, it runs `python manage.py makemigrations && python manage.py migrate`. You could instead apply database migrations as part of the CI workflow, but then you need to store the connection string as a secret, and if networking rules protect your database, it must be reachable from the CI pipeline.
Finally, the job deploys the code using the webapps-deploy action (https://github.com/azure/webapps-deploy/).
- name: Disable static collection and set migration command on App Service
  uses: Azure/appservice-settings@v1
  with:
    app-name: ${{ env.WEBAPP_NAME }}
    app-settings-json: '[{ "name": "DISABLE_COLLECTSTATIC", "value": "true" }, { "name": "POST_BUILD_COMMAND", "value": "python manage.py makemigrations && python manage.py migrate" }, { "name": "SCM_DO_BUILD_DURING_DEPLOYMENT", "value": "true" }, { "name": "DJANGO_ENV", "value": "production"}]'
- name: Deploy to App Service
  uses: azure/webapps-deploy@v2
  with:
    app-name: ${{ env.WEBAPP_NAME }}
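The app-settings-json value passed to the Azure/appservice-settings action is a JSON array of objects with "name" and "value" keys. If you edit it by hand, a quick, purely illustrative sanity check in Python catches quoting mistakes before you commit the workflow:

```python
import json

# The same settings payload the workflow passes to Azure/appservice-settings
app_settings_json = '''[
  { "name": "DISABLE_COLLECTSTATIC", "value": "true" },
  { "name": "POST_BUILD_COMMAND", "value": "python manage.py makemigrations && python manage.py migrate" },
  { "name": "SCM_DO_BUILD_DURING_DEPLOYMENT", "value": "true" },
  { "name": "DJANGO_ENV", "value": "production" }
]'''

settings = json.loads(app_settings_json)  # raises ValueError if the JSON is malformed
assert all(s.keys() == {"name", "value"} for s in settings)
print(sorted(s["name"] for s in settings))
```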
Next, refer to the example workflow that builds and deploys a Flask app using Vue.js. Fork the repository and create a secret containing a service principal, again named AZURE_SERVICE_PRINCIPAL. You also need to replace the placeholder value of the RESOURCE_GROUP environment variable at the beginning of the workflow file.
This workflow sets the Python version, creates a virtual environment, and installs Python packages, just like the Django example. Because this example also needs to install and build the Vue project's dependencies, it sets up the desired Node.js version as well.
- uses: actions/checkout@v2
- name: Set up Python
  uses: actions/setup-python@v2
  with:
    python-version: 3.6
- name: Set up Node.js
  uses: actions/setup-node@v1
  with:
    node-version: 12
- name: Install and build Vue.js project
  run: |
    npm install
    npm run build
- name: Create and start virtual environment
  run: |
    python3 -m venv venv
    source venv/bin/activate
- name: Install dependencies
  run: pip install -r requirements.txt
- name: test with PyTest
  run: pytest --cov=app --cov-report=xml
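The pytest step above discovers any test_*.py files in the repository and reports coverage for the app module. For reference, here is a minimal hypothetical example of such a test; the slugify helper is illustrative, not taken from the sample repository:

```python
# test_example.py — the kind of test file pytest's collection picks up.
# The helper under test is hypothetical, not part of the sample Flask app.
def slugify(title: str) -> str:
    """Lower-case a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify():
    assert slugify("Deploy to App Service") == "deploy-to-app-service"
```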
Once the Flask and Vue.js apps are built and tested, all files except the node_modules/ and venv/ directories are uploaded for the second job. These directories are excluded so that Oryx can install the dependencies on the runtime image, as in the Django example.
- name: Upload artifact for deployment jobs
  uses: actions/upload-artifact@v2
  with:
    name: python-app
    path: |
      .
      !node_modules/
      !venv/
The second job downloads the artifact, logs in to the Azure CLI, sets the SCM_DO_BUILD_DURING_DEPLOYMENT flag, and sets FLASK_ENV to "production". Unlike the Django example, this workflow also sets the startup-file command to `gunicorn --bind=0.0.0.0 --timeout 600 app:app`. (Gunicorn is a WSGI HTTP server commonly used for Python applications.)
- uses: actions/download-artifact@v2
  with:
    name: python-app
    path: .
- name: Log in to Azure CLI
  uses: azure/login@v1
  with:
    creds: ${{ secrets.AZURE_SERVICE_PRINCIPAL }}
- name: Configure deployment and runtime settings on the webapp
  run: |
    az configure --defaults group=${{ env.RESOURCE_GROUP }}
    az webapp config appsettings set --name ${{ env.WEBAPP_NAME }} --settings \
      SCM_DO_BUILD_DURING_DEPLOYMENT=true \
      FLASK_ENV=production
    az webapp config set --name ${{ env.WEBAPP_NAME }} \
      --startup-file "gunicorn --bind=0.0.0.0 --timeout 600 app:app"
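The app:app target in the startup command means "the callable named app inside the module app.py"; a Flask application object is exactly such a callable. As a stdlib-only sketch of the WSGI interface gunicorn expects (illustrative, not the sample app's actual code):

```python
# app.py — the smallest possible WSGI callable named `app`.
# "gunicorn app:app" imports the module `app` and serves its attribute `app`;
# a Flask app object implements this same calling convention.
def app(environ, start_response):
    body = b"Hello from App Service"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the callable directly, the way a WSGI server would:
statuses = []
response = app({}, lambda status, headers: statuses.append(status))
print(statuses[0], b"".join(response))
```

If your Flask object lives elsewhere (for example, application in wsgi.py), the startup command changes accordingly, e.g. gunicorn wsgi:application.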
Finally, the application is deployed with the webapps-deploy action.
- name: Deploy to App Service
  uses: azure/webapps-deploy@v2
  with:
    app-name: ${{ env.WEBAPP_NAME }}
That's a look at CI/CD for Python applications on Azure. I hope it is helpful to as many people as possible.