Zero Downtime Deployment for ASP.NET applications

Posted by: Pavel Kutakov, on 8/1/2019, in Category ASP.NET Core
Abstract: Continuous Deployment is a well-known technique nowadays, but each project has its own recipe for it, because the setup depends heavily on the technology stack and environment. In this article, I will share my experience of setting up a Continuous Deployment process for an application written in ASP.NET, using Azure SQL as the database, Entity Framework Code First to access it, and deployed to Microsoft Azure App Service by Azure DevOps.

Problem of Continuous Deployment for database applications

Launching a new version into production is always a nervous event, especially if the process involves a lot of manual operations.

“It would be so good to automate this process!” This idea is as old as software development itself, and there is a term for it: Continuous Deployment.

But here's the problem: there is no single "right" way to configure continuous deployment. The process is tightly coupled to the technology stack of the project and its environment.

In this tutorial, I want to share my practical experience in setting up automatic updates of an application without interrupting its operation for a specific technological environment.

 

Deploying ASP.NET MVC, EF application to Azure App Service using Azure DevOps

This application was written in ASP.NET MVC + SQL Azure + Entity Framework utilizing Code First, deployed to Azure App Service, built and deployed using the Azure DevOps (formerly Visual Studio Team Services).

At first glance, everything is very simple. Azure App Service has the concept of a deployment slot – you can deploy a new version of the application to that slot and simply swap it with the active version. This would be enough if the application were based on a non-relational database with no strictly defined table structure. In that case, the newer version could just start getting traffic and voila!
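For reference, the slot swap itself can also be triggered from the Azure CLI; the app and resource group names below are placeholders:

```shell
# Swap the "staging" slot into production for an App Service
# (app name and resource group are hypothetical)
az webapp deployment slot swap \
  --resource-group my-resource-group \
  --name my-app-service \
  --slot staging \
  --target-slot production
```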

But with a relational database, it's a bit more complicated.

I believe you are familiar with the concept of migrations in Entity Framework, and know that they were designed exactly for the “fire and forget” mode – just enable automatic migrations and stay always in sync.

Unfortunately, in the real world, the automatic migration mechanism has some problems when used as part of a continuous deployment scenario:

1. The old version of the application may not be able to work with the new database structure.

2. Updating the database structure may take considerable time, and is not always possible from the application itself using these automatic migrations.

3. Your infrastructure may become inconsistent in case of migration failure.

Let me explain this further with an example.

Let us assume that you have deployed a new version to a parallel slot or to a secondary datacenter and have started applying migrations. Assume that we have three migrations and – horror of horrors – two were applied and the third failed!

At this point, nothing will happen to the running servers. Entity Framework does not check the version on each request, but it is likely that you will not be able to solve the problem quickly. And if at this time the load on the application increases, the platform will launch an additional instance of the application, and it... of course, it will not start.

Entity Framework compares the version of the code with the version of the database during the first database request, and since the structure of the database has changed, EF will throw an error. A significant part of your users will start receiving errors.

As you can see, the risk of automatic migration is high.


Updating the application in the main datacenter while users are switched to the secondary datacenter. A failed update on the primary DC may cause unpredictable problems for your customers.

Figure 1: Risk of Automatic Migration when DB is Updated

As for the second point, your migration may contain commands whose execution time exceeds 30 seconds, in which case the standard procedure will fail with a timeout.

And in addition to these points, I personally do not like the fact that with automatic migrations, you have to update part of the infrastructure to the new version. It's not so bad if you are using a deployment slot in Azure, but when you are deploying to a secondary datacenter, you have part of the infrastructure running an application that is known to be broken.

Implement Continuous Deployment for the ASP.NET application using Entity Framework

So, we want to implement continuous deployment for the ASP.NET application using Entity Framework. Let's start with the most difficult part – the database.

It would be nice to automatically update the structure of the database while keeping the previous version of the application working. In addition, we should account for the fact that some updates contain a single command whose execution takes a significant amount of time. That means we need to update the database without the built-in mechanisms, by executing a separate SQL script.

Needless to say, your migrations should be non-destructive. That is, changes in the database structure should not disrupt the performance of the previous version, and even better - previous two versions. If you are unable to satisfy this requirement, the described approach will be dangerous for your application.
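To illustrate what "non-destructive" means here (table and column names are hypothetical), a safe change only adds structure that the old version can ignore, while a destructive one removes something the old code still relies on:

```sql
-- Non-destructive: the previous app version simply ignores the new nullable column
ALTER TABLE Orders ADD DeliveryNote NVARCHAR(256) NULL;

-- Destructive: the previous app version still selects this column and will break
-- ALTER TABLE Orders DROP COLUMN CustomerPhone;
```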

The question is how to prepare SQL script for database migration?

You can make this process manual. If your team has a dedicated Release Manager role, you can ask that team member to run the following command in the Package Manager Console in Visual Studio:

update-database -script

...which will generate the script, and this person then puts the script into a specific project folder.

But this approach is not error free. It depends on human intervention, adds extra complexity if there is more than one migration between releases, and has to deal with the possibility of a release being skipped on the target system.

Be ready to implement a non-trivial migration tracking system to know which migrations have already been applied and which still need to run. That is difficult, and this wheel has already been invented in the built-in migration mechanism.

Of course, we’d be better off embedding migration script generation and execution into the release pipeline. Unfortunately, the “update-database -script” command cannot be used as part of a CI/CD pipeline; it can be executed only in the Package Manager Console of Visual Studio.

To achieve the same result, you can use the separate “migrate.exe” utility which is included with Entity Framework. Please note that you need Entity Framework 6.2 or higher, as the script generation option appeared in this utility only in April 2017. Calling the utility looks like this:

migrate.exe Context.dll /connectionString="Data Source=localhost;Initial Catalog=myDB;User Id=sa;Password=myPassword;" /connectionProviderName="System.Data.SqlClient" /scriptFile=1.SQL /startUpDirectory="c:\projects\MyProject\bin\Release" /verbose

Specify the name of the assembly where your Context class is located, the connection string to the target database, the provider, and, most importantly, the startup directory that contains both the context assembly and the Entity Framework assembly. Do not experiment with the name of the working directory; keep it simple.

Note: We came across a strange case when migrate.exe was unable to read the directory with a name that had spaces and non-alphabetic characters.

There's an important digression to be made.

After executing the above command, the utility will generate a single SQL script containing all the commands for all migrations that need to be applied to the target database. This is not good for SQL Server.

The fact is that the server executes commands that are not separated by GO as a single batch, and some operations cannot be performed together in one batch.

For example, in some cases, adding a field to a table and then immediately using that new field does not work.
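As a sketch of the shape the script needs (table and column names are hypothetical): a statement that references a freshly added column must be compiled in its own batch, which is exactly what the GO separator provides:

```sql
-- Hypothetical names; each GO starts a new batch
ALTER TABLE Orders ADD Region NVARCHAR(64) NULL;
GO
-- Without the GO above, this batch would fail to compile:
-- the Region column does not exist yet when the whole script is one batch
UPDATE Orders SET Region = N'Unknown' WHERE Region IS NULL;
GO
CREATE INDEX IX_Orders_Region ON Orders (Region);
GO
```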

But there is more. Some commands require certain environment settings when running the script. Such settings are enabled by default when you connect to SQL Server via SQL Server Management Studio, but when the script is executed via the SQLCMD console utility, they must be set manually.

To take all this into account, you will have to modify the migration script generation process. To do so, create an additional class next to your DbContext descendant which does everything you need:

public class MigrationScriptBuilder : SqlServerMigrationSqlGenerator
{
    public override IEnumerable<MigrationStatement> Generate(IEnumerable<MigrationOperation> migrationOperations, string providerManifestToken)
    {
        var statements = base.Generate(migrationOperations, providerManifestToken);
        var result = new List<MigrationStatement>();

        // Enable the setting that SSMS turns on by default but SQLCMD does not
        result.Add(new MigrationStatement { Sql = "SET QUOTED_IDENTIFIER ON;" });

        // Emit a GO after every statement so each one runs in its own batch
        foreach (var item in statements)
        {
            item.BatchTerminator = "GO";
            result.Add(item);
        }
        return result;
    }
}

And to let Entity Framework use it, register it in the Configuration class, which is usually found in the Migrations folder:

public Configuration()
{
    SetSqlGenerator("System.Data.SqlClient", new MigrationScriptBuilder());

}

The resulting migration script will then contain a GO between statements and a SET QUOTED_IDENTIFIER ON at the beginning of the file.

Well done! Now we need to wire this into the release process.

In general, as part of a release pipeline in Azure DevOps (VSTS/TFS), this is quite simple. We'll need a PowerShell script that prepares and executes the required database migrations. It will look like the following:

param 
(
    [string] [Parameter(Mandatory=$true)] $dbserver,
    [string] [Parameter(Mandatory=$true)] $dbname,
    [string] [Parameter(Mandatory=$true)] $dbserverlogin,
    [string] [Parameter(Mandatory=$true)] $dbserverpassword,
    [string] [Parameter(Mandatory=$true)] $rootPath,
    [string] [Parameter(Mandatory=$true)] $buildAliasName,
    [string] [Parameter(Mandatory=$true)] $contextFilesLocation
)

Write-Host "Generating migration script..."
$fullpath="$rootPath\$buildAliasName\$contextFilesLocation"
Write-Host $fullpath
& "$fullpath\migrate.exe" Context.dll /connectionProviderName="System.Data.SqlClient" /connectionString="Server=tcp:$dbserver.database.windows.net,1433;Initial Catalog=$dbname;Persist Security Info=False;User ID=$dbserverlogin;Password=$dbserverpassword;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" /startUpDirectory=$fullpath /verbose /scriptFile=1.SQL
Write-Host "Running migration script..."
& "SQLCMD" -S "$dbserver.database.windows.net" -U $dbserverlogin@$dbserver -P $dbserverpassword -d $dbname  -i 1.SQL
Write-Host "====Finished with migration script===="

…and add a PowerShell script execution task to the release pipeline. The task and its settings may look like this:


Figure 2: Add PowerShell Execution Block

The PowerShell task settings look like this:


Figure 3: PowerShell Task Settings

You need to supply the following parameters to the PowerShell script:

  • dbserver – server name/address where your target database can be found.
  • dbname – name of your target database.
  • dbserverlogin – user name to connect to the database.
  • dbserverpassword – user password. Server, database, user and password are usually defined as release environment variables; the password can be stored as a secret value.
  • buildAliasName – the name of your build pipeline, which Azure DevOps uses as the name of the target directory. Usually taken from the $(Build.DefinitionName) variable.
  • rootPath – the path to the local directory where all artifacts are copied by Azure DevOps. Usually taken from the $(System.ArtifactsDirectory) variable.
  • contextFilesLocation – the folder inside the build artifacts that contains Context.dll and migrate.exe.

It is important to add the “migrate.exe” file to your project from <your Project>/packages/EntityFramework.6.2.0/tools/ and set its “Copy to Output Directory” property to “Copy Always”, so that the utility is copied to the output directory when you build the project and can be accessed in the Azure DevOps release.

Note: If your project also uses a WebJob, then deploying to Azure App Service is a bit unsafe. We have faced a situation where Azure launches the first available executable file in the folder where your WebJob is published. If your WebJob executable is alphabetically located after “migrate.exe” (as was in our case), then Azure will try to run “migrate.exe” instead of your WebJob executable!

So, now that we have learned how to update the database by generating a script during the release, the other steps are much easier. We should disable migration version checking so that, in case of any failure in the script execution, the older version of our code continues to work.

As I mentioned earlier - your migrations should be non-destructive. To disable validation, you only need to add the following section to Web.config file:

<entityFramework>
    <contexts>
      <context type="<full namespace for your DbContext class>, MyAssembly" disableDatabaseInitialization="true"/>
    </contexts>
</entityFramework>

..where <full namespace for your DbContext class> is the full path, including the namespace, to your DbContext descendant, and MyAssembly is the name of the assembly containing your Context class.

Finally, it is highly desirable to warm up the application before switching users to the new version. To do so, add a special section to the web.config with links that your application automatically follows during initialization:

<system.webServer>
    <applicationInitialization doAppInitAfterRestart="true">
      <add initializationPage="/" hostName="" />
    </applicationInitialization>
</system.webServer>

You could add several links by just adding more lines with the “initializationPage” attribute:

<add initializationPage="/myInit1" />

Azure documentation states that during the swapping of slots, the platform will wait for application initialization and only then will switch traffic to the new version.

What about .NET CORE projects?

In .NET Core, things are much easier and at the same time, different.

Migration script generation is possible using the standard mechanism, but it is performed not on the basis of the compiled assembly, but on the basis of the project file.

Therefore, the script must be generated as part of the build process and must be included as a build artifact.

The script will contain all the SQL commands of all the migrations from the very beginning. This is not a problem, because the script is idempotent, i.e. it can be applied to the target database again without any consequences. This has another useful consequence – we do not need to modify the script generation process to divide commands into batches; everything is already done for us.
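The idempotence comes from guard blocks that EF Core emits around each migration when the script is generated with the --idempotent option. Schematically (the migration id and table name below are illustrative), the generated script looks like this:

```sql
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20190101000000_InitialCreate')
BEGIN
    CREATE TABLE [Orders] (
        [Id] int NOT NULL IDENTITY,
        CONSTRAINT [PK_Orders] PRIMARY KEY ([Id])
    );
END;
GO
```

Migrations already recorded in the __EFMigrationsHistory table are simply skipped on re-runs.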

Here is the step-by-step process setup.

Step 1: Call Entity Framework Core CLI utility for script generation. Just add the appropriate task to the build pipeline:


Figure 4: Add .NET Core task to the build pipeline

Step 2: Set up this task to generate migrations file:


Figure 5: .NET Core Task Settings

(official documentation is here: https://docs.microsoft.com/ru-ru/ef/core/miscellaneous/cli/dotnet#dotnet-ef-migrations-script )

The parameters are pretty clear: you have to specify your project file location, your startup project file (which is usually the same as the project file) and the path to the output SQL script file.
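For reference, the same generation step can be expressed as a plain CLI call (the project path is a placeholder, and $(Build.ArtifactStagingDirectory) is the usual Azure DevOps variable); the --idempotent switch makes the resulting script safe to re-run:

```shell
dotnet ef migrations script \
  --idempotent \
  --project src/MyApp/MyApp.csproj \
  --startup-project src/MyApp/MyApp.csproj \
  --output $(Build.ArtifactStagingDirectory)/migrations.sql
```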

As a result, after the build finishes, you may have the following build artifacts:


Figure 6: .NET Core Artifacts Explorer

Step 3: Your build artifacts should contain an additional PowerShell script (“easycicd.ps1” as seen in Figure 6) which you will use to execute the SQL migration script in the corresponding Release pipeline. This script can be much smaller, taking only the database connection information and the script path as input parameters:

param 
(
    [string] [Parameter(Mandatory=$true)] $dbserver,
    [string] [Parameter(Mandatory=$true)] $dbname,
    [string] [Parameter(Mandatory=$true)] $dbserverlogin,
    [string] [Parameter(Mandatory=$true)] $dbserverpassword,
    [string] [Parameter(Mandatory=$true)] $migrationScript
)

& "SQLCMD" -S "$dbserver.database.windows.net" -U $dbserverlogin@$dbserver -P $dbserverpassword -d $dbname  -i $migrationScript

Please do not forget to add a “PowerShell” task to your Release pipeline to run this script, as described above for EF 6.2.

Conclusion

Using the technique described in this article, you can roll out your ASP.NET applications without any downtime. But notice that even within the same technology family, the process setup turned out to be quite different.

Each development environment requires its own recipe for continuous deployment.

This article was technically reviewed by Daniel Gimenez Garcia.

This article has been editorially reviewed by Suprotim Agarwal.

Author
Pavel is an experienced software architect who has repeatedly demonstrated his ability to complete finance-related software projects. He is a strong information technology professional with a focus on modern cloud technology platforms. His core banking software has been used all over the world, from the US to Papua New Guinea. He has also created a specialized processing system for a national lottery operator.


Page copy protected against web site content infringement 	by Copyscape




Feedback - Leave us some adulation, criticism and everything in between!