
Microsoft and Database Lifecycle Management (DLM): The DacPac

By: Phil Factor
To read more DBA articles, visit http://dba.fyicenter.com/article/

When you need to extract data and metadata from an existing database, deploy a database, report on changes, script out a database, compare databases or perhaps drift-check, it is often useful to have a standard package that contains everything about the database that you'll need: This means scripts, data, components, metadata and more.

In Database Lifecycle Management (DLM), the scripts and any other components of a particular version of a database can be usefully packaged up together into a file that becomes a release artefact. The canonical source is, as it should be, in source control, but the release artefact is a practical convenience because it has everything required for a clean build, but on the strict condition that it is 'versioned'.

You can, of course, put everything in a file directory structure but that's only useful if you have file-system access to the file server from the location of the installation, and that isn't always possible. Instead of a directory, we tend to use the NuGet .ZIP package format, because it is an open-source standard for application releases, and includes conventions for pre- and post-deployment scripting in PowerShell. This approach makes sense particularly where a database has a lot of non-scriptable components such as SSIS packages, or where permissions and server-specific settings need to be applied.

This article will describe Microsoft's approach to providing the database release artefact, called a DacPac: Microsoft's own package structure, which is manipulated via its DacFx (Data-Tier Application Framework) technology. The DacPac is a .ZIP-based ('package-based') approach to database deployment, which strikes a chord with developers of reasonably small databases that are closely associated with a single application, but which has broader practical uses.

Working with DacFx and DacPacs
The DAC approach has been around since SQL Server 2008R2 (v1.0) and the current version at the time of writing is DacPac v3 for SQL Server 2012 and 2014. Earlier versions were restricted to supporting only the cut-down feature set of the original SQL Azure, and had a number of weaknesses that have now mostly been ironed out. The APIs have changed considerably, as a result. It is a good idea to ensure that you have the latest version.

The DAC .NET class library is used by SSDT for deployments that update existing databases, as well as for modelling and validating database schemas. It used to be in the Microsoft.SqlServer.Management.Dac namespace, but nowadays is more separated from SMO, in Microsoft.SqlServer.Dac. Some essential functionality, such as getting the current version number from a registered database, got lost in the move, and the API changed significantly; as a result, there is currently no simple method for getting a version number from a database. There is a DacServices API to create and use DacPac files for deployment-related tasks such as examining database schemas, modifying the deployment plan or doing Static Code Analysis.

You can work with DacPacs using one of the following:
Visual Studio or SSMS wizards
Visual Studio VSSC
The managed API (the DacFx namespace) that is distributed with SQL Server, used from a language such as F#, PowerShell or C#
An MSBuild Task
A command-line utility called SQLPackage.exe

The DacFx namespace and the SQLPackage.exe/DacUnpack.exe command-line utilities are intended to match feature-for-feature. You choose whichever is more convenient. DacFx provides the grunt whichever alternative you use.
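As a rough illustration of the command-line route, here is a minimal Python sketch that builds an argument list for a SqlPackage.exe Extract action (which reads the schema of a live database and writes it out as a .dacpac). The server, database and file names are hypothetical, and SqlPackage.exe must be installed and on the path for the actual call to succeed:

```python
import subprocess

def sqlpackage_extract_args(server, database, dacpac_path):
    """Build the argument list for a SqlPackage.exe Extract action.

    The switch names follow the documented /Action:Extract syntax;
    adjust the source switches for your authentication scheme.
    """
    return [
        "SqlPackage.exe",
        "/Action:Extract",
        f"/SourceServerName:{server}",
        f"/SourceDatabaseName:{database}",
        f"/TargetFile:{dacpac_path}",
    ]

# Hypothetical names; only run the commented-out line on a machine
# where SqlPackage.exe is installed.
args = sqlpackage_extract_args("MyServer", "AdventureWorks",
                               r"C:\temp\AdventureWorks.dacpac")
# subprocess.run(args, check=True)
```

The same pattern works for the other actions (Publish, Export, Script and so on); only the /Action: switch and its associated source/target switches change.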

What is a DacPac?
The 'portable' version of a database design is a DacPac, which is merely a zipped directory. It is described by the pundits as a 'self-contained database entity used by an application', representing the 'declarative model' of the database, which can be authored alongside application code.
A DacPac can be built from scratch, created from a Visual Studio 2010 Data-Tier project, compiled from source code, or extracted from an existing database. We can deploy a DacPac to either Azure SQL Database or SQL Server, either as a new database or to upgrade an existing database. In the zipped directory is a series of files called 'parts'. All DacPacs contain at least:
An XML file that represents an XML 'model' of the database's metadata
A file containing information about the DAC framework that produced the DacPac file
The fact that a DacPac contains an XML 'model' of the metadata means that you can find out what's in the database. It also means that an application can use a DacPac for metadata comparisons and synchronization without having to parse a build script. The model is like a document-based object-oriented version of the INFORMATION_SCHEMA views, equivalent to Redgate's 'SQL Compare Snapshots' produced by SQL Snapper. If your comparison tool, like SQL Compare, can read a build script or collection of object scripts just as easily as it can a live database or backup, then you don't need the model.

Zipping all this information up is a good idea because the XML-based 'declarative model' is huge. Once unzipped, it is easy to extract the information from the model with XPath queries. Optionally, a DacPac may also include:
An XML-based refactor log for auditing changes
Pre- and post-deployment SQL scripts to take care of the creation of any build components that cannot be included in the update script
Data – if included, this is stored in JSON format, a directory per table
Any other contributor artefacts, such as SSIS packages
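To sketch that XPath-based extraction: because the package is an ordinary zip, any zip library can open it and query the model directly. The element names and the Serialization/2012/02 namespace below reflect the layout of recent DacPacs, so treat them as assumptions to verify against your own package:

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespace used by the DacPac model document (an assumption to check
# against your own package; it has varied between DacFx versions).
NS = {"dac": "http://schemas.microsoft.com/sqlserver/dac/Serialization/2012/02"}

def list_tables(dacpac):
    """Open a DacPac as a plain zip, parse model.xml, and return the
    names of every element declared with Type="SqlTable"."""
    with zipfile.ZipFile(dacpac) as pkg:
        with pkg.open("model.xml") as f:
            root = ET.parse(f).getroot()
    # XPath: every <Element Type="SqlTable"> anywhere under the root.
    return [el.get("Name")
            for el in root.iterfind(".//dac:Element[@Type='SqlTable']", NS)]
```

Nothing here needs DacFx itself; for anything beyond reading the model, the DacServices and Package APIs remain the supported route.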

You can unzip a DacPac simply by right-clicking the file, if DacFx is installed on the machine. If you are scripting, then it can be done from the DacFx .NET library. There is even a Package API within DacFx, to inspect the package parts in more detail. The advantage of these approaches over a simple 'unzip' is that they create a build script 'Model.SQL' whilst unpacking the DacPac. If you have specified data as well as tables, you'll see a data directory with a subdirectory for every table, containing one or more files of JSON data.
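Where DacFx isn't available, a plain unzip still gets you at the raw parts, though without the generated Model.SQL build script. A minimal sketch, assuming nothing beyond the standard library:

```python
import zipfile

def unpack_dacpac(dacpac, target_dir):
    """Unpack every part of a DacPac into target_dir and return the
    part names. Unlike DacFx's unpack, this produces no Model.SQL
    build script; you get only the raw parts (model.xml, the data
    directories, and so on)."""
    with zipfile.ZipFile(dacpac) as pkg:
        pkg.extractall(target_dir)
        return pkg.namelist()
```

This is handy for drift-checking or auditing on a machine where installing DacFx isn't an option.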
