For a new data migration project, we have an SSIS SSDT solution with quite a few projects in it. Visual Studio does not allow you to deploy all projects at once with a single mouse click, so I created a simple PowerShell script that does the job. It reads the .sln file, grabs each project, and then pushes the .ispac file into the SSIS catalog. It also creates the catalog itself when required. I only use it to deploy to the dev box, but with some extra logging and options it could even be used within a CI/CD pipeline.
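The core of such a script can be sketched as follows, using the SSIS management assembly. The server name, catalog password, and folder name below are placeholder assumptions, and this sketch simply globs for .ispac files rather than parsing the .sln:

```powershell
# Sketch only: deploy every built .ispac into the SSIS catalog,
# creating the catalog and folder when they do not exist yet.
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null

$conn = New-Object System.Data.SqlClient.SqlConnection "Data Source=localhost;Initial Catalog=master;Integrated Security=SSPI;"
$ssis = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices $conn

# Create the SSISDB catalog if it is not there yet (password is a placeholder)
$catalog = $ssis.Catalogs["SSISDB"]
if (-not $catalog) {
    $catalog = New-Object Microsoft.SqlServer.Management.IntegrationServices.Catalog ($ssis, "SSISDB", "P@ssw0rd!")
    $catalog.Create()
}

# Create a folder for the solution ("MySolution" is a placeholder)
$folder = $catalog.Folders["MySolution"]
if (-not $folder) {
    $folder = New-Object Microsoft.SqlServer.Management.IntegrationServices.CatalogFolder ($catalog, "MySolution", "Deployed by script")
    $folder.Create()
}

# Push each .ispac into the folder
Get-ChildItem -Path . -Filter *.ispac -Recurse | ForEach-Object {
    $projectName = [IO.Path]::GetFileNameWithoutExtension($_.Name)
    [byte[]]$bytes = [IO.File]::ReadAllBytes($_.FullName)
    $folder.DeployProject($projectName, $bytes) | Out-Null
}
```

Deploying the raw bytes via `CatalogFolder.DeployProject` is what makes a one-click (or zero-click) deploy of many projects possible without opening Visual Studio at all.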
DACPACs are a nice way to deploy your SQL database projects. I have been using SSDT database projects for a few years now and could not live without them. They are especially handy during a development cycle, when you deploy (publish) to a local dev instance multiple times per day. Beyond that, deploying your release to UAT and eventually production can be more challenging: some data migrations, for instance, are not supported out of the box.
Recently I came across the need to recreate all database objects for each release. We figured this was the most stable option, and it was possible because the database (which is used for reporting) fetches its data from another database and basically contains only a bunch of views and procedures along with some static data. The in-house deployment solution did not support DACPAC (SqlPackage.exe) based deployments, so it had to be plain SQL.
As I like to have things as automated as possible, I tend to leverage processes such as TFS build. This was the initial approach:
- Build task: fetch two predefined SQL scripts from source control:
- Script 1: drops all desired objects (a full DROP DATABASE was not possible).
- Script 3: reinserts the static data after the DACPAC SQL has been deployed.
- Build task: build the reporting solution (.sln), so it generates a new DACPAC.
- Build task: use SqlPackage.exe to compare the freshly built reporting DACPAC against an empty database and have it output the SQL script (the /Action:Script option). This is script 2.
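That last build task boils down to a single SqlPackage.exe call; the install path, server, and file names below are placeholders for whatever the build agent uses:

```powershell
# Sketch of the scripting build task: diff the freshly built DACPAC
# against an empty database and write the resulting deploy script.
& "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" `
    /Action:Script `
    /SourceFile:"Reporting.dacpac" `
    /TargetServerName:"buildserver" `
    /TargetDatabaseName:"EmptyReporting" `
    /OutputPath:"Script2_Deploy.sql"
```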
We could then have the deployment tooling execute these three scripts in sequence when releasing to UAT/production. Hosting an empty database just for this is overkill, though, so I wanted to compare the reporting DACPAC against an “empty DACPAC” during the build process. Sadly, SqlPackage.exe does not support this. After some research I was relieved to find that the underlying DAC framework APIs do support it. So I ended up creating a simple C# console application, which we are now using during the build process.
If you can use this approach, have a look at: https://github.com/vminds/Daccompare
```csharp
consoleLogger.WriteInfoLine("Generating deploy script");
string script = DacServices.GenerateDeployScript(sourcePac, targetPac, _targetDatabaseName, _deployOptions);
File.WriteAllText(_scriptpath, script);
```
Yet another post on this topic? Well, yes. Historically, deploying what are now called paginated reports relied pretty much on the cumbersome RS.exe. Luckily, Microsoft started a PowerShell-based ReportingServicesTools GitHub repo back in 2016, which is quite an improvement, but I still wanted a fully fledged way to deploy reports based on the actual release build of my SSDT reporting solution. And guess what: it turns out Tim Abell had already created an awesome project for that. It had a few shortcomings for what I wanted to get done, hence I forked the project and added a few features such as:
- Logging integration using the ScriptLogger module
- Log dump of the report folder hierarchy
- A caller script (DeploySSRSSolution.ps1) which invokes the solution deploy
- Removal of RDLs/data sources which are no longer in the project
- Hash comparison of RDL files (OverwriteReports parameter)
- Support for database-based credentials on data sources
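An invocation of the caller script might then look like the sketch below. Apart from OverwriteReports, the parameter names here are assumptions for illustration only; check the repo for the actual signature:

```powershell
# Illustrative only: parameter names other than -OverwriteReports are
# assumptions, not the script's confirmed interface.
.\DeploySSRSSolution.ps1 `
    -SolutionFile ".\Reporting.sln" `
    -ReportServerUri "http://reportserver/ReportServer" `
    -OverwriteReports $false
```

With the hash compare in place, leaving OverwriteReports off means unchanged RDLs are skipped, which keeps repeat deployments fast.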
Eventually I will refactor it to make full use of the Reporting Services Tools from Microsoft.
Check it out here: https://github.com/vminds/ssrs-powershell-deploy