ER Diagram Software Options for SQL Server

I’ve seen this come up several times recently in the SQL Server community. There are diagrams built into SSMS, but they’re a bit limited in their usefulness: you can’t easily break them down into sub-models or tweak them without affecting the underlying database. There are third-party diagramming options for SQL Server, some free and some paid.

Simple-Talk ran an article calling for people to submit their favorite ERD software.

Database Answers has a page dedicated to modelling tools. I’m not sure when the page was last updated, but as far as I know the author is still maintaining it, as it points to some more recent offerings.

Database Answers also pointed to a Wikipedia page on Data Modelling Tools.

Robert Sheldon has a great article on Simple-Talk reviewing online ER software offerings. Vertabelo gets his top review, but others are mentioned.

I’ve used ER/Studio for years and generally prefer it to ErWin, though that preference is probably moot at this point, seeing that Embarcadero bought ErWin. It’s a pretty expensive piece of software, though, and not everyone can afford the licenses needed to run it. I hope the pointers to these other lists will help people find the right tool for their needs.

SSIS Catalog (SSISDB) Cleanup

I really appreciate what MS has done with the SSIS Catalog. We have built-in logging at a level that wasn’t possible in prior releases, but that comes at a cost. The default retention is pretty high (365 days), and MS chose to handle cleanup using cascading deletes. That sort of coding makes life easier for the developers, but it almost always performs poorly, especially when you have one parent row with hundreds of thousands of child rows related to it. The problem is compounded because the first time people realize they need to lower the retention limit is about the same time the database is starting to fill up, or already has. At that point, it’s too late to lower the retention by a large number in one step, because the corresponding delete will still cause issues.

I knew someone had written some code for this, and a quick search on StackOverflow found this question and the answer from Bill Fellows ( b | t ):
http://stackoverflow.com/questions/21781351/how-can-i-clean-up-the-ssisdb

I’m including the code here for quick reference, though I recommend reading his answer, up-voting it, and checking out his other resources. The only change I’d consider making is clearing out the child tables in batches of 1,000-5,000 rows at a time instead of all at once. I found that I had to work in increments of no more than 10 days at a time when deleting history; any more than that and the execution time increased considerably. Look at the maximum days currently retained, set the @DaysRetention variable to a couple of days less than that, and see what is removed. Keep lowering it until you reach your target, set your SSIS Catalog retention days to that value, and you should be good from that point on.
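Bill’s full script is in his answer above. The batching tweak I mentioned would look something like this for one of the child tables (a sketch only; it assumes the stock SQL 2012-era SSISDB schema, and you’d repeat the pattern for internal.event_messages, internal.operation_messages, and the other child tables):

DECLARE @DaysRetention INT = 100;
DECLARE @CutoffDate DATETIME = DATEADD(DAY, -@DaysRetention, GETDATE());
DECLARE @RowsDeleted INT = 1;

-- Delete in small batches so the locks and log growth stay manageable
WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (5000) emc
    FROM internal.event_message_context AS emc
        JOIN internal.operations AS o
            ON o.operation_id = emc.operation_id
    WHERE o.created_time < @CutoffDate;

    SET @RowsDeleted = @@ROWCOUNT;
END;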

It’s probably worth noting that SQL Server 2012 didn’t ship with proper indexing on the SSISDB tables by default. You may need to add some indexes for the deletes to perform in a timely manner and avoid excessive table scans. The indexing was addressed in SQL Server 2014 and higher (and, I believe, in SQL Server 2012 SP2 as well).
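If you’re on one of the unindexed versions, something along these lines on the child tables’ operation_id columns helps the deletes find their rows without scanning (a sketch; check your execution plans before and after):

CREATE NONCLUSTERED INDEX IX_event_message_context_operation_id
    ON internal.event_message_context (operation_id);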

Troubleshooting Database Mail on SQL Server 2016

tl;dr – Install the .NET Framework 3.5 feature on your server.

I was setting up a new SQL Server 2016 machine today and going through the normal configuration: setting permissions, setting backup/data locations, configuring Database Mail for alerts, and so on. I had no issues until I started testing Database Mail and got … nothing. No emails received, nothing in the logs, a vague hint that there were 3 pending entries in an external queue, but overall, nothing. I hit the MS troubleshooting page for Database Mail and verified my configuration was correct. I tested connectivity to the SMTP server, verified the firewall settings, and even TELNET’d to the box.
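For reference, these standard msdb views and procs are where that little bit of signal came from:

-- Pending items in the mail queue
EXEC msdb.dbo.sysmail_help_queue_sp @queue_type = 'mail';

-- Database Mail log and the status of each mail item
SELECT * FROM msdb.dbo.sysmail_event_log ORDER BY log_date DESC;

SELECT mailitem_id, sent_status, send_request_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;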

I finally pulled up Sean McCown’s troubleshooting guide for Database Mail and started stepping through it. When I got to the “run the DatabaseMail.exe app” step, I was greeted by a big message telling me that the .NET Framework v3.5 couldn’t be installed. That rang several bells; I remembered reading that SQL Server 2016 still requires .NET 3.5 in order to use Database Mail. I added the feature and kicked off the app again to see if that would work. No errors, and all queued mails were received shortly afterwards.
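For the record, adding the feature from PowerShell is a one-liner on Windows Server 2012 R2 (you may need to point -Source at your install media if the feature payload isn’t cached locally):

Install-WindowsFeature NET-Framework-Core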

Quick thanks to Sean for the very thorough guide, complete with examples, and for pointing me in the right direction. I’ll definitely remember this for the next 2016 server I need to configure, and I’ll make sure our Ops team adds the framework to their server config scripts for SQL Server.

Long-Running Queries and Extended Events

This is something I’ve wanted to investigate for a while now. I knew you could use Profiler and set up server-side traces to capture long-running queries, but I was curious how to do the same with Extended Events. I came across this post from Pinal Dave ( b | t ) that pointed me in the right direction. I followed the guidelines he suggested but had trouble finding the “Duration” filter. It turns out I had a bit too much selected in my filtering options (or perhaps the wizard was giving me fits); once I selected just the Batch Completed and RPC Completed events, I could see and set the Duration filter. The one change I’d make to Dave’s script is to set the duration to 5,000,000, because Duration in SQL 2012 is measured in microseconds, not milliseconds, and I want to capture queries running longer than 5 seconds.

I’m including my script below for tracking queries taking longer than 5 seconds.
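The shape of it is as follows (the session name and file target are placeholders you’ll want to adjust; the duration predicate is in microseconds):

CREATE EVENT SESSION [LongRunningQueries] ON SERVER
ADD EVENT sqlserver.rpc_completed (
    ACTION (sqlserver.sql_text, sqlserver.database_id, sqlserver.username)
    WHERE duration >= 5000000),  -- 5 seconds
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.sql_text, sqlserver.database_id, sqlserver.username)
    WHERE duration >= 5000000)
ADD TARGET package0.event_file (SET filename = N'LongRunningQueries')
WITH (MAX_DISPATCH_LATENCY = 30 SECONDS, EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS);
GO

ALTER EVENT SESSION [LongRunningQueries] ON SERVER STATE = START;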

SQL Server 2016, Database Mail and .Net 3.5 Framework

I came across this post in the SQL Server Slack chats and wanted to share it so others don’t fall into this trap. Not having .NET 3.5 installed with SQL 2016 means no Database Mail and, worse, no errors or alerts that it isn’t working. I hope MS will address this a bit more if they keep saying we don’t need .NET 3.5 for SQL 2016. In the meantime, if you want Database Mail, you’ll need the framework, too.

SQLSwimmer

There were so many cheers when Microsoft announced that the .Net 3.5 Framework was no longer a pre-requisite for installing SQL Server starting with SQL Server 2016. Folks were so excited, until they started testing certain pieces of the product, specifically Database Mail. That’s right, if you want to use Database Mail in SQL Server 2016 you have to install the .Net 3.5 Framework.

If you are installing SQL Server 2016 on Windows Server 2012 R2, this might be a problem for two reasons. First, the .Net 3.5 Framework is no longer included by default when spinning up a new install. Second, you will get no errors when testing Database Mail in SQL Server 2016. Mail will be queued, but will not be sent and there will be no errors in the Database Mail log, the SQL Agent log or even the SQL Server error log.

So if you’ve gone…


Getting All Permissions for Server

We recently had to work out a way to fetch and display all effective permissions granted on our production SQL Server. I had been required to get all permissions at the database level before, but even that tended to be more granular, showing only which roles you belonged to and which explicit permissions you’d been granted. We wanted to run something through a loop of all users and get that information into a report. It turned out to be easier than I’d expected, especially after my first attempts to dig through the system tables.

We used a script something like the following:

EXECUTE AS LOGIN = 'DomainName\Login';
SELECT 'DomainName\Login' AS Login, * FROM fn_my_permissions(NULL, 'SERVER');
REVERT;

With a little work this can be used within PowerShell or SQL to loop through all logins and capture the information in a table or report.
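Here’s a sketch of the SQL flavor (the temp table and cursor names are mine; filter the principal types to taste):

-- Capture effective server permissions for every enabled SQL/Windows login
CREATE TABLE #LoginPermissions (
    LoginName       SYSNAME,
    entity_name     NVARCHAR(256),
    subentity_name  NVARCHAR(256),
    permission_name NVARCHAR(128));

DECLARE @Login SYSNAME, @sql NVARCHAR(MAX);

DECLARE LoginCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name
    FROM sys.server_principals
    WHERE type IN ('S', 'U')   -- SQL logins and Windows logins
      AND is_disabled = 0;

OPEN LoginCursor;
FETCH NEXT FROM LoginCursor INTO @Login;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'EXECUTE AS LOGIN = ' + QUOTENAME(@Login, '''') + N';
INSERT #LoginPermissions
SELECT ' + QUOTENAME(@Login, '''') + N', * FROM fn_my_permissions(NULL, ''SERVER'');
REVERT;';
    EXEC (@sql);
    FETCH NEXT FROM LoginCursor INTO @Login;
END;
CLOSE LoginCursor;
DEALLOCATE LoginCursor;

SELECT * FROM #LoginPermissions ORDER BY LoginName, permission_name;
DROP TABLE #LoginPermissions;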

The function can also be used at the database level by switching to the database in question and substituting DATABASE for SERVER above:

USE AdventureWorks2012;
EXECUTE AS LOGIN = 'DomainName\Login';
SELECT 'DomainName\Login', * FROM fn_my_permissions(NULL, 'DATABASE');
REVERT;

For an interesting take on revamping sp_msforeachdb, check out this script from Edward Pollack (t). It might help you if you need to loop through all of your databases to run something like this for your auditors.

SQL Server Audits and Action_IDs

We were recently re-doing our SQL Server Audits, and I was reminded again how painful setting the filters can be. MS expects an integer for “action_id”, but to actually use them you need to know what those integers mean.

I came across this blog post by Chris Provolt listing the text versions of the action_ids. That was helpful, especially the quick query to see what’s available:

SELECT DISTINCT action_id, name, class_desc, parent_class_desc FROM sys.dm_audit_actions;

However, as you can tell by running this, the action_ids returned are all text values. That doesn’t help when trying to set up your SQL Audit filters.

MSDN provides code for a function to translate the text action_id into the expected numeric value:

CREATE FUNCTION dbo.Getint_action_id (@action_id VARCHAR(4))
RETURNS INT
AS
BEGIN
    DECLARE @x INT;

    SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 1, 1))));

    IF LEN(@action_id) >= 2
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 2, 1)))) * POWER(2, 8) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 8) + @x;

    IF LEN(@action_id) >= 3
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 3, 1)))) * POWER(2, 16) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 16) + @x;

    IF LEN(@action_id) >= 4
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 4, 1)))) * POWER(2, 24) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 24) + @x;

    RETURN @x;
END;

Once you create that function, you can use it to generate a list of the integer action_ids, the text action_ids, and the name/description of each. (You can also expand the query to see which actions are applicable to objects at the server and database level if you so desire.)

SELECT DISTINCT dbo.Getint_action_id(action_id) AS Action_ID_Int,
       action_id,
       name AS Action_Description
       --, class_desc, parent_class_desc
FROM   sys.dm_audit_actions
ORDER  BY action_id;

This will result in the following values (as of SQL 2012):

Action_ID_Int action_id NAME
1329873729 ACDO DATABASE_OBJECT_ACCESS_GROUP
542065473 ACO SCHEMA_OBJECT_ACCESS_GROUP
1329742913 ADBO BULK ADMIN
1346651201 ADDP DATABASE_ROLE_MEMBER_CHANGE_GROUP
1347634241 ADSP SERVER_ROLE_MEMBER_CHANGE_GROUP
538987585 AL ALTER
1313033281 ALCN ALTER CONNECTION
1397902401 ALRS ALTER RESOURCES
1397967937 ALSS ALTER SERVER STATE
1414745153 ALST ALTER SETTINGS
1381256257 ALTR ALTER TRACE
1280462913 APRL ADD MEMBER
538989377 AS ACCESS
1129534785 AUSC AUDIT SESSION CHANGED
1179866433 AUSF AUDIT SHUTDOWN ON FAILURE
1213486401 AUTH AUTHENTICATE
538984770 BA BACKUP
541868354 BAL BACKUP LOG
1111773762 BRDB BACKUP_RESTORE_GROUP
1179595331 C2OF TRACE AUDIT C2OFF
1313813059 C2ON TRACE AUDIT C2ON
1196180291 CCLG CHANGE LOGIN CREDENTIAL
1196182851 CMLG CREDENTIAL MAP TO LOGIN
1430343235 CNAU AUDIT_CHANGE_GROUP
538988355 CO CONNECT
538988611 CP CHECKPOINT
538989123 CR CREATE
538976324 D DENY
1179074884 DAGF FAILED_DATABASE_AUTHENTICATION_GROUP
1279738180 DAGL DATABASE_LOGOUT_GROUP
1397178692 DAGS SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP
1178681924 DBAF DATABASE AUTHENTICATION FAILED
1396785732 DBAS DATABASE AUTHENTICATION SUCCEEDED
1128481348 DBCC DBCC
1195590212 DBCG DBCC_GROUP
541868612 DBL DATABASE LOGOUT
538987588 DL DELETE
1280462916 DPRL DROP MEMBER
538989124 DR DROP
541284164 DWC DENY WITH CASCADE
538990661 EX EXECUTE
538989638 FT FULLTEXT
541545542 FTG FULLTEXT_GROUP
538976327 G GRANT
1111773767 GRDB DATABASE_PERMISSION_CHANGE_GROUP
1329877575 GRDO DATABASE_OBJECT_PERMISSION_CHANGE_GROUP
542069319 GRO SCHEMA_OBJECT_PERMISSION_CHANGE_GROUP
1330860615 GRSO SERVER_OBJECT_PERMISSION_CHANGE_GROUP
1448301127 GRSV SERVER_PERMISSION_CHANGE_GROUP
541546311 GWG GRANT WITH GRANT
1346653513 IMDP DATABASE_PRINCIPAL_IMPERSONATION_GROUP
542133577 IMP IMPERSONATE
1347636553 IMSP SERVER_PRINCIPAL_IMPERSONATION_GROUP
538988105 IN INSERT
541214540 LGB BROKER LOGIN
1195525964 LGBG BROKER_LOGIN_GROUP
1094993740 LGDA DISABLE
1111770956 LGDB CHANGE DEFAULT DATABASE
1095059276 LGEA ENABLE
1279674188 LGFL FAILED_LOGIN_GROUP
1179207500 LGIF LOGIN FAILED
1397311308 LGIS LOGIN SUCCEEDED
1196181324 LGLG CHANGE DEFAULT LANGUAGE
541935436 LGM DATABASE MIRRORING LOGIN
1196246860 LGMG DATABASE_MIRRORING_LOGIN_GROUP
1296975692 LGNM NAME CHANGE
542066508 LGO LOGOUT
1146308428 LGSD SUCCESSFUL_LOGIN_GROUP
538988364 LO LOGOUT_GROUP
1111772749 MNDB DATABASE_CHANGE_GROUP
1329876557 MNDO DATABASE_OBJECT_CHANGE_GROUP
1346653773 MNDP DATABASE_PRINCIPAL_CHANGE_GROUP
542068301 MNO SCHEMA_OBJECT_CHANGE_GROUP
1330859597 MNSO SERVER_OBJECT_CHANGE_GROUP
1347636813 MNSP SERVER_PRINCIPAL_CHANGE_GROUP
1196182862 NMLG NO CREDENTIAL MAP TO LOGIN
538988623 OP OPEN
1111773263 OPDB DATABASE_OPERATION_GROUP
1448300623 OPSV SERVER_OPERATION_GROUP
1380013904 PWAR APPLICATION_ROLE_CHANGE_PASSWORD_GROUP
541284176 PWC CHANGE PASSWORD
1195595600 PWCG LOGIN_CHANGE_PASSWORD_GROUP
1396922192 PWCS CHANGE OWN PASSWORD
1480939344 PWEX PASSWORD EXPIRATION
1129142096 PWMC MUST CHANGE PASSWORD
1280333648 PWPL PASSWORD POLICY
542267216 PWR RESET PASSWORD
1397905232 PWRS RESET OWN PASSWORD
542463824 PWU UNLOCK ACCOUNT
538976338 R REVOKE
538985298 RC RECEIVE
538986066 RF REFERENCES
538989394 RS RESTORE
541284178 RWC REVOKE WITH CASCADE
541546322 RWG REVOKE WITH GRANT
538987603 SL SELECT
538988115 SN SEND
1313624147 SPLN SHOW PLAN
1448301651 STSV SERVER_STATE_CHANGE_GROUP
1313953107 SUQN SUBSCRIBE QUERY NOTIFICATION
1313035859 SVCN SERVER CONTINUE
1146115667 SVPD SERVER PAUSED
1146312275 SVSD SERVER SHUTDOWN
1381193299 SVSR SERVER STARTED
1095975252 TASA TRACE AUDIT START
1347633492 TASP TRACE AUDIT STOP
538988372 TO TAKE OWNERSHIP
1111773012 TODB DATABASE_OWNERSHIP_CHANGE_GROUP
1329876820 TODO DATABASE_OBJECT_OWNERSHIP_CHANGE_GROUP
542068564 TOO SCHEMA_OBJECT_OWNERSHIP_CHANGE_GROUP
1330859860 TOSO SERVER_OBJECT_OWNERSHIP_CHANGE_GROUP
1195594324 TRCG TRACE_CHANGE_GROUP
542069332 TRO TRANSFER
1346847573 UCGP USER_CHANGE_PASSWORD_GROUP
1195459669 UDAG USER_DEFINED_AUDIT_GROUP
1430340693 UDAU USER DEFINED AUDIT
538988629 UP UPDATE
1178686293 USAF CHANGE USERS LOGIN AUTO
1196184405 USLG CHANGE USERS LOGIN
1129599829 USTC COPY PASSWORD
1414743126 VDST VIEW DATABASE STATE
1414746966 VSST VIEW SERVER STATE
1413699414 VWCT VIEW CHANGETRACKING
538984792 XA EXTERNAL ACCESS ASSEMBLY
538989912 XU UNSAFE ASSEMBLY
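Once you have the integer value, the audit filter itself is straightforward. For example, to capture only SELECT activity (action_id “SL”), something like this works (the audit name and file path are illustrative):

CREATE SERVER AUDIT Audit_Selects_Only
TO FILE (FILEPATH = N'D:\SQLAudit\')
WHERE action_id = 538987603;  -- dbo.Getint_action_id('SL') = SELECT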

SSAS Tabular and Excel PowerPivot

I don’t know why it took me so long to make the connection between PowerPivot models and SSAS Tabular models, but in the last couple of days I came across several blogs and videos tying the two together. With that in mind, I took the plunge into PowerPivot and built a quick model off of some SQL queries to test out a concept.

I started with the basics of mocking up a fact table. I stripped out decorator text, “renamed” a couple of columns in the query, filtered the data so I wouldn’t pull a crazy amount back to my local workstation, verified the counts and the look of the data in a SQL query, and added it to the model. Excel loaded 1.2M rows quickly and didn’t blink.

After that, I knew I needed a couple of Dimension-like tables. I wrote queries for those in a similar fashion and added the data again. Each addition went smoothly. I was able to add the proper relationships quickly to tie everything together.

To finalize the model, I created a couple of hierarchies to make querying easier. After messing around with these several times in SSAS Multi-Dimensional, I was shocked at how easy it was to create a hierarchy for dates and even certain domain-specific entries in our data: Right-click, create new hierarchy, drag and drop columns. Done.

After the model was finished, I dropped the relevant fact data into a Pivot Table and could easily analyze the data in all sorts of interesting ways. I had worked with OLAP for quite some time to arrive at a similar concept. With PowerPivot, I wrote a couple of queries, linked the tables, and had something workable within an hour. As an added benefit, I know I can scale it up to an SSAS server without too much trouble and have a similar experience.

Hat tip to the folks @ www.powerpivotpro.com and www.sqlbi.com for the many articles and pointers to get me started and over that initial hump.

SSIS, MySQL, and invalid Dates–Take #2

A while back I posted an article on handling dates in SELECT statements from MySQL. The idea was to have a quick way to generate SELECT statements for MySQL that would automatically set any invalid date to NULL (an invalid date being one such as “0000-00-00” or “1973-02-00”). After some more trial and error, I realized that the script was incomplete, so I’ve created an updated script that better handles those edge cases.

The script is largely the same as the former, but I’ve added handling for invalid years, months, or days. We don’t want to redirect the rows for further error handling; we want to treat the dates that won’t successfully import as NULL. This could be done several ways within SSIS, but handling the transform within the SELECT statement saves us from writing a separate transform for each date column in each table. (I suspect that BIML could be my friend for this task, but I’m not quite ready to tackle it with BIML.)

SELECT CONCAT(CASE
         WHEN ordinal_position = 1 THEN 'SELECT '
         ELSE ', '
       END
     , CASE data_type
         WHEN 'date'
           THEN CONCAT('CASE WHEN ', column_name, ' = ''0000-00-00'' THEN NULL '
                     , 'WHEN ', column_name, ' LIKE ''%0000%'' THEN NULL '
                     , 'WHEN ', column_name, ' LIKE ''%-00%'' THEN NULL '
                     , 'ELSE ', column_name, ' END AS ', column_name)
         WHEN 'timestamp'
           THEN CONCAT('CASE WHEN ', column_name, ' = ''0000-00-00'' THEN NULL '
                     , 'WHEN ', column_name, ' LIKE ''%0000%'' THEN NULL '
                     , 'WHEN ', column_name, ' LIKE ''%-00%'' THEN NULL '
                     , 'ELSE ', column_name, ' END AS ', column_name)
         WHEN 'time'
           THEN CONCAT('CAST(', column_name, ' AS CHAR(8)) AS ', column_name)
         ELSE column_name
       END
     , CASE WHEN ordinal_position <> maxord THEN ''
         ELSE CONCAT('     FROM MySchema.', c.table_name, ';')
       END) AS Select_Column_Name
FROM   information_schema.columns AS c
       JOIN (SELECT table_name,
                    MAX(ordinal_position) AS maxord
             FROM   information_schema.columns
             WHERE  table_schema = 'MySchema'
             GROUP  BY table_name) AS t1
         ON c.table_name = t1.table_name
WHERE  c.table_schema = 'MySchema'
ORDER  BY c.table_name, ordinal_position
LIMIT  0, 50000;

SSDT: Setting Different Permissions per Environment

One of the areas that SSDT doesn’t address adequately is permissions. Many shops have different users, logins, and permissions set up across their environments, but SSDT takes a “one size fits all” approach and assumes that you will have the exact same configuration in each environment. While that should be the case for your schema, permissions require some interesting and somewhat involved techniques to handle well in SSDT.
Attribution: I’m borrowing heavily from Jamie Thomson’s ( blog | twitter ) post “A strategy for managing security for different environments using the Database Development Tools in Visual Studio 2010”. Jamie addressed this problem for the older DB Projects (DataDude, VSTSDB, DBPro, etc.) and came up with an excellent, if somewhat complex, solution for pushing different permissions to different environments. This is my attempt to update that approach and distill it into a “how to” for SSDT and SQL Projects, but I freely admit that it is based on Jamie’s excellent work.
In general, the steps we’ll want to take are:

  1. Generate a set of SQL Logins that will be used on the server. (These should be run on the servers or local box already – we will not use these within the SQL Project itself, though you can if you wish.)
  2. Generate the files for users/roles/permissions for each database project and environment.
  3. Add the files to your SSDT SQL Project.
  4. Create and set a variable for your environments in your SSDT SQL Project.
  5. Adjust your Security Wrapper file to handle your environments.
  6. Adjust your post-deploy script to call the Security Wrapper.

First, let’s look at what happens when you import permissions into an SSDT SQL Project by default.
As you can see from the screenshot, we have a “Permissions” file containing granted permissions, a “RoleMemberships” file that adds users to various roles, a “Sales” file for the Sales schema, and three files for each user: one for its schema, one for its login, and one for its user. Confusing, but not necessarily a problem until you need a release where User1 does not and will not exist on the server, or where User2 belongs to a different domain than your production servers and can’t (and shouldn’t) be created there because there’s no trust relationship.
Before we start, let’s remove all of those files for objects other than schemas from the project. If you’re not sure which file is which, you’ll probably have to look at each one to verify before removing it. If you accidentally remove the files for the schemas, you’ll get errors and will need to re-add a “Schema” to the Security folder for each schema that is used.
 
Generate SQL Logins
This Microsoft KB article details the sp_help_revlogin stored procedure. Create it on your source server, run it, and use the output to re-create those logins elsewhere. This will copy all of the logins, but none of the permissions or database users.
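Once the procedure is created in master, generating the script is a single call; the output is a set of CREATE LOGIN statements, complete with hashed passwords and original SIDs, that you can run on the target server:

EXEC sp_help_revlogin;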
 
Generate the Security Files through PowerShell
First, download the Powershell_GeneratePermissions.zip file. Extract that to a folder on your hard drive.
Note – the script assumes that you will use Windows authentication to connect to your SQL Server. The script will need to be adjusted if you need to use SQL Authentication.
Edit the “GeneratePermissions.ps1” file. Replace “DB1”, “DB2”, etc., with the database names you’ll be using, and uncomment any of the extra lines you need. (There is likely a better way to do this in PowerShell, and I’m more than open to suggestions on how to build up this array more concisely.)
The script assumes that the database name matches the project name. If that’s not the case, adjust your DB/project names accordingly.
Open a PowerShell prompt in that folder after editing/saving the file.
Run the file using something like this commandline:
.\GeneratePermissions.ps1 -SQLInstance "localhost" -Environment DEV
“localhost” should be replaced with your server name.
DEV in this case indicates that this will be tied to a variable value of “DEV” when the project is published.
I’d recommend running this for every environment at one time if you can. It will make adding the files to the project much easier.
If you run for several environments, you’ll see a folder structure something like the following for each project (in this case for Adventureworks2012):
Inside your “SecurityAdditions” folder, you’ll see the following files and folders:
Each of these files will be used to add permissions for each environment. In this case, we have permissions set for our Development, QA, Local, Production, and Staging environments.
Now, let’s add the “SecurityAdditionsWrapper.sql” file from the files we extracted earlier into your “SecurityAdditions” folder. Edit it as needed to match the environment(s) you want to use for your database project. By default, it looks for the files above plus a “Default” file as a catch-all. Adjust it so all files/environments match the files you created, and save it.
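To give a feel for its shape, the wrapper is essentially a series of includes guarded by the DeployType variable, something like this sketch (the file names are illustrative; the real file is in the download, and note that files included inside IF blocks can’t contain GO statements):

IF '$(DeployType)' = 'DEV'
BEGIN
    PRINT 'Applying DEV security';
    :r .\DEV\SecurityAdditions.sql
END

IF '$(DeployType)' = 'Production'
BEGIN
    PRINT 'Applying Production security';
    :r .\Production\SecurityAdditions.sql
END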
 
Add the files to your SSDT SQL Project
Copy your newly created folder(s) to the appropriate project. Open each project, and you will notice that the files still don’t show up. Click the project file, and you should see a newly enabled menu item to “Show All Files”.
You’ll see a bunch of files that aren’t part of the project under a “Scripts” folder:
We want to add them to the project. Right-click the “Scripts” folder and select the option to “Include in Project”. Click the Project file again and turn off the option to show all files.

  • We need to set all file properties to “Not in Build” or there will be errors when trying to build the project.

Unfortunately, we can’t just select the folder to exclude it from the build; we need to select the files. You can Ctrl+click to select multiple files. Select all of the files, open the Properties tab, and change the option from “Build” to “None”.
Create and set your Variable to change environments
Open the Project Properties window.
We want to add a new SQLCMD variable called “DeployType”. Set its default value to “Local” for now, or to whatever environment you have a set of security scripts for.
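When publishing, you can override the variable per environment; with SqlPackage.exe that looks something like this (server and file names are illustrative):

SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:ProdServer /TargetDatabaseName:MyDatabase /v:DeployType=Production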
 
Adjust your Security Wrapper files
The SecurityAdditionsWrapper.sql file in the code set I’ve created comes with settings for “Dev”, “QA”, “Staging”, “Production”, “Local”, and “Default”. It’s likely that you will not use all of these in your setup. One drawback to using SQLCMD to wrap all these files together is that if you don’t have a file or set of files, the script will throw errors because an expected file doesn’t exist. Edit the SecurityAdditionsWrapper.sql file to remove any of the configurations you will not use, or change the names to match the environment values you used earlier.
 
Adjust your Post-Deploy script
Finally, open your post-deploy script and add the following:
:r .\SecurityAdditions\SecurityAdditionsWrapper.SQL
This will include your SecurityAdditionsWrapper file as part of your post-deploy process. See my earlier post on selectively running pre/post-deploy scripts if you want finer control.