SQL Server Audits and Action_IDs

We were recently redoing our SQL Server Audits, and I was reminded again how painful setting the filters can be. Microsoft expects an integer for "action_id", but to actually use them you need to know what those integers mean.

I came across this blog post by Chris Provolt listing the text versions of the action_ids. That was helpful, especially the quick query to see what's available:

SELECT DISTINCT action_id, name, class_desc, parent_class_desc
FROM sys.dm_audit_actions;

However, as you can tell by running this, the action_ids returned are all text values. That doesn't help when you're trying to set up your SQL Audit filters.

MSDN provides code for a function to translate the text action_id into the expected numeric value.

CREATE FUNCTION dbo.Getint_action_id (@action_id VARCHAR(4))
RETURNS INT
AS
BEGIN
    DECLARE @x INT;

    -- The first character of the text action_id becomes the low-order byte.
    SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 1, 1))));

    -- Characters 2 through 4 (padded with spaces when the id is shorter)
    -- fill the higher-order bytes.
    IF LEN(@action_id) >= 2
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 2, 1)))) * POWER(2, 8) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 8) + @x;

    IF LEN(@action_id) >= 3
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 3, 1)))) * POWER(2, 16) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 16) + @x;

    IF LEN(@action_id) >= 4
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 4, 1)))) * POWER(2, 24) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 24) + @x;

    RETURN @x;
END;
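
As a quick sanity check, you can call the function directly; the value matches the SL / SELECT row in the listing further down:

SELECT dbo.Getint_action_id('SL') AS Action_ID_Int; -- returns 538987603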

Once you create that function, you can use it to generate a list of the Integer action_ids, the text action_ids, and the name/description of those action_ids. (You can also expand it out to see which actions are applicable to various objects at the server and database level if you so desire.)

SELECT DISTINCT dbo.Getint_action_id(action_id) AS Action_ID_Int,
       action_id,
       name AS Action_Description
       --, class_desc, parent_class_desc
FROM   sys.dm_audit_actions
ORDER  BY action_id;

This will result in the following values (as of SQL 2012):

Action_ID_Int action_id NAME
1329873729 ACDO DATABASE_OBJECT_ACCESS_GROUP
542065473 ACO SCHEMA_OBJECT_ACCESS_GROUP
1329742913 ADBO BULK ADMIN
1346651201 ADDP DATABASE_ROLE_MEMBER_CHANGE_GROUP
1347634241 ADSP SERVER_ROLE_MEMBER_CHANGE_GROUP
538987585 AL ALTER
1313033281 ALCN ALTER CONNECTION
1397902401 ALRS ALTER RESOURCES
1397967937 ALSS ALTER SERVER STATE
1414745153 ALST ALTER SETTINGS
1381256257 ALTR ALTER TRACE
1280462913 APRL ADD MEMBER
538989377 AS ACCESS
1129534785 AUSC AUDIT SESSION CHANGED
1179866433 AUSF AUDIT SHUTDOWN ON FAILURE
1213486401 AUTH AUTHENTICATE
538984770 BA BACKUP
541868354 BAL BACKUP LOG
1111773762 BRDB BACKUP_RESTORE_GROUP
1179595331 C2OF TRACE AUDIT C2OFF
1313813059 C2ON TRACE AUDIT C2ON
1196180291 CCLG CHANGE LOGIN CREDENTIAL
1196182851 CMLG CREDENTIAL MAP TO LOGIN
1430343235 CNAU AUDIT_CHANGE_GROUP
538988355 CO CONNECT
538988611 CP CHECKPOINT
538989123 CR CREATE
538976324 D DENY
1179074884 DAGF FAILED_DATABASE_AUTHENTICATION_GROUP
1279738180 DAGL DATABASE_LOGOUT_GROUP
1397178692 DAGS SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP
1178681924 DBAF DATABASE AUTHENTICATION FAILED
1396785732 DBAS DATABASE AUTHENTICATION SUCCEEDED
1128481348 DBCC DBCC
1195590212 DBCG DBCC_GROUP
541868612 DBL DATABASE LOGOUT
538987588 DL DELETE
1280462916 DPRL DROP MEMBER
538989124 DR DROP
541284164 DWC DENY WITH CASCADE
538990661 EX EXECUTE
538989638 FT FULLTEXT
541545542 FTG FULLTEXT_GROUP
538976327 G GRANT
1111773767 GRDB DATABASE_PERMISSION_CHANGE_GROUP
1329877575 GRDO DATABASE_OBJECT_PERMISSION_CHANGE_GROUP
542069319 GRO SCHEMA_OBJECT_PERMISSION_CHANGE_GROUP
1330860615 GRSO SERVER_OBJECT_PERMISSION_CHANGE_GROUP
1448301127 GRSV SERVER_PERMISSION_CHANGE_GROUP
541546311 GWG GRANT WITH GRANT
1346653513 IMDP DATABASE_PRINCIPAL_IMPERSONATION_GROUP
542133577 IMP IMPERSONATE
1347636553 IMSP SERVER_PRINCIPAL_IMPERSONATION_GROUP
538988105 IN INSERT
541214540 LGB BROKER LOGIN
1195525964 LGBG BROKER_LOGIN_GROUP
1094993740 LGDA DISABLE
1111770956 LGDB CHANGE DEFAULT DATABASE
1095059276 LGEA ENABLE
1279674188 LGFL FAILED_LOGIN_GROUP
1179207500 LGIF LOGIN FAILED
1397311308 LGIS LOGIN SUCCEEDED
1196181324 LGLG CHANGE DEFAULT LANGUAGE
541935436 LGM DATABASE MIRRORING LOGIN
1196246860 LGMG DATABASE_MIRRORING_LOGIN_GROUP
1296975692 LGNM NAME CHANGE
542066508 LGO LOGOUT
1146308428 LGSD SUCCESSFUL_LOGIN_GROUP
538988364 LO LOGOUT_GROUP
1111772749 MNDB DATABASE_CHANGE_GROUP
1329876557 MNDO DATABASE_OBJECT_CHANGE_GROUP
1346653773 MNDP DATABASE_PRINCIPAL_CHANGE_GROUP
542068301 MNO SCHEMA_OBJECT_CHANGE_GROUP
1330859597 MNSO SERVER_OBJECT_CHANGE_GROUP
1347636813 MNSP SERVER_PRINCIPAL_CHANGE_GROUP
1196182862 NMLG NO CREDENTIAL MAP TO LOGIN
538988623 OP OPEN
1111773263 OPDB DATABASE_OPERATION_GROUP
1448300623 OPSV SERVER_OPERATION_GROUP
1380013904 PWAR APPLICATION_ROLE_CHANGE_PASSWORD_GROUP
541284176 PWC CHANGE PASSWORD
1195595600 PWCG LOGIN_CHANGE_PASSWORD_GROUP
1396922192 PWCS CHANGE OWN PASSWORD
1480939344 PWEX PASSWORD EXPIRATION
1129142096 PWMC MUST CHANGE PASSWORD
1280333648 PWPL PASSWORD POLICY
542267216 PWR RESET PASSWORD
1397905232 PWRS RESET OWN PASSWORD
542463824 PWU UNLOCK ACCOUNT
538976338 R REVOKE
538985298 RC RECEIVE
538986066 RF REFERENCES
538989394 RS RESTORE
541284178 RWC REVOKE WITH CASCADE
541546322 RWG REVOKE WITH GRANT
538987603 SL SELECT
538988115 SN SEND
1313624147 SPLN SHOW PLAN
1448301651 STSV SERVER_STATE_CHANGE_GROUP
1313953107 SUQN SUBSCRIBE QUERY NOTIFICATION
1313035859 SVCN SERVER CONTINUE
1146115667 SVPD SERVER PAUSED
1146312275 SVSD SERVER SHUTDOWN
1381193299 SVSR SERVER STARTED
1095975252 TASA TRACE AUDIT START
1347633492 TASP TRACE AUDIT STOP
538988372 TO TAKE OWNERSHIP
1111773012 TODB DATABASE_OWNERSHIP_CHANGE_GROUP
1329876820 TODO DATABASE_OBJECT_OWNERSHIP_CHANGE_GROUP
542068564 TOO SCHEMA_OBJECT_OWNERSHIP_CHANGE_GROUP
1330859860 TOSO SERVER_OBJECT_OWNERSHIP_CHANGE_GROUP
1195594324 TRCG TRACE_CHANGE_GROUP
542069332 TRO TRANSFER
1346847573 UCGP USER_CHANGE_PASSWORD_GROUP
1195459669 UDAG USER_DEFINED_AUDIT_GROUP
1430340693 UDAU USER DEFINED AUDIT
538988629 UP UPDATE
1178686293 USAF CHANGE USERS LOGIN AUTO
1196184405 USLG CHANGE USERS LOGIN
1129599829 USTC COPY PASSWORD
1414743126 VDST VIEW DATABASE STATE
1414746966 VSST VIEW SERVER STATE
1413699414 VWCT VIEW CHANGETRACKING
538984792 XA EXTERNAL ACCESS ASSEMBLY
538989912 XU UNSAFE ASSEMBLY
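
Once you have the integer, you can use it where SQL Server actually wants it: the audit filter predicate. A minimal sketch (the audit name and file path are hypothetical):

CREATE SERVER AUDIT [Audit_SelectsOnly]
TO FILE (FILEPATH = N'C:\SQLAudit\')
WHERE action_id = 538987603; -- 'SL' / SELECT, per the list above
GO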

Database Lifecycle Management – Dacpacs

I recently came across an excellent article at Simple-Talk discussing dacpacs: working with them, their structure, and more, in quite a bit of detail for a short piece. The post discusses the benefits, drawbacks, and potential areas of concern when using dacpacs for your database management.

https://www.simple-talk.com/sql/database-delivery/microsoft-and-database-lifecycle-management-%28dlm%29-the-dacpac/

The PowerShell scripts are well worth checking out, and there are several interesting SQLPackage calls along the way. I appreciated his findings on some of the issues, such as changes that should be detected but aren't. I suspect there's a setting not configured properly there, but I couldn't say for sure. I haven't hit issues of the sort described at the end of the article when using SQLPackage and dacpacs for normal DB development.

The conclusion is spot on: whatever solution you use, check your scripts, test, and verify before putting it into production.

Pre-deploy SQL Scripts before checking for changes

I just came across a post from Ed Elliot (b | t ) about running scripts before the SQLPackage process runs. Sort of a "pre" pre-deploy script. The concept is simple enough, but it's something people have wanted for a while.

Here’s the main article.

The general idea is that you add a SQL Script to your project with:
Build Action = “None”
Copy To Directory = “Copy Always” (or “Copy if Newer”)

When you set up your publish actions, look for this script first, run it in PowerShell, SQLCMD, or whatever other option you might have, then run your SQLPackage commands as normal. It will run through whatever scripts you want to run first, then compare against the modified database to bring it up to date.
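
For example, the kind of script I could see running this way is one that puts the target server into a known state before the compare ever happens. A hypothetical sketch (the login name and password are placeholders):

-- "Pre" pre-deploy step: make sure a login that the project's users depend on
-- exists before SqlPackage compares and publishes.
IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N'MyLogin')
    CREATE LOGIN [MyLogin] WITH PASSWORD = N'ChangeMe!123';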

The idea is pretty clever, though I haven't come across much need for it when deploying DB changes. I'm not quite comfortable with a script changing the database to some unknown state right before the deployment that is supposed to bring it in line with version control. I suppose it has its use cases, though, as I've come across this request several times on Stack Overflow.

SSAS Tabular and Excel PowerPivot

I don't know why it took me so long to make the connection between PowerPivot models and SSAS Tabular models, but in the last couple of days I came across several blogs and videos tying the two together. With that in mind, I took the plunge into PowerPivot and built a quick model off of some SQL queries to test out a concept.

I started with the basics of mocking up a fact table. I stripped out decorator text, “renamed” a couple of columns in the query, filtered the data so I wouldn’t pull back a crazy amount into my local workstation, verified the counts and look of the data in a SQL query, and added that. Excel quickly loaded 1.2M rows and didn’t blink.

After that, I knew I needed a couple of Dimension-like tables. I wrote queries for those in a similar fashion and added the data again. Each addition went smoothly. I was able to add the proper relationships quickly to tie everything together.

To finalize the model, I created a couple of hierarchies to make querying easier. After messing around with these several times in SSAS Multi-Dimensional, I was shocked at how easy it was to create a hierarchy for dates and even certain domain-specific entries in our data: Right-click, create new hierarchy, drag and drop columns. Done.

After the model was finished, I dropped the relevant fact data into a Pivot Table and could easily analyze the data in all sorts of interesting ways. I had worked with OLAP for quite some time to come up with a similar concept. With PowerPivot, I wrote a couple of queries, linked the tables, and had something workable within an hour. As an added benefit, I know I can scale that up to an SSAS server without too much trouble and have a similar experience.

Hat tip to the folks @ www.powerpivotpro.com and www.sqlbi.com for the many articles and pointers to get me started and over that initial hump.

Exploring BIML

Getting Started

I attended Varigence’s BIML Workshop last year prior to the PASS conference and got to learn a little BIML while seeing what it can do. Recently, I realized that I had a job to do that required applying the same SSIS ETL to several different sets of tables in our system. I figured this would be a great time to put some of that BIML into use.

To get started, you'll need either the BIDS Helper add-on or a copy of Varigence's Mist. While Mist is an excellent tool, it's priced for people who are doing serious BIML work, not for those who aren't quite ready to justify the cost ($249/month or $3,999 at the time I'm writing this; worth it if you use a lot of BIML, but not for everyone).

I made sure my BIDS Helper bits were up to date, started a new SSIS Project, and added a new BIML file by right-clicking the project and selecting “Add New BIML File”. After that I followed a couple of excellent walkthroughs and tutorials to get me started and refresh my memory. To name a couple of sources:

 

Challenges

1. Actually getting my connection string correct. I don't often type in connection strings, so I hit the ConnectionStrings.com website for SQL OLEDB connections. That told me to use this format for my connection strings:

Provider=SQLNCLI11;Server=myServerAddress;Database=myDataBase;Trusted_Connection=yes;

Regardless of what I tried, this refused to connect as expected. I eventually changed back to the older syntax and that worked:

Provider=SQLNCLI11;Data Source=myServerAddress;Initial Catalog=myDataBase;Integrated Security=True;

Not a huge deal, but this was giving me fits: my connections either failed with security issues or required me to specify database names in my code.

 

2. Next up – not knowing C# or BIML very well. Nothing for this except time and practice. 🙂  I got better along the way, slowly catching things as I typed them in and having some feel for what would happen.

 

3. For some reason, my Intellisense completely stopped working at one point. It worked for everything prior to when I added my “import” lines for various namespaces, but anything after that wouldn’t work at all. I got around this a little by using the BimlScript Editor online to check some code, but obviously this doesn’t work when you’re importing local columns. I never did get this working but also haven’t tried to reproduce it with a new project.

Update: It seems that if I move my @ Import statements above the first Biml tag in the XML, my Intellisense works again. Putting the statements within the Biml section is perfectly valid, but putting them at the top of the file before I start writing Biml makes a lot more sense and keeps Intellisense working. Knowing this earlier would have made my life easier.

 

4. The last major challenge this time through was trying to get BIML to generate a logical "AND" in the SSIS expression I wanted to use. I typed it in, used "&&" for my AND statement, and checked syntax. I repeatedly got syntax errors. The first time around, I figured my expression might be too long. It was, and I broke it apart only to get the same error. I looked at the string generated using some MessageBox.Show() hacks and it looked okay. I finally came across a tip indicating that BIML, being XML, doesn't handle a raw "&" character well. The suggestion was to replace it with the XML entity "&amp;" whenever you want an ampersand in the expression, so "&&" becomes "&amp;&amp;". This worked well and my package was generated.

 

Next

I still need to tweak my package. I have more tables to import from a different source on the same server, but targeting a single database on my destination server. I need to figure out tiers and included files to better reuse and organize code. I’d love to get the correct connections going so BIML and SSIS read the correct column data types, especially when it comes to BLOB data columns. I need to rework some of the package components to better adhere to best practices. Overall it’s been an interesting ride trying to figure these things out and while it may not have saved time with this particular project, I’ll be able to use the lessons learned for my next project.

SQL Projects & Change Data Capture

I ran across this article from MSSQLTips.com discussing the use of CDC with SQL Projects. We don’t use CDC with our current systems so I’d never tried to investigate this, but I would also have expected better support out of the box. In the meantime, Ray Barley wrote up a good article on enabling CDC for the Database and then using post-deploy scripts to enable CDC for tables.

Find the article here: www.mssqltips.com/sqlservertip/3316/how-to-enable-change-tracking-in-a-sql-server-database-project/
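
For reference, the approach boils down to a couple of well-known CDC procedures in a post-deploy script. A minimal sketch (the table name is hypothetical):

-- Enable CDC for the database if it isn't already enabled.
IF (SELECT is_cdc_enabled FROM sys.databases WHERE name = DB_NAME()) = 0
    EXEC sys.sp_cdc_enable_db;

-- Enable CDC for an individual table.
IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = N'MyTable' AND is_tracked_by_cdc = 1)
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'MyTable',
         @role_name     = NULL;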

SSDT Issue–Online Index Option Ignored

I came across this as we’d recently received a corporate directive that all indexes should be created with the ONLINE=ON option. That’s not too bad. Add that to all indexes as needed, publish the DB project, and we’re good to go. Except that we’re not.
Due to an issue with SSDT "working as designed", the ONLINE=ON option is ignored as one of the criteria when SSDT compares indexes. On the positive side, it means that any index created that way won't have the option dropped. On the negative side, it means that any index that is changed or added will not get that option.
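
For reference, this is the kind of index definition in question (table and column names are hypothetical); SSDT's compare treats it as identical to the same index without the option, so the option never makes it into the generated change script for new or altered indexes:

CREATE NONCLUSTERED INDEX IX_MyTable_MyColumn
    ON dbo.MyTable (MyColumn)
    WITH (ONLINE = ON);
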
If this is an issue for you, comment and vote on the related Connect ticket. If we can garner enough votes and interest, perhaps MS will hear us and change their mind about whether ONLINE=ON should be part of the release script.
In the meantime, if you absolutely need this functionality, generate scripts for your releases and ensure that all index creates/alters include the option you want.

Edit: There's been some discussion around this on the MSDN forums. Kevin Cunnane posted several responses pointing to a Codeplex example of how this might be possible by compiling a DLL and referencing it through the SQLPackage command line. I haven't had a chance to try that out yet, but I'm including the forum and Codeplex links for those who want to investigate further.

Forum post
Codeplex example

All SSDT Articles

SSDT: Adding Logins for “Execute As”

We recently encountered an issue requiring us to use "Execute As" syntax in some of our stored procedures. Of course, we have multiple environments that all have different permission definitions, so this caused a bit of a problem. In this case, the login itself is standard across all of our environments, but for the most part we don't define logins within the project.
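
To make the scenario concrete, the procedures look something like this (the procedure name and body are hypothetical):

CREATE PROCEDURE dbo.DoPrivilegedWork
WITH EXECUTE AS 'MyLogin'
AS
BEGIN
    -- Work that needs to run under the standard principal's context.
    SELECT SUSER_SNAME() AS RunningAs;
END;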

First problem: Add a new user called “MyLogin” to the project.

Solution: This was easy enough; I just added a new user to the project. However, I then ran into an error because the login for "MyLogin" did not exist in our master.dacpac reference.

 

Second Problem: Create a SQL Login reference within the master dacpac file that could be seen by all of our projects.

Solution: This proved to be a bit trickier. The model.xml file inside the default master.dacpac file contains many system-level objects. We would need to find the appropriate XML syntax for a new SQL Login element, add it to the model.xml file, and then update the hash for the dacpac file.

 

Extract the Model.xml file from our master.dacpac file

This can be done easily: find the master.dacpac file you referenced in your project and either rename it to a .zip file or use a tool such as 7-zip to right-click the file and open it. Extract its "model.xml" file to some working folder.

Add a new SQL Login element to the model.xml file

Now that there's a copy of model.xml to edit, open it in your favorite XML or text editor. I like Notepad++, but even plain Notepad will work. Navigate to the very bottom of the file. We want to add the new element just before the closing tag of the model (for ease of use, not a requirement). The example XML is below. Note the "GloballyScoped" annotation type; this treats the entity as something that is available outside of the current database.




Make a new Dacpac file

Now that we've added the new login to our model.xml file, either add it back to the existing dacpac file by using something like 7-zip, or re-compress everything that was extracted from the old dacpac into a new zip file, then rename it to master.dacpac.

But what about the checksum?

All of the above works great, but the dacpac files provided by MS all have a checksum associated with them. If that checksum is off, it means that someone tampered with the file, and it won't work when you try to use it as a database reference. I will now give props to Gert Drapers (twitter) for writing up how to add objects to the master.dacpac file.

To summarize, you extract the model.xml, edit it, and re-add it to the dacpac, and then you can run a small app Gert wrote called dacchksum.exe to update the checksum. Quoting Gert's article: put the exe somewhere in your path or in the folder you want to use and run something like this:

dacchksum.exe /i:Master.dacpac /u /v

This will change the checksum of your master.dacpac file so it will now work as a database reference.

 

Add your user to the project

Now that there's a login for the user to reference, you can add the user to the project. Right-click the project, select "Add", and then choose the "User" option. You'll get a new user which you can easily set up to reference the new login in your master file.
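
The object that ends up in the project is the usual one-liner (names hypothetical), which now resolves because the login exists in the referenced master.dacpac:

CREATE USER [MyLogin] FOR LOGIN [MyLogin];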

Overall, not my favorite way to do these things, but if I know that a login will exist on the server I am more comfortable with adding a user to the project than I would be otherwise. You could still run into issues if this login doesn’t exist everywhere you want to use the project.

 

Final Thoughts

As Gert mentions, this trick is for adding any object to master. If you have a copy of sp_whoisactive or sp_blitz that you want to reference for some reason, you can add them to some one-off project, then copy that XML Element node into your master’s model.xml, regen the hash, and enjoy a new globally referenced object. It’s a useful trick, though one I hope not to have to use very often.

SSIS, MySQL, and invalid Dates–Take #2

A while back I posted an article on handling dates in select statements from MySQL. The idea was that I wanted a quick way to generate select statements for MySQL that would automatically set any invalid date to NULL (an invalid date being one such as "0000-00-00" or "1973-02-00"). After some more trial and error, I realized that this script was incomplete. I've created an updated script that should better handle those edge cases.

The script is largely the same as the former, but I've added handling for invalid years, months, or days. We don't want to redirect the rows for further error handling; we want to treat the dates that won't successfully import as NULL. This could be done several ways within SSIS, but handling the transform within the select statement reduces the need to write many transforms for each date column in each table. (I suspect that BIML could be my friend for this task, but I'm not quite ready to tackle this with BIML.)

SELECT CONCAT(
         -- First column of each table starts the SELECT; later columns get a comma.
         CASE
           WHEN ordinal_position = 1 THEN 'SELECT '
           ELSE ', '
         END
         -- Wrap date/timestamp columns so invalid values come back as NULL;
         -- cast time columns to CHAR(8).
       , CASE data_type
           WHEN 'date' THEN
             CONCAT('CASE WHEN ', column_name, ' = ''0000-00-00'' THEN NULL '
                  , 'WHEN ', column_name, ' LIKE ''%0000%'' THEN NULL '
                  , 'WHEN ', column_name, ' LIKE ''%-00%'' THEN NULL '
                  , 'ELSE ', column_name, ' END AS ', column_name)
           WHEN 'timestamp' THEN
             CONCAT('CASE WHEN ', column_name, ' = ''0000-00-00'' THEN NULL '
                  , 'WHEN ', column_name, ' LIKE ''%0000%'' THEN NULL '
                  , 'WHEN ', column_name, ' LIKE ''%-00%'' THEN NULL '
                  , 'ELSE ', column_name, ' END AS ', column_name)
           WHEN 'time' THEN
             CONCAT('CAST(', column_name, ' AS CHAR(8)) AS ', column_name)
           ELSE column_name
         END
         -- After the last column of each table, append the FROM clause.
       , CASE
           WHEN ordinal_position < t1.MaxOrd THEN ''
           ELSE CONCAT('     FROM MySchema.', c.table_name, ';')
         END) AS Select_Column_Name
FROM   information_schema.columns AS c
       JOIN (SELECT table_name,
                    MAX(ordinal_position) AS MaxOrd
             FROM   information_schema.columns
             WHERE  table_schema = 'MySchema'
             GROUP  BY table_name) AS t1
         ON c.table_name = t1.table_name
WHERE  c.table_schema = 'MySchema'
ORDER  BY c.table_name, ordinal_position
LIMIT  0, 50000;
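
Run against a hypothetical table MySchema.orders with columns id, order_date, and updated_at, the concatenated output comes out roughly like this:

SELECT id
, CASE WHEN order_date = '0000-00-00' THEN NULL WHEN order_date LIKE '%0000%' THEN NULL WHEN order_date LIKE '%-00%' THEN NULL ELSE order_date END AS order_date
, CASE WHEN updated_at = '0000-00-00' THEN NULL WHEN updated_at LIKE '%0000%' THEN NULL WHEN updated_at LIKE '%-00%' THEN NULL ELSE updated_at END AS updated_at
     FROM MySchema.orders;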