Very Simple Data Entry with C# Winforms & Datasets

I recently needed to generate a quick and dirty app for data entry into a normalized database. I've done this sort of thing in Access before, so I started to play with that a little. After getting one table's lookup almost done, I decided to try out a very quick Winform instead. It's not a solution I'd recommend for production use, but it worked for my purposes.

I figured I'd share in case anyone needed something quick, but not necessarily great-looking, for some data entry work. This follows the "to learn something, have a small starting goal, learn what you need for that goal, then continue" pattern. That involved creating a Winform, connecting to a DB, displaying that data, editing that data, and gracefully launching and closing windows.

The end result is one form per table that I needed to use, with some lookups. The forms are basic data grids so you can see all data on one page.  You can add controls to make things prettier, but that’s way beyond the scope of this project.

This assumes you have a copy of AdventureWorks to reference and a copy of Visual Studio with C#.

I would not recommend trying this against a table with a large number of rows. We don’t place any limits on our tables when fetching data in this basic app.  (That might come in a future project.)

 

Start by creating a new Winforms project.


I renamed my main form to "frmMain" and changed the title (the Text property) to "AdventureWorks Data Entry".


Now we're going to add a new data source in the "Data Sources" window. (If this isn't visible, show it using the "View" menu, "Other Windows".) There should be a simple "Add New Data Source" link. Click that or the button at the top of the Data Sources window. We're going to choose "Database", click "Next", then "Dataset", and click "Next". Choose the "SQL Server" option and browse for your server/database. I'm using my local machine and local AdventureWorks database.


Hit “OK” and click through the dialog to save the Connection.  I’m just using the default name of “AdventureworksConnectionString”.

Your next screen will ask which objects you want to include. For our purposes, we're going to keep the dataset very simple and concentrate on maintaining the ProductInventory data for our example. Select the "Production.Location", "Production.Product", and "Production.ProductInventory" tables and click OK.

You should now have a new “AdventureworksDataSet.xsd” file if you chose the defaults. Double-click that as we might want to tweak some settings.


In this case, the "rowguid" and "ModifiedDate" columns for ProductInventory have defaults, and we want to use those defaults when we add new rows. Click the "ProductInventoryTableAdapter" in the dataset window. That will load the adapter's settings into the "Properties" window. We want to edit the InsertCommand's "CommandText" property by clicking the ellipsis (…) next to the "INSERT INTO" statement. This will bring up a query designer window familiar to many of us. Expand or move the panes around as needed until you can see the "rowguid" and "ModifiedDate" columns, then de-select those. This prevents the generated code from attempting to insert values into those columns. Hit OK.


Now click on the rowguid column in the dataset view.  This should allow you to edit the Properties of the column.  We want to set “AllowDBNull” to “True”.


Do the same for “ModifiedDate” and then Save All.  We’re now ready to add our Product Inventory form.  Go to your Solution Explorer, right-click the Project and select “Add” then “Windows Form”.  I’m choosing to call mine “frmProductInventory”.


With your new Windows form, go to the Properties panel, Appearance section, and change the “Text” property to “Product Inventory”.


Now we're going to make sure the "Data Sources" window is open; I pinned it so it would stay visible. From the "AdventureworksDataSet", we're going to simply drag the "ProductInventory" table onto the Product Inventory Windows form. I've also resized the grid so it sits near the edges of the form. We'll set the anchoring in a little bit to ensure the grid resizes with the window.
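If you're curious what that drag-and-drop actually wires up: the designer drops a typed dataset instance, a BindingSource, a TableAdapter, a TableAdapterManager, and a BindingNavigator into the component tray, and generates a Form Load handler that fills the table. A rough sketch of that generated handler is below; the member names are the designer defaults, so yours may differ if you renamed anything.

private void frmProductInventory_Load(object sender, EventArgs e)
{
    // Generated by the designer: loads the ProductInventory rows into the
    // typed dataset so the bound grid has data to display.
    this.productInventoryTableAdapter.Fill(this.adventureworksDataSet.ProductInventory);
}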


Now, select the grid within the form, go to the Properties panel, scroll all the way down and look for the Layout – Anchor property. We’re going to change that to “Top, Bottom, Left, Right”.
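For reference, that property change corresponds to a single line of designer-generated code (assuming the default grid name):

productInventoryDataGridView.Anchor = AnchorStyles.Top | AnchorStyles.Bottom | AnchorStyles.Left | AnchorStyles.Right;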


We don’t want the users to add data in the “rowguid” or “ModifiedDate” columns, so let’s remove those from the grid to avoid any confusion. Right-click the grid and select “Edit Columns”, then highlight each column and click “Remove”, one at a time. Click OK when done and you’ll see that the columns are no longer present in the grid.


Now we want our users to be able to pick the Product and Location values easily rather than trying to remember the underlying IDs. For that, we'll need to edit the columns again, change the Column Type for each to a Combo Box, and look up the display data in the appropriate tables. While we're at it, we'll change the visible names to just "Product" and "Location".

  1. Edit Columns
  2. Change the HeaderText from “ProductID” to “Product”
  3. Change the ColumnType to “DataGridViewComboBoxColumn”
  4. Change the DataSource to the “Product” table by drilling through the Other Data Sources
  5. Set the DisplayMember to “Name”
  6. Set the ValueMember to “ProductID”
  7. Optionally adjust the size of the columns. I’ve set mine to 150 minimum, with the AutoSizeMode set to “AllCells”


Repeat for the “LocationID” column, but using the Location table.
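If you ever need to do the same thing in code rather than through the designer, the equivalent is to swap the auto-generated text column for a DataGridViewComboBoxColumn. This is just a sketch, assuming the designer's default names (productInventoryDataGridView, productBindingSource) are in place:

// Replace the plain ProductID column with a combo box lookup against the Product table.
var productColumn = new DataGridViewComboBoxColumn
{
    Name = "ProductID",
    DataPropertyName = "ProductID",    // column in the ProductInventory table
    HeaderText = "Product",
    DataSource = productBindingSource, // lookup list from the Product table
    DisplayMember = "Name",
    ValueMember = "ProductID",
    MinimumWidth = 150,
    AutoSizeMode = DataGridViewAutoSizeColumnMode.AllCells
};

int index = productInventoryDataGridView.Columns["ProductID"].Index;
productInventoryDataGridView.Columns.RemoveAt(index);
productInventoryDataGridView.Columns.Insert(index, productColumn);
// Repeat with locationBindingSource, "LocationID", and "Location" for the other lookup.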

 

For our last step, we need a way to make this form visible.  Let’s switch over to our “frmMain” form.  (Re-open it, if you closed it.)  We’re going to add two “Button” controls from the “Common Controls” area in the main Toolbox.  Just drag them onto the form.  I resized the form as well to make it a bit smaller as we’ll just have two buttons on it.


Select “button1” and change its Text property to “Product Inventory”.  Change its Name to “btnProductInventory”.  Resize the button so all text shows.

Select “button2” and change its Text property to “E&xit”.  (This will allow for a hotkey of Alt+X to exit the app.)  Change its Name to “btnExit”.


Now – double-click the Product Inventory button. This will open up your code window to a new function called “btnProductInventory_Click()”. We’re going to add the following code inside the curly braces of the function:

var ProductInventoryForm = new frmProductInventory();
ProductInventoryForm.Show();

Save that, go back to the frmMain design view and double-click the “Exit” button.  This will open up your code to a new function called “btnExit_Click()”.  Add the following lines of code inside the curly braces of the function:

Application.Exit();
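Put together, the two handlers in frmMain end up looking something like the sketch below; the method signatures are what the designer generates from the button names above.

private void btnProductInventory_Click(object sender, EventArgs e)
{
    // Open the data entry form as a separate, non-modal window.
    var ProductInventoryForm = new frmProductInventory();
    ProductInventoryForm.Show();
}

private void btnExit_Click(object sender, EventArgs e)
{
    // Close all open forms and end the application.
    Application.Exit();
}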

 

Save everything and start the app. You should now have a functioning form that will let you open a basic data entry grid for a table, with lookups for the ID values. You'll need to click the built-in "save" button to save your changes, as this is a pretty basic app, but it should suffice for really basic use cases.
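That built-in "save" button lives on the BindingNavigator the designer added along with the grid, and its click handler is what actually pushes your edits back to the database. For reference, the designer-generated handler is roughly the following (again assuming the default names):

private void productInventoryBindingNavigatorSaveItem_Click(object sender, EventArgs e)
{
    // Commit any in-progress edit in the grid, then send all pending
    // inserts, updates, and deletes back to SQL Server.
    this.Validate();
    this.productInventoryBindingSource.EndEdit();
    this.tableAdapterManager.UpdateAll(this.adventureworksDataSet);
}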


I was having some issues when trying to close this form after adding new data. I ended up just throwing a basic DataError event handler, with no code, onto the data grid view. You can do this by selecting the data grid on the Product Inventory form in design mode, going to the Properties window, and clicking the lightning bolt icon to bring up Events. Look for the "DataError" event under Behavior. Double-click it and you'll be taken to a new, empty function to handle DataError events. I just left it as-is, saved, and re-ran. The form now closes without any issues. Without that handler, it would complain that a combobox wasn't bound properly and throw errors for every Location. With the basic catch, it just closes the window as expected.
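The do-nothing handler itself is just an empty method. A minimal sketch, assuming the designer's default grid name:

private void productInventoryDataGridView_DataError(object sender, DataGridViewDataErrorEventArgs e)
{
    // Intentionally empty. Swallows the combo box binding errors raised while
    // the form closes so they don't surface as error dialogs.
}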

I’m aware this isn’t best practices, but it worked for this basic case.


In conclusion, we now have a really, really basic data entry app using data grids and C#.  I know this isn’t following a whole lot of best practices, but for someone who doesn’t know much C# it gets the job done.  I’m open to feedback on better (but still easy) ways to do this and if I’m doing something that is just wrong, please let me know.


TSQL Tuesday #102: Giving Back

This T-SQL Tuesday comes to us from Riley Major (b | t). He asks us in what ways we can give back to the SQL Community. He suggests blogging, speaking, answering questions, mentoring, or, if we have done some of those already, discussing what motivated us to contribute.

I knew of the helpfulness of the SQL Community from way back in the old USENET days, when that was the best way to get help. Some users would download all of the headers and a good portion of the posts for offline use, then spend their train commutes reading and responding to others. SQLServerCentral.com popped up during this time period and was a great free-to-use resource for sharing information and articles. The early days of Twitter saw the #sqlhelp hashtag pop up as a way to ping the community for ideas, with several people keeping an eye on it and responding.

I learned quite a bit from the regulars on USENET. Jamie Thomson blogged about SSIS and Database Projects, which helped me overcome some hurdles. We still have people who share regularly – SQLskills and Brent Ozar's team both provide regular writings to teach and inform others, even while running consulting businesses. Grant Fritchey has now taken to vlogging to share information. Red-Gate provides a pretty good online newsletter. When attending SQL events, the attendees are often more than willing to share their advice and experiences. I haven't seen this level of sharing in many other tech communities. I've also had some great co-workers over the years who shared their knowledge to train others and grow and challenge the people around them, and who have been great to bounce ideas off of.

I knew early on that I wanted to give back. I've learned from others and can share what I know to help new people grow their skills, or at least spare them from making mistakes I've already made. I try to keep up with the SQL Community on Slack and answer questions that fall within my strengths. I put together a guide on using SSDT some time ago; it's old, but the information is largely still relevant. I still blog whenever I discover something that could be helpful to others. I also try to contribute on StackOverflow in the DB Project/SSDT category. (I've found my response time for the TSQL tag is often too slow to help much.)

In a couple of weeks, I'll be presenting at SQL Saturday Dallas 2018. My topic will be "Getting Your DB Schema Under Control with SSDT". I have learned quite a bit about DB Projects over the years, and getting databases into some sort of version control is a passion of mine. I've dealt with trying to recreate changes by hand, and it's always been painful to handle changes to DB objects that way. If you're planning to go and this is of interest to you, feel free to check out my session in Slot #2. I plan to make the slides and code available after the session is over for anyone who wants to follow along.

Not related to SQL, I also try to serve the church community with my technical knowledge. I’m not a full-time person for that area, but have learned quite a bit over the years and have a passion for helping the local church use technology effectively for their mission. I know my strengths well enough to be able to contribute and advise. I’ve run a couple of tech support workshops to help people clean up their PCs or do basic hardware upgrades that they’d otherwise pay the Geek Squad $$$ to perform. I know that I’ve acquired these skills over the years for a reason, so giving back just makes sense.

I want to close with an encouragement to anyone who happens to read this. You may be just starting out or an old pro; regardless, you probably know something that nobody else does. Write up your experiences and share them. Check out some of the tech support sites such as StackOverflow or ask.sqlservercentral.com, read up on some of the questions, and answer where you can contribute. If speaking in public is something you can do, volunteer for a local SQL User Group meeting or consider presenting for a Virtual Chapter. Learn a bit about presenting and watch some good presenters to grow those skills. But don't just take in all of that information and never share it. You'll grow as you grow others.

Azure DB, ROUTE ADD, and VPNs

One of the things that I have to contend with every now and then is accessing our Azure SQL databases through the corporate VPN. Our servers are set up to only allow access from within the company office, and we have a VPN to let me get to those servers remotely. There are several routes set up for existing resources, but every now and then an Azure IP address changes or we start using a new server for which a route isn't already configured. Until that route is added to our VPN configuration, I've found that the old ROUTE ADD command serves to get me working while I wait.

From within an Admin Command Prompt (ROUTE ADD requires admin rights):

  • Run ROUTE PRINT
    • Determine your VPN Mask, Gateway, and Interface
    • You may need to get the interface (IP address) and net mask (mine is 255.255.255.255) from an "IPCONFIG" command. Look at the values for your VPN connection.
  • Run a ROUTE ADD with the appropriate parameters:
    • ROUTE ADD <destination> MASK <net mask> <gateway IP> METRIC <metric> IF <interface #>
      • Destination == your target IP
      • Net Mask == the net mask for the interface – probably 255.255.255.255
      • Gateway IP == whatever gateway is used for getting out through the VPN
      • Metric == cost for the destination. I’ve been using something like 3
      • Interface # == the interface number listed in the “Interface List” from ROUTE PRINT

I've been able to get this working without the METRIC or IF parameters, using the IP address of the VPN connection as the gateway:

ROUTE ADD AzureDBIPAddress MASK 255.255.255.255 VPNIPAddress

T-SQL Tuesday #99: Singing and Music

This month's T-SQL Tuesday is hosted by Aaron Bertrand (b/t), and he offers as a topic what he calls Dealer's Choice. We can either share something about which we're passionate outside of the world of SQL Server or we can discuss a T-SQL bad habit. I'm choosing the former.

I'll start with singing and music. I grew up in a musical family. My grandmother played organ at her church for decades; she started when she was in her teens and kept going for a long time. She also gave private piano and organ lessons and directed children's and adult choirs along the way. All of her children are musical in some way. My mom played cello and piano and sang soprano for quite some time. We lived next door to my grandmother in an old (1700s) duplex home, so I was over there regularly to play around on the piano, color, sneak a cookie or other treat, or just to visit. Along the way, I got free piano lessons until I eventually switched over to clarinet and saxophone.

I played clarinet as my primary instrument through high school and added alto sax for our school's jazz band. I never made the State band, though part of that was because I tended not to work as hard at my music as I should have. I had enough talent to handle first chair in our school band with little effort, but not the discipline to practice harder for outside auditions. I knew music wasn't going to be my career even then, but still enjoyed playing. Our band did pretty well at band competitions, and it wasn't until I left that I realized how many talented and gifted players we actually had in that small school. I miss playing in that band, but I don't miss the q-tips we wore during marching season.

Once I got to college, I continued to play clarinet in the Wind Ensemble for fun. However, I was also introduced to Barbershop-style harmony during that time. One member dragged me along to a local chapter meeting and I was hooked. I love the tight harmonies, the a cappella style, the "old songs" and the new ones done in that style. I hadn't actually done much singing since about age 10, but found I had a pretty good voice for singing the "Lead" part (2nd tenor) in Barbershop. That's stuck ever since, and I've joined a local chapter everywhere I've lived. When I travel, I try to visit the local chapters and sing with them.

I currently sing with the Vocal Majority, a 160-voice men's chorus in the DFW, TX area. We sing more than just Barbershop, branching out to show tunes, pop tunes, and other styles. The chorus has won the gold medal at the annual Barbershop Harmony Society conventions 12 times; I was there for 3 of those. You can see some of our videos on YouTube.

I've helped out by singing in pick-up quartets to deliver Valentines, and I currently help run the attendance app for meetings (developed by another member, but it runs on a SQL Server backend).

The bass in one of those quartets is 5'26" tall and has almost resorted to making cards to hand out with that information. He's a great guy all around and enjoyed going into the high schools; the number of students trying to snap selfies with him as he walked down the halls was pretty amusing.

During the week, I sing in my local church choir. I find that a great use of the skills and talents I’ve been given. I appreciate that we are encouraged to be worship leaders, not just performers or people singing on a stage. The church encourages musicians of all types and ages, and has a worship orchestra for members who want to play instruments. If I weren’t singing, I’d pull out my clarinet to join them. My oldest is able to sing along with the student choir. The little ones are encouraged to sing and play instruments from a young age. I’m glad that they get to enjoy that and are trained by people who love music.

T-SQL Tuesday #97 – Learning Goals for 2018

This month's T-SQL Tuesday is brought to us by Mala Mahadevan. Mala has challenged us to set some goals for our own learning in 2018. This could be learning to improve our social skills, technical skills, or perhaps some other area of our lives. The challenge is to be intentional about our learning by answering three questions:

  1. What do you want to learn? (specific skills and talents)
  2. How and when do you want to learn? (methods of learning and timeline on learning)
  3. How do you plan to improve on what you learned? (Putting it to use at work/blogging/speaking)

I've often wound down the year with some time to learn new skills, sometimes out of necessity, sometimes because the end of the year allows a little more freedom to pursue that learning. One year I taught myself enough Python to script out some tasks that were giving me huge amounts of trouble at work. I figured out how to talk to SQL Server, manipulate files, call out to command-line utilities, and enable easier re-use of the code so it could be parameterized for similar uses with different customers. This was back when PowerShell was still just hinted at by MS and the only other options were VBScript, batch files, or some 3rd-party tool. Had this come a couple of years later, I'd likely have tried to do the same in PowerShell.

What Do I Want to Learn?

In 2018, I can see a couple of areas that I’d like to grow in.

  • Biml – I see quite a bit of work coming soon that will have me working on similar SSIS packages that will require tweaks and adjustments. I really don't want to repeat the "fun" of adjusting a UNION ALL transform and clicking through the package to adjust the rest of it. If I can get enough Biml under my belt to even start to automate this or reduce my work, it will be a win. We're also looking at doing ETL from one source to many databases to support Power BI, and I can see Biml being a powerful tool to help with that.
  • Power BI – What can I say? Data Visualization is big and MS has been investing heavily in Power BI. It’s what our customers want and what we’re planning to use to enable them to see all sorts of interesting data. I need to keep up with what MS is doing here and be able to support loading and querying the data so we can present it to our customers. I still plan to leave the “make pretty” work to those who are more talented than I in that area.
  • PowerShell – With more of my work going to Azure SQL, I really need to ramp up my PS skills. I’ve primarily worked as a DB Developer – TSQL, SSIS, SSAS, some SSRS, and design. PowerShell hasn’t been a high priority there because, apart from deploying those things, I don’t really have a need for it day to day.  With Azure SQL in the mix, if I want to query all of our databases, I no longer have the ability to use something like “sp_msforeachdb” or run a cross-database query. I need to connect to each DB separately, run my code, then connect to the next.  Knowing more PowerShell will help me quite a bit to handle those repeated tasks.

How and when do I want to learn?

Now for more details. This is where things get harder to plan, but where a plan is most needed.

Biml – my main starting point here will be to get through the Biml Book that was just released by Apress. I've already started reading through it and have worked with Biml before, though not to the extent of others. I'm aware of its capabilities and know that it would help me greatly. I will need to ramp up some C# skills to make more effective use of my code, but even without that I can make packages that are easier to tweak. My plan here is to get through about one chapter per week. I'll then take that knowledge and rewrite some of our existing SSIS packages into Biml code. For any new ETL tasks that come up, I'll try to generate the packages in Biml so I can materialize packages that are easier to change as needed.
Goal: Able to handle simple-medium level complexity tasks in Biml by June 2018

Power BI – My biggest gains here will be through a better understanding of DAX and the Tabular model. I've followed the PowerPivot Pro team, regularly read the writings of Russo and Ferrari, follow Adam Saxton's "Guy In A Cube" videos, and try to keep up with the Power BI User Groups. However, I'll need to take some time to really go through a good Power BI book or course. I plan to use Pluralsight, EdX, and any decent books that I can find to get me beyond the beginner stage for DAX and Tabular data. This is an ongoing area for me. My goal is to be able to design optimized data structures and answer some more complex questions, especially when it comes to comparing data over similar periods.
Goal: Complete the EdX PowerBI course by March 2018. Complete one book on Tabular/DAX by end of 2018.

PowerShell – I’ve gone down this road before. In the early PS days, almost all of the books/articles dealt with administration of Windows domains/services with very little about SQL Server. Even the SQLPS module got little love and community alternatives sprang up. That’s a little different now. My first priority will be connecting to Azure SQL Databases to run queries – especially ones that need to run against a list of databases. After that, I’ll move on to things like deploying and building SSDT Projects, capturing the scripts/builds for later replay and such. There are more tasks for DB Developers now with automated releases, builds, CI/CD, and a plethora of tools that weren’t around 10 years ago.  I can take advantage of books and the SQL Community’s contributions to learn more about PS.
Goal: Complete at least one PS book by the end of 2018. Write a couple of scripts or modules to run SQL against a list of databases/instances/servers in Azure SQL by June 2018 at the latest.

How Do I Plan To Use These Skills?

Most of these will be applied directly to my current job. BIML to build SSIS packages and make them easier to generate – ideally moving towards a metadata-driven framework that I’ve seen others use.  PowerBI will be used to visualize data about all sorts of things – SSIS, performance, customer data/metrics.  I’ll use PowerShell to help me run SQL scripts against multiple databases. I’ve had a need for this for a while and it’s just going to grow when using Azure SQL. I’ll also use it to help me build and deploy database changes to those databases.

TSQL Tuesday #96: Folks Who Have Made a Difference


This month's T-SQL Tuesday, hosted by Ewald Cress, touches on the people who have made a positive difference in our journeys in the SQL Server world. Please pardon me if you're reading this and feel left out – that's not my intention. There are so many people who've made a difference for me along the way that it's hard to list them all.

My first shout-out goes to the folk I worked with way back when I was just starting out. I worked for a company just coming out of start-up mode, running SQL Server 4.21a. The first time I heard of "sequel" server had me scratching my head until I saw the posters around for "SQL" Server. I was given an opportunity to learn SQL Server, mostly through BOL, and become an MCSE through much studying and some horrible exams. I discovered that I kind of liked this database thing, so I learned more, got better at my TSQL, and learned where to ask questions of the people who were really involved in the product.

I owe a huge debt to the folk who regularly participated in the old “microsoft.sqlserver” USENET groups. I could search through those posts for people with similar problems to what I was trying to solve, or just interesting questions. I would find out pretty quickly about major bugs (anyone remember SQL 6.5 SP2 and SP4?).  Regulars there such as Joe Celko, Steve Jones, Andy Warren, Andy Leonard, and Jamie Thomson would all respond pretty quickly and often provide additional food for thought.  I could follow along with people wanting to argue with Mr. Celko about proper ANSI SQL – usually good for some amusement. I could learn some interesting tips for making DTS jump through the hoops necessary to move data around.  One thing I learned from those USENET groups is they were vastly different from other tech groups at the time. If I asked a question in those forums, I got a real response – not just a terse “RTM and go away” that I saw so often in other forums. I knew that the SQL Community was much more welcoming than other groups at the time and that was quite attractive.

I have to give a shout-out to co-workers along the way who challenged me and let me bounce ideas off of them, as well as for sharing a laugh when the situation didn’t really call for laughing. Jeff Rush, Mark Hill, Nick Floyd, and many of the people from Fellowship Tech, several along the way from First City/Drive/Santander, and many at my current gig at Healthwise.  I appreciate the desire to follow through, think of new ways to solve problems, and do all sorts of cool things along the way to increase our efficiency by spending less time on “keeping the lights on”. As an admitted “lazy DBA”, I can definitely appreciate time spent doing meaningful work instead of busywork.

The people in the SQL Community – many of whom I have not met face to face, but chat with in the Slack forums.  There are just too many to mention here, but I appreciate the regulars who share a joke, a neat way of doing things, ask questions that make me think, or answer questions I have because they’ve been there already.  If you’re in the Slack community – you’re likely in this group. Thank you. I appreciate you all taking part not just in general chit-chat, which is fun, but in helping everyone grow in their skills.

The folk in the PASS Prayers group. As a Christian, it’s great to have a group of people who will pray for each other. Yes, we have that in our local communities, but it’s great to have a group of people who share the same types of career and can join together with a bit more understanding about our day-to-day struggles. It’s also great to meet up with them when I actually get to attend the Summit.  Thanks to Mike Walsh and others for getting us all together.

I know there are groups that I’m leaving out and that’s because there are so many people who’ve been helpful in this journey. Thanks for taking the time to walk with me or guide me. I hope that I can provide that same assistance to others along the way.  And a hat-tip to Ewald Cress for a timely TSQL Tuesday Topic.

Handling Late-Arriving Members in SSIS

I was chatting with some other SQL Community members recently and the topic of handling late-arriving members for a warehouse load came up. I figured it was worth taking a couple of minutes to share the general concept I’ve been using to handle late-arriving members in SSIS. The steps are relatively straightforward if you’ve done much SSIS, but sometimes the little things can trip you up. The example below is based loosely on the WideWorldImporters database samples from MS.

The general steps are:

  1. Set up your source query.
  2. Pass the data through a Lookup for your Dimension with the missing results routed to a “No Match” output.
  3. Insert those “No Match” rows into your Dimension using a SQL task – checking to make sure that this particular row hasn’t already been inserted (this is important).
  4. Do another lookup using a “Partial Cache” to catch these newly-inserted members.
  5. Use a UNION ALL transform to bring the existing and late-arriving members together.

Inferred Member Data Flow

Sample code for the Inferred Insert:

DECLARE @CustomerID INT

SELECT @CustomerID = ?

IF NOT EXISTS (SELECT *
               FROM   WideWorldImportersDW.Dimension.Customer
               WHERE  [WWI Customer ID] = @CustomerID)
  INSERT WideWorldImportersDW.Dimension.Customer
         ([WWI Customer ID],
          Customer,
          [Bill To Customer],
          Category,
          [Buying Group],
          [Primary Contact],
          [Postal Code],
          [Valid From],
          [Valid To],
          [Lineage Key])
  VALUES ( @CustomerID,                                     -- WWI Customer ID - int
           N'Unknown:' + CAST(@CustomerID AS NVARCHAR(10)), -- Customer - nvarchar(100)
           N'',                                             -- Bill To Customer - nvarchar(100)
           N'',                                             -- Category - nvarchar(50)
           N'',                                             -- Buying Group - nvarchar(50)
           N'',                                             -- Primary Contact - nvarchar(50)
           N'',                                             -- Postal Code - nvarchar(10)
           SYSDATETIME(),                                   -- Valid From - datetime2(7)
           '9999-12-31 23:59:59.9999999',                   -- Valid To - datetime2(7)
           2                                                -- Lineage Key - int
  )

ER Diagram Software Options for SQL Server

I've seen this come up several times recently in the SQL Server Community. There are diagrams built into SSMS, but they're a bit limited in their usefulness: you can't easily break them down into sub-models or tweak them without affecting the underlying database. There are 3rd-party diagramming options for SQL Server, some free and some paid.

Simple-Talk ran an article calling for people to submit their favorite ERD software.

Database Answers has a page dedicated to modelling tools. I’m not sure when the page was last updated, but as far as I know the author is still maintaining the page as it points to some more recent offerings.

Database Answers also pointed to a Wikipedia page on Data Modelling Tools.

Robert Sheldon has a great article on Simple-Talk reviewing Online ER Software offerings. Vertabelo gets his top review, but others are mentioned.

 

I’ve used ER/Studio for years and generally prefer it to ErWin, though that is probably moot at this point seeing that Embarcadero bought ErWin. It’s a pretty expensive piece of software, though. Not everyone can afford the licenses needed to run it. I hope the pointers to some other lists will help people find the right tool for their needs.

SSRS, Data Sources, Stored Credentials, and Service Accounts

I was helping with the setup of a new SSRS instance on our servers recently and was puzzled when we tried to configure the data source. I wanted to set up a shared data source with a service account that would have the appropriate permissions. All reports were going to be automated – no direct user access. Security in that case wasn’t a huge concern. I pulled up the data source in the Report Manager, entered the credentials, and tested the connection, only to be given a “Login Failed” message.

I checked my username and password – both correct. I tried my credentials – they worked. I tried the service account again – failure. I verified the service account's permissions on the target server – correct. Finally, I dug into the Event Logs and noticed that the service account was requesting an interactive login when it failed. I played around with different groups and found that it would work if the account was in the Administrators group. Not wanting to leave it there, I removed it from that group and did some more digging.

I found this article on MSDN that indicated the accounts used for stored credentials needed both “Allow Log On Locally” and “Log On As a Batch Job” permissions.

I launched "secpol.msc" from a Run dialog and checked Local Policies -> User Rights Assignments. This showed me that the only group allowed to log on locally was the Administrators group, so I added this one service account. I also checked the groups allowed to log on as a batch job and verified that this account was included in one of the allowed AD groups. After setting this permission in the security policy, I tried the data source again, this time getting a "Success" message.

SSIS Catalog (SSISDB) Cleanup

I really appreciate what MS has done with the SSIS Catalog. We have built-in logging at a level that wasn't possible in prior releases, but that comes at a cost. The default retention is pretty high (365 days) and MS chose to handle cleanup using cascading deletes. This sort of coding makes life easier for the developers, but almost always performs poorly, especially when you have one parent row with hundreds of thousands of child rows related to it. The problem is compounded because the first time people realize they need to lower the retention limit is about the same time that the database is starting to fill up or has filled up. At that point, it's too late to lower the retention by a large amount because the corresponding delete will still cause issues.

I knew someone had written some code for this and a quick search on StackOverflow found this question and the answer from Bill Fellows ( b | t ) :
http://stackoverflow.com/questions/21781351/how-can-i-clean-up-the-ssisdb

I'm including the code here for a quick reference, though I recommend reading his answer, up-voting it, and checking out his other resources. The only change I'd consider making is perhaps clearing out the child tables in batches of 1,000-5,000 rows at a time instead of all rows. I found that I had to work in increments of no more than 10 days at a time to delete history; any more than that and the execution time increased quite a bit. Look at your max days currently retained, set the @DaysRetention variable to a couple of days less than that, and see what is removed. Keep dropping that value until you reach the point at which you want to stop, set your SSIS Catalog retention days to that value, and you should be good from that point on.

It’s probably worth noting that in SQL Server 2012, MS didn’t include proper indexing by default. You may need to add some indexes in order to have the delete perform in a timely manner and avoid excessive table scans. The index issue was addressed in SQL Server 2014 and higher (and I think in SQL Server 2012 SP2 as well).