Tuesday, 12 May 2015

Email query results from a SQL Server Agent Job step

I recently discovered sp_send_dbmail. Finally, I have an easy way to send the results of overnight scheduled tasks to my inbox. However, there were a few unexpected things that came up...

I am assuming that you have already set up Database Mail? If so, then please continue.

So let's imagine that this is your T-SQL job step. The query is trivial (but actually quite useful when learning about who is running your SQL Agent jobs; I'll leave that for another blog post though).

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'Default',
    @recipients = 'your@email.com',
    @subject = 'Sent from SQL Agent Job',
    @execute_query_database = 'DB_TO_RUN_QUERY_ON',
    @query = N'
        SELECT
             ORIGINAL_LOGIN() AS [ORIGINAL LOGIN]
            ,SUSER_SNAME()    AS [EFFECTIVE LOGIN]
            ,USER_NAME()      AS [DB USER]',
    @attach_query_result_as_file = 0;

Under the Advanced page of your job step, you need to set 'Run as user' to a user that is a member of the DatabaseMailUserRole database role in the msdb database. For ease of use, and to get things working, I chose 'dbo'. If you don't do this, the T-SQL will run under the SQL Agent service account - and by default, this won't be a member of the DatabaseMailUserRole role in msdb.

The next thing was to set the database drop-down list field to msdb.
I found this totally unexpected because I had already fully qualified the stored procedure with database and schema, i.e. msdb.dbo.
And no, it makes no difference if you put a 'USE msdb;' at the top of your T-SQL job step!
msdb MUST be selected in the database drop-down list.
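If your chosen 'Run as user' is not yet a member of that role, a sketch of the grant might look like the following. The user name here is hypothetical - substitute whichever msdb user you picked on the job step's Advanced page:

```sql
-- Hypothetical example: add an existing msdb user to the Database Mail role.
-- 'MyAgentProxyUser' is a placeholder, not a real account from this post.
USE msdb;
GO
ALTER ROLE [DatabaseMailUserRole] ADD MEMBER [MyAgentProxyUser];
GO
```

Note that ALTER ROLE ... ADD MEMBER is SQL Server 2012 and later; on 2008 you would use `EXEC sp_addrolemember 'DatabaseMailUserRole', 'MyAgentProxyUser';` instead.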

Tuesday, 21 April 2015

SSIS Custom Components and a Tale of Versions


I have been investigating SSIS custom components. These will allow non-programmers in my team to use SSIS components that perfectly fit our business data needs, e.g. custom data conversion.
I am assuming that you know how to make an SSIS custom component?

  1. Create a class library: 
    1. Reference some dlls from the SQL Server Client Tools SDK.
    2. Inherit from PipelineComponent.
    3. Override some methods.
    4. See this for a starting point: https://msdn.microsoft.com/en-us/library/ms135993.aspx
  2. Take your newly built custom component:
    1. Copy it into a SQL Server installation folder.
    2. Register it in the GAC.
  3. Open your SSIS package IDE and register the component/refresh your toolbox.
  4. Your SSIS toolbox will then hopefully show this component.
  5. Bish bash bosh!
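For step 1, a minimal skeleton might look like this. The class name and display name are hypothetical, and it assumes you have referenced the SDK assemblies listed further down this post:

```csharp
using Microsoft.SqlServer.Dts.Pipeline;            // PipelineComponent, DtsPipelineComponent
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;    // component metadata interfaces

// Illustrative sketch only - a real component needs column handling,
// validation, design-time metadata, and a strong name for the GAC.
[DtsPipelineComponent(DisplayName = "My Custom Transform",
                      ComponentType = ComponentType.Transform)]
public class MyCustomTransform : PipelineComponent
{
    public override void ProvideComponentProperties()
    {
        // Start from a clean slate, then add one input and one output.
        base.RemoveAllInputsOutputsAndCustomProperties();
        ComponentMetaData.InputCollection.New();
        ComponentMetaData.OutputCollection.New();
    }

    public override void ProcessInput(int inputID, PipelineBuffer buffer)
    {
        // Transform rows here as they flow through the buffer.
        while (buffer.NextRow())
        {
            // e.g. read, convert and write column values...
        }
    }
}
```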
Hmmm....


I had a few issues getting this to work though because I had so many environment version 'variables' to deal with. For example:
  • Architecture - 32bit or 64bit? (of SQL Server/.Net/gacutil)
  • .Net framework - 2.0, 3.5, 4.0, 4.5 or 4.51?
  • Gacutil version?
  • SQL Server - 2008 or 2014?
  • SSIS Version - Business Intelligence Development Studio (BIDS) or Data Tools for Visual Studio 2013?
What has made my situation more complicated is that I am running two versions of SQL Server: 2008 (for production) and 2014 (for development - getting ready for production to be upgraded). 
I am also running the 64-bit version of SQL Server on Windows 7, and so my installation folders are spread across Program Files and Program Files (x86). On top of this, when I installed 2014 I didn't include the Client Tools SDK feature - so for a while I was missing the latest DLLs. I wasn't even sure there were later DLLs, as my installation looked like it had everything...

On that note - you need to know about your SQL Server installation folders. Within the Microsoft SQL Server installation folder you will find lots of numbered folders. The folders relate to the versions of SQL Server installed. For example:
  • 80 = SQL Server 2000
  • 90 = SQL Server 2005
  • 100 = SQL Server 2008
  • 105 = SQL Server 2008 R2
  • 110 = SQL Server 2012
  • 120 = SQL Server 2014
These numbered folders can exist in both the Program Files and Program Files (x86) installation folders. However, just because you have the folder, it doesn't mean that you have any or all of the features installed for that version. Folders are created regardless, for backward compatibility reasons - e.g. access to older versions of DLLs & tools.

OK, now what?

Below are the DLLs that you need to reference from your custom component project. They come from the SQL Server Client Tools SDK; they are not listed under the .NET framework assemblies (in the Visual Studio 'Add Reference' dialog), so you will need to manually browse to the directory and add them.
  • Microsoft.SqlServer.DTSPipelineWrap.dll
  • Microsoft.SQLServer.DTSRuntimeWrap.dll
  • Microsoft.SQLServer.ManagedDTS.dll
  • Microsoft.SqlServer.PipelineHost.dll
With all this said, we have two routes for creating a custom component:

SQL Server 2005/2008 & BIDS

  • Reference dlls from:
    • C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies\
  • Copy your custom component dll to: 
    • C:\Program Files (x86)\Microsoft SQL Server\100\DTS\PipelineComponents\
  • Your custom component should target .Net 2.0, 3.0 or 3.5.
  • Use SQL Server Business Intelligence Development Studio to make your SSIS packages.
  • Ensure that the SQL Server 2005/2008 Client Tools SDK feature is fully installed.
  • Use gacutil from .NET 2.0, 3.0, 3.5 or higher to register your component DLL.
  • Add the component to the BIDS toolbox by right-clicking the toolbox and selecting 'Choose Items...'. Find your component under the 'SSIS Data Flow Items' or 'SSIS Control Flow Items' tab (depending on what type of component you have created). Check the component's checkbox. Click OK.

SQL Server 2012/2014 & Data Tools

  • Reference dlls from:
    • C:\Program Files (x86)\Microsoft SQL Server\120\SDK\Assemblies\
  • Copy your custom component dll to: 
    • C:\Program Files (x86)\Microsoft SQL Server\120\DTS\PipelineComponents\
  • Your custom component should target .Net 4.0 or higher.
  • Use SQL Server Data Tools for Visual Studio to make your SSIS packages.
  • Ensure that the SQL Server 2012/2014 Client Tools SDK feature is fully installed.
  • Use gacutil from .Net 4.0 or higher to register your component dll.
  • Add the component to the Data Tools toolbox simply by right-clicking the toolbox and selecting 'Refresh Toolbox'.
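As a concrete sketch of the deployment steps for the 2012/2014 route: the DLL name below is hypothetical, and the gacutil path assumes a default Windows SDK install - check the SDK folders on your own machine.

```batch
:: Copy the component to where SSIS looks for pipeline components
copy MyCustomTransform.dll "C:\Program Files (x86)\Microsoft SQL Server\120\DTS\PipelineComponents\"

:: Register it in the GAC with a .NET 4.x gacutil (exact path varies by SDK version)
"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\gacutil.exe" /if MyCustomTransform.dll
```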


Hope this helps you through a bad day!


Friday, 17 April 2015

Basic But Effective Backup Software


After a few years of using Vice Versa backup software, and attempting to use Acronis (urgh), I just discovered Windows Robocopy!

It's been built into Windows for a while (doh!) and works really well (hooray!)

I prefer to have two separate hard disks running on my home server: D for data and E for backup. Both were 2TB (until they died) and are now 4TB in size. I don't want mirroring because I don't want to automatically mirror any mistakes I might make. I don't like big, bulky and bloated backup software. I don't mind one-liner command prompts... I also like free stuff.

This is how I did it:

Put this in a batch file (on a single line):

robocopy D:\ E:\ /MIR /XD "D:\$RECYCLE.BIN" "D:\System Volume Information" /R:10 /W:10 /LOG+:"D:\RoboCopyLog.txt"

Where:

D:\ = source
E:\ = destination
/MIR = mirror the whole directory tree (note: this also deletes destination files that no longer exist on the source)
/XD = exclude directories
/R = number of retries if failure
/W = seconds to wait between retries
/LOG+ = append to log file
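Because /MIR will also delete files from the destination that no longer exist on the source, it's worth a dry run first. Adding the /L switch makes robocopy list what it would copy or delete without actually touching anything:

```batch
:: Same command as above, but /L = list only - nothing is copied or deleted
robocopy D:\ E:\ /MIR /XD "D:\$RECYCLE.BIN" "D:\System Volume Information" /R:10 /W:10 /L
```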

Run the batch file from Windows Task Scheduler, e.g. every night at 4am. It will need to run under an administrator account.

I have about 1.5TB of data to back up - it takes only approx 30 seconds to run if there were no changes. Sweet!

Cheap and effective. Hope it can help you...

Monday, 16 June 2014

Optimising Index Optimisations on SQL Server


So you have created a maintenance plan for your databases which will rebuild indexes? Nice.
Is it taking a long time? Hmmm... Not so nice.

How can you improve this? Firstly, sometimes you can and should avoid messing with your indexes. Blindly rebuilding all indexes all of the time is a bit 'brute force' - sometimes you can get away with just reorganising an index, and in other cases the index might not need any attention whatsoever. This is a more intelligent and optimised approach which could win you valuable time during a maintenance plan.
The SQL snippet below will 'intelligently' generate index-optimising DDL.

Remarks:
  1. Assumes that you know the difference between an index rebuild and an index reorganise.
  2. Avoid using DBCC DBREINDEX. It is obsolete and will be removed in a future version (although it is still present in SQL Server 2014). Instead, use ALTER INDEX.
  3. Using the Microsoft suggested fragmentation thresholds i.e. 
    1. <= 5% - don't do anything. The performance gains will be negligible relative to the cost of rebuilding or reorganising.
    2. > 5% and <= 30% - perform a reorganise
    3. > 30% - perform a rebuild
  4. Assumes you are optimising a single database - i.e. the database where this sp will execute.
  5. Allows accuracy to be tweaked via @statsMode parameter.
  6. Assumes that you want to optimise all tables
  7. Assumes you have a sysadmin role on the server.
  8. Assumes you have dbo or db_ddladmin role on database.
  9. Assumes a constant fill factor for all tables, e.g. 90%. It's a good starting default, but is it really the best setting for every index in your database? Probably not! Some tables will grow very fast and some won't grow at all, meaning 90% will not be enough, or will be overkill, respectively. Experiment by seeing how often you need to defrag each of your indexes. This suggests a further improvement for this SQL: automatically deciding good fill factors based on past fragmentation. Assuming you are regularly optimising indexes (e.g. every day or every week) and the database is growing at a fairly constant rate, you could use the fragmentation percentage as a guide to 'predicting' what the fill factor should be during the next index rebuild. (You can't set a fill factor during a reorganisation.)
  10. In real life, I run the generated DDL statements script during my overnight maintenance plan.
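On remark 10: if you prefer to generate and execute in one step rather than running the generated script by hand, a sketch like the following could work. The table-variable-and-cursor wrapper is my illustration here, not part of the stored procedure itself:

```sql
-- Illustrative wrapper: capture the generated DDL, then run each statement.
DECLARE @ddl NVARCHAR(MAX);
DECLARE @statements TABLE (id INT IDENTITY(1,1), stmt NVARCHAR(MAX));

-- INSERT ... EXEC captures the procedure's single-column result set.
INSERT INTO @statements (stmt)
EXEC [db_ddladmin].[spGenerateIndexOptimisationDDL] @statsMode = 'SAMPLED';

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT stmt FROM @statements WHERE stmt IS NOT NULL ORDER BY id;
OPEN c;
FETCH NEXT FROM c INTO @ddl;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @ddl;   -- run one ALTER INDEX statement at a time
    FETCH NEXT FROM c INTO @ddl;
END
CLOSE c;
DEALLOCATE c;
```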

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[db_ddladmin].[spGenerateIndexOptimisationDDL]') AND type IN (N'P', N'PC'))
    DROP PROCEDURE [db_ddladmin].[spGenerateIndexOptimisationDDL]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Joe
-- Create date: 16/06/2014
-- Description: Build appropriate DDL to optimise indexes
-- =============================================
CREATE PROCEDURE [db_ddladmin].[spGenerateIndexOptimisationDDL]
    @forceRebuild BIT = 0,
    @forceReorganise BIT = 0,
    @defaultFillFactor TINYINT = 90,
    @statsMode NVARCHAR(20) = 'LIMITED' -- LIMITED (least accuracy, least time) | SAMPLED | DETAILED (most accuracy, most time)
AS
BEGIN

    SET NOCOUNT ON;

    IF (@forceRebuild = 1)
    BEGIN
        SELECT
            'ALTER INDEX [' + i.name + '] ON [' + SCHEMA_NAME(t.schema_id) + '].[' + t.name + '] REBUILD WITH (FILLFACTOR = ' + CAST(@defaultFillFactor AS VARCHAR(3)) + ', SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON);'
        FROM sys.indexes AS i
        INNER JOIN sys.tables AS t ON t.object_id = i.object_id
        WHERE i.name IS NOT NULL;
    END
    ELSE IF (@forceReorganise = 1)
    BEGIN
        SELECT
            'ALTER INDEX [' + i.name + '] ON [' + SCHEMA_NAME(t.schema_id) + '].[' + t.name + '] REORGANIZE;'
        FROM sys.indexes AS i
        INNER JOIN sys.tables AS t ON t.object_id = i.object_id
        WHERE i.name IS NOT NULL;
    END
    ELSE
    BEGIN
        SELECT
            CASE
                WHEN s.avg_fragmentation_in_percent > 5.0 AND s.avg_fragmentation_in_percent <= 30.0
                    THEN 'ALTER INDEX [' + i.name + '] ON [' + SCHEMA_NAME(t.schema_id) + '].[' + t.name + '] REORGANIZE;'
                WHEN s.avg_fragmentation_in_percent > 30.0
                    THEN 'ALTER INDEX [' + i.name + '] ON [' + SCHEMA_NAME(t.schema_id) + '].[' + t.name + '] REBUILD WITH (FILLFACTOR = ' + CAST(@defaultFillFactor AS VARCHAR(3)) + ', SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON);'
                ELSE NULL
            END
        FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, @statsMode) AS s
        INNER JOIN sys.indexes AS i ON s.object_id = i.object_id AND s.index_id = i.index_id
        INNER JOIN sys.tables AS t ON t.object_id = i.object_id
        WHERE s.avg_fragmentation_in_percent > 5
        AND i.name IS NOT NULL;
    END
END;
GO

/* 
-- TESTING

EXEC [db_ddladmin].[spGenerateIndexOptimisationDDL] 
@forceRebuild = 0,
@forceReorganise = 0,
@defaultFillFactor = 90,
@statsMode = 'SAMPLED' -- LIMITED (least accuracy, least time) | SAMPLED | DETAILED (most accuracy, most time)
*/



References:
http://msdn.microsoft.com/en-gb/library/ms189858.aspx

Thursday, 15 May 2014

SQL Server Intellisense Not Working?

I'm using SQL Server 2008 - or to be exact (as sometimes these things make a difference... :)

Microsoft SQL Server 2008 (SP3) - 10.0.5512.0 (X64)   Aug 22 2012 19:25:47   Copyright (c) 1988-2008 Microsoft Corporation  Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)

With that said, you should check the following:

  1. With an open query in SSMS > Edit > Intellisense > Refresh Local Cache (Ctrl + Shift + R)
  2. SSMS > Tools > Options > Text Editor > Transact-SQL > General : > Statement completion:
    1. auto list members = checked
    2. parameter information = checked
  3. SSMS > Tools > Options > Text Editor > Transact-SQL > IntelliSense > Enable IntelliSense
  4. With an open query in SSMS > Query > Intellisense Enabled (should already be enabled because of point 3 but just in case...)
  5. With an open query in SSMS > Query > SQLCMD Mode should be disabled (if you want IntelliSense to work!). This is what caught me out recently and prompted this post. I had set my queries to open in SQLCMD mode by default, i.e. SSMS > Tools > Options > Query Execution > SQL Server > General > 'By default, open new queries in SQLCMD mode' = checked.
If it's still not working, then try the latest service packs, cumulative updates, an installation repair, or just turn it off and on again. Good luck. Happy sensing!

Wednesday, 12 October 2011

Visual Studio Not Showing GAC Assemblies

While using Visual Studio 2010 I recently noticed that I couldn't see references to the Microsoft.SqlServer set of DLLs.
I could see them if I went to C:\Windows\assembly.
They were definitely installed - and I could manually reference them by browsing to:
C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\

And in fact my project was using them...
Other websites suggested that they should be listed under the .Net tab when adding references.
I checked the registry:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework
and under AssemblyFolders I had the key:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\AssemblyFolders\SQL Server Assemblies - with the value: C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\

I then found this article:
See section "To display an assembly in the Add Reference dialog box"

So I tried adding this key to the registry:


HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319\AssemblyFoldersEx\SQL Server Assemblies
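As a .reg file, that key might look like this (the default value is the assemblies path from my machine; substitute your own):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319\AssemblyFoldersEx\SQL Server Assemblies]
@="C:\\Program Files\\Microsoft SQL Server\\100\\SDK\\Assemblies\\"
```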

Restarted Visual Studio, and now my .NET assemblies are back under the .NET tab...

UPDATE:

I have just installed the DirectX 9 SDK with managed code support... and hit the same problem. I was able to find this:
http://connect.microsoft.com/VisualStudio/feedback/details/464577/

"The underlying issue is that AssemblyFolders is not multi-targeting aware and so does not work well with our new multi-targeting features in VS 2010. AssemblyFoldersEx is version specific and so works with multi-targeting.
If you have a control that works with multiple versions of the .Net Framework you register it with the lowest version of the framework that it works with. So for example if you have a control that works with .Net 3.0 and up you would register it under the .Net 3.0 entry of AssemblyFoldersEx. It would then show up in the Choose Items and Add References dialog for projects targeting 3.0, 3.5 and 4.0 but not projects targeting 2.0. "

The libraries that I have been trying to install are not aware of this, and so just use the old registry location.

Wednesday, 3 November 2010

Debugging code started from the Global.asax Application_Start event when using IIS 7 integrated pipeline

The new integrated pipeline in IIS7 is good... I think!? What I mean is that I know it's good under the bonnet, but it doesn't really affect me much once my website is running. So let's assume that we have converted our web.config to use the integrated pipeline and everything is working fine... (this alone can take a while, but that's another blog post...)

Then one day you need to step into some code that occurs whenever the Application_Start event fires from Global.asax... You find that even though you have breakpoints in the code, they appear to be ignored and the application starts up. The trick to getting into this code is...

  1. Run the website from visual studio - allow it to start normally - your breakpoints will be ignored as usual.
  2. Edit the web.config file - just adding a space character and then re-saving will do - you don't actually have to change any settings. 
  3. Editing the web.config in any way will cause the application to stop.
  4. Make a request to the website e.g. request a page, web service etc. This will cause the application to start.
  5. Because we are ALREADY attached to the process with the visual studio debugger, we will now get to step into the application_start event. 
  6. debugging joy!
I believe the reason we can't step into Application_Start on a normal Visual Studio debug startup is that, by the time we have attached to the website process, the Application_Start event has already occurred.

This solution may not work if you are unable to make the website run in the first place, i.e. if the code in application_start doesn't allow the website to run because it is faulty/buggy etc.
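An alternative some people use when the attach timing is the problem (my suggestion, not part of the trick above): call System.Diagnostics.Debugger.Launch() at the top of Application_Start, which prompts you to attach a debugger before the rest of the event runs. A sketch:

```csharp
using System;
using System.Diagnostics;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Prompts to attach a debugger if none is attached.
        // Debug builds only - remove before releasing!
        if (!Debugger.IsAttached)
        {
            Debugger.Launch();
        }

        // ...normal start-up code follows...
    }
}
```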

Solutions that didn't work but looked good at the time...

  1. Visual Studio > Web site project property pages > Start Options > Start Action > Select "Don't open a page. Wait for a request from an external application" - I thought this would perhaps delay the first request to IIS and therefore delay Application_Start, meaning that when we do request a page our debugger would be attached. This doesn't work though, as the application will start regardless of this option. There must be another request sent from Visual Studio during the start-up of the debugger. To do: we should really investigate this by analysing the requests made to IIS when we start the debugger. There may be another setting?
  2. Recycling the application pool associated with the website you are debugging - unfortunately, although this does cause Application_Start to fire, it also causes Visual Studio to detach from the process! No stepping-in goodness to be had here :(