Tuesday, December 22, 2009

Backup Failures in OLAP

I've recently come across backup failures on an SSAS database with the below error...

"Memory error: Allocation failure : Not enough storage is available to process this command."

Thinking the errors over, I believe they can be worked around, at least to a degree.

In AS there is a setting called MemoryLimitErrorEnabled that controls whether an error is raised when the estimated memory needed for an operation can't be granted. See this TechNet article for more in-depth info.

I'm pretty sure this is causing at least some of the errors. Since the check is for estimated memory usage only, there shouldn't be any detrimental effect to disabling it for backups (presumably you won't be processing cube dimensions while taking backups). The worst case is the backup will just fail a bit later when it actually runs out of memory; at best the backup will succeed & not require manual intervention or a restart.

The plan would be to

1. set MemoryLimitErrorEnabled to false

2. take a backup

3. set MemoryLimitErrorEnabled back to true

Setting MemoryLimitErrorEnabled does not require a restart to take effect & can be scripted.
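For reference, steps 1 & 3 can be scripted as an XMLA Alter against the server object. The sketch below is a rough illustration only: the server ID is a placeholder & the exact property name should be verified against your instance's msmdsrv settings before use (0 disables the check, 1 re-enables it).

```xml
<Alter ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object />
  <ObjectDefinition>
    <Server>
      <ID>YourSSASServer</ID>
      <Name>YourSSASServer</Name>
      <ServerProperties>
        <ServerProperty>
          <Name>MemoryLimitErrorEnabled</Name>
          <Value>0</Value>
        </ServerProperty>
      </ServerProperties>
    </Server>
  </ObjectDefinition>
</Alter>
```

Run one Alter with Value 0 before the backup step & one with Value 1 after it.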

I've tested this method on a Development server & it worked perfectly.

Quick update: on implementing this solution I realized that the workflow of the SQLAgent job would need to change. The workflow is designed to always leave MemoryLimitErrorEnabled set to true regardless of whether the backup step succeeds or fails. Please forgive the roughness of the document, but I don't have access to Visio.

Tuesday, October 20, 2009

SQLSaturday 21 - Encrypting SSIS Connection Strings

I had the chance to speak at the recent SQLSaturday in Orlando on Encrypting SSIS Connection Strings. Many thanks to Andy Warren for giving me the opportunity.

As always the day was top notch all the way. Andy & team put on a great event.

I've commented the code used for the demo (it's fully functional, just needs the passwords & file path for the master key backup changed) with enough information to implement the solution & placed it here.

Thanks again to Andy & team for a great event. Also a special thanks for all those who attended my session & restrained from any rotten fruit throwing :-).

Wednesday, October 7, 2009

Finding Linked Servers with Open Mappings to SA

Recently I've been working on implementing PBM (Policy Based Management) in our environment. In addition to using the predefined conditions & facets, I've been creating custom Policies using ExecuteSql().

As part of this effort we wanted a policy to alert us to Linked Servers with an open mapping to SA. Since this isn't really obvious, it seemed worthy of a quick post. There are 2 versions of the code, one for SQL2000 & one for SQL2005. The code is written with PBM in mind, so it parses as one string; you can take this code out of PBM & it will still work just fine. The code is here.

Don't forget SQLSaturday #21 in Orlando is less than 2 weeks away.

Friday, September 4, 2009

SQLSaturday #21 is Coming to Orlando...

...On October 17th.

For anyone who's not familiar with SQLSaturday, it's a free (a $10 fee for lunch is being charged this year due to funding challenges) all day event for SQL pros & those wanting to expand their SQL skills. There's a wide variety of sessions throughout the day, organized in "tracks". For example, there's a track for Administration, which focuses more on, well, the administrative aspects of working with SQL, & a track on SSIS, which focuses on ETL. The sessions cover a broad range of depth; there are sessions suitable for beginners all the way through advanced users.

I had the privilege of attending the very first SQLSaturday & several more since then & I can't recommend it highly enough. Top notch speakers (for example Andy Warren, Brian Knight, Buck Woody & Kendal Van Dyke to name a few), lots of info, lots of fun & you certainly can't beat the cost. In today's economic climate, with training budgets being slashed or eliminated this is a great way to pick up high quality training for the cost of lunch.

In addition to the free event, EndToEnd Training is also presenting 5 full day seminars the week leading up to SQLSaturday. Each seminar is $149 with a 20% discount to those that attend 2 or more sessions.

<shameless plug>
This year I'll be presenting a 15 minute session on Encrypting SSIS Connection Strings.
</shameless plug>

So, get signed up & plan on being in Orlando on the 17th. Trust me, you'll thank yourself.

Thursday, August 27, 2009

Checking SQL & SSIS Versions with Powershell

I'm currently in the process of applying SP3 + CU5 to all our SQL2005 Instances. I wanted to be able to quickly check whether the install was successful (we're experimenting with using SCCM to push to our standalone Instances).

Checking SQL was easy enough; just using @@version in a multi-query was enough. However, SSIS is a little tougher: you can't check it via T-SQL, and I don't have the patience to log in to every machine and check it that way.

The solution I came up with uses SERVERPROPERTY('ProductVersion') to check the SQL build & [system.diagnostics.fileversioninfo]::GetVersionInfo() to check the Product version on the SSIS executable.

It returns a table with the Instance, Version, SQL & SSIS Builds and Notes (for example whether SSIS is installed & if installed do the SQL & SSIS Builds match). The code is here.
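The "do the Builds match" part of the Notes column is ultimately just a numeric comparison of dotted build strings (a plain string comparison would treat "9.00.4230" & "9.0.4230" as different). Here's an illustrative sketch of that piece of the logic in Python; the build strings shown are examples, not actual output from the script.

```python
def builds_match(sql_build, ssis_build):
    """Compare dotted version strings numerically, so '9.0' equals '9.00'."""
    parts = lambda v: [int(p) for p in v.split(".")]
    return parts(sql_build) == parts(ssis_build)

# SQL reports 4-part builds; the SSIS exe may zero-pad parts differently
print(builds_match("9.00.4230.00", "9.0.4230.0"))    # True - same build
print(builds_match("9.00.4230.00", "9.00.3042.00"))  # False - SSIS behind
```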

Thursday, August 20, 2009

Finding Event IDs

A while back, I had the need to look for a particular EventID across all our SQL servers. The script I came up with to do the work is here.

This was one of my earliest efforts so it's a bit primitive, but it works just fine. The main working portion of the script is a straightforward query of the Win32_NTLogEvent WMI class.

The fun part was converting the WMI time into something more usable/readable. I did this in a function that breaks the WMI time into discrete chunks & then strings them all back together in my preferred format. Additionally it modifies the new date/time to account for daylight saving time.
Function Convert-WMITime
{
    param([string]$WMITime)

    #Break Date & Time into discrete elements
    $ds = [string]$WMITime.Substring(21,4)   #UTC offset in minutes, e.g. "-240"
    $yr = [string]$WMITime.Substring(0,4)
    $mo = [string]$WMITime.Substring(4,2)
    $dy = [string]$WMITime.Substring(6,2)
    $tm = [string]$WMITime.Substring(8,6)
    $hr = [string]$tm.Substring(0,2)
    $min = [string]$tm.Substring(2,2)
    $sec = [string]$tm.Substring(4,2)
    #Create string in desired format
    $s = "$mo/$dy/$yr " + $hr + ":" + $min + ":" + $sec
    #Cast result as DateTime type
    $result = [DateTime]$s
    #Account for DST (offset hard-coded for our Eastern time zone)
    if ($ds.Contains("-240"))
        {$result = $result.AddHours(-1)}
    return $result
}
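For anyone who wants to sanity-check the slicing outside Powershell, here's an illustrative re-implementation of the same parsing logic in Python (not the script itself; the hard-coded "-240" adjustment is carried over as-is):

```python
from datetime import datetime, timedelta

def convert_wmi_time(wmi_time):
    """Parse a WMI datetime like '20090827143000.000000-240'.
    The first 14 chars are yyyymmddHHMMSS; chars 21+ are the UTC offset in minutes."""
    result = datetime.strptime(wmi_time[:14], "%Y%m%d%H%M%S")
    offset = wmi_time[21:]            # e.g. '-240'
    if offset == "-240":              # mirror the script's hard-coded DST adjustment
        result -= timedelta(hours=1)
    return result

print(convert_wmi_time("20090827143000.000000-240"))  # 2009-08-27 13:30:00
```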

Friday, August 14, 2009

Get Files, Sizes & Percent Free by DB

This week I've been working on our space needs for the upcoming budget year. As part of that effort I spent a few minutes writing a script to return information on database files, including file group, logical name, physical name, allocated size, used size & percent free space for each file. The code is here.

In the next few weeks I'm planning to do a write up on the entire process I used to generate my space requests. It involves several pieces, including a process to populate database size (this is not mine).

Monday, August 3, 2009

Generating log file shrinks

We had a situation where we needed to shrink all log files for all databases on a couple of drives preparatory to moving all the files to a new LUN which would then be mounted as the same drive letter. This was being done to correct improperly aligned partitions on the drives causing performance issues.

When disk partitions are not properly aligned there is a significant performance degradation (30-40%) on the disk; see Chad Miller's blog post on this for more detail. Additionally, correcting this reduces the I/O load on the SAN array, which can benefit overall performance of the array.

I don't normally recommend (and would typically strongly oppose) shrinking files, but in this case we had a limited time frame to make the move, & shrinking these files meant 150GB less data to move, which was make or break for the timeline we were given to complete the work.

Since these were all log files I chose to shrink them all to 1MB for maximum benefit. For the sake of paranoia (I like to double/triple/quadruple check scripts prior to running in Production) I had the script generate the statements to run & then ran them manually. To customize for your drives you would just change the T-SQL defined in the @sql variable. The script is here.
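To illustrate the generate-then-review approach (this is a hedged sketch, not the actual script; the database & logical file names below are invented), the generation step boils down to building DBCC SHRINKFILE statements from a list of log files:

```python
def make_shrink_statements(log_files, target_mb=1):
    """Generate DBCC SHRINKFILE statements for manual review before running."""
    return [
        f"USE [{db}]; DBCC SHRINKFILE (N'{logical_name}', {target_mb});"
        for db, logical_name in log_files
    ]

for stmt in make_shrink_statements([("SalesDB", "SalesDB_log"), ("HRDB", "HRDB_log")]):
    print(stmt)
# USE [SalesDB]; DBCC SHRINKFILE (N'SalesDB_log', 1);
# USE [HRDB]; DBCC SHRINKFILE (N'HRDB_log', 1);
```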

Monday, July 27, 2009

Simple-Talk article

The folks at Simple-Talk were kind enough to give me an opportunity to write an article for them about Powershell, Red-Gate SQL Compare & Snapshots.
The result is here, please let me know what you think by voting. Thanks!

Friday, July 17, 2009

Granting use of Graphical Execution Plan in SQL2005

I recently had the need to allow non-sysadmin, non-db_owner users to use the graphical execution plan in SQL2005 on a query using tempdb (temp table). Assuming this worked just like "SET SHOWPLAN_* ON", I granted the user Showplan on the DB in question.

To test, I created a user with access to only the DB in question & tested "SET SHOWPLAN_TEXT ON", "SET SHOWPLAN_XML ON" & "SET SHOWPLAN_ALL ON" with no issues. However, when I tried to use the graphical execution plan ("Include Actual Execution Plan" option on the query) it failed with the below error...
Msg 262, Level 14, State 4, Line 29
SHOWPLAN permission denied in database 'tempdb'.

So, for a "normal" user who uses the graphical execution plan it seems my only option is to explicitly add them to tempdb & grant Showplan permissions (this would entail adding them to model as well so their perms persist on startup when tempdb is recreated). The other option would be to grant Showplan perms to guest on model/tempdb which given the official & observed (for example) vulnerabilities associated with SHOWPLAN seems a little crazy.

I don't normally knock MS around too much, but I really dislike both these options. Basically I'm being forced to choose between a gaping security hole (the guest scenario), a smaller security hole + a (to me) bad practice of granting a user access to a system DB that they really shouldn't need OR denying the perms to a user who's actually trying to do right.

Wednesday, July 15, 2009

Using Powershell to gather NIC info

In the course of troubleshooting a performance issue (in part caused by replication latency), we noticed that the NIC on our SQL server had been changed from Full Duplex to Half Duplex. Needless to say, as soon as we switched back to Full Duplex our latency dropped back to what we'd come to expect. This was reinforced by the behavior we observed in the replication latency: transactions were being delivered in bursts, which lines up with the one-way communication of Half Duplex.

As an outgrowth of this, I thought it would be prudent to check the rest of our environment to make sure we didn't have any more mis-configured NICs. The majority of our Prod SQL Instances are clustered & have 2 NICs, one for the Heartbeat network & one for the Public network. Per MS best practices for Clustered SQL, the Heartbeat NIC is set to "100 Mb Full" & the Public NIC is set to "1000 Mb Full", so differentiating between the NICs is important.

My first thought was to use the Win32_NetworkAdapter class, which has properties for speed & duplex; unfortunately, looking at MSDN it appears those properties haven't been implemented yet (this was verified in my testing). After viewing an excellent post on the topic by Hugo Peeters, I decided his method would work with a few small changes. My main gripe about this is the fact that MS forces us to (yet again) go to the registry for information that should be in WMI. Since the registry path changes by vendor (and in some cases model number) you'll need to modify your version of the script to account for the NIC vendors/models in your environment.

The script is here. I was not able to get it to show in a usable format on the page, so we'll try this instead.

Tuesday, June 16, 2009

Fun with Powershell & Red-Gate SQL Compare

Last week I was asked to help out on a project where a group of users that didn't have access to Production needed to be able to perform schema compares of Dev <-> Prod DBs. We currently use Red-Gate SQL Compare (which is a great product, BTW), so a solution was designed around it. The solution decided on was to use SQL Compare to generate Snapshots of Prod nightly that could then be compared against without the need for access to Prod (Note: a SQL Compare Snapshot is a flat file which contains all information needed to perform a complete schema compare, but no actual data).

We wanted the process that generated the Snapshots to run automatically every night against all online, non-system DBs on all Prod SQL Instances, excluding any DBs reserved for DBA usage only. The process should also be able to be run manually for an individual DB, one Instance or a group of SQL Instances. My final solution involved 2 powershell scripts & a scheduled job: one script actually creates the Snapshots & one script runs 4 iterations of the create script in parallel (more on this later). There are a few considerations to take into account when using this method. SQL Compare 8.0 is required to create Snapshots from the command line & to compare against the Snapshots. There is a free utility from Red-Gate that allows for command line creation of Snapshots (they are also usable by SQL Compare 7.0 and higher), which can be found here: http://www.red-gate.com/downloads/labs/RedGate.SQLSnapper.zip. In our case we needed some of the features offered by Professional Edition, so we went that route.

The automated steps look like this...

  1. Scheduled job "Create RG Snaps" runs...
  2. PWS Script "Run-RGSnaps.ps1" which makes 4 parallel calls to...
  3. PWS Script "Create-RGSnaps.ps1" which save the snaps to... \\server\FOLDER\RedGate_Snaps\ in...
  4. SERVER$INSTANCE.DB naming format
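The SERVER$INSTANCE.DB naming in step 4 is just the instance name with its backslash swapped for a dollar sign (so it's legal in a file name). A quick Python sketch of the transform, with hypothetical names:

```python
def snap_file_name(instance, database):
    """Convert 'SERVER\\INSTANCE' + db into the SERVER$INSTANCE.DB.snp file name."""
    return instance.replace("\\", "$").upper() + "." + database + ".snp"

print(snap_file_name("SERVER1\\INSTANCE1", "yourdb"))  # SERVER1$INSTANCE1.yourdb.snp
```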

There are 3 methods by which the PWS Script "Create-RGSnaps.ps1" can be run manually.

  • One Database
      1. Open Powershell
      2. Navigate to the scripts folder: issue a "sl C:\Script" (no quotes) command
      3. To run against yourdb on SERVER1\INSTANCE1, for example, issue a "./Create-RGSnaps.ps1 SERVER1\INSTANCE1 yourdb" (no quotes) command. Output will look like the below...

SQL Compare Command Line V8.0.0.965
==============================================================================
Copyright (c) Red Gate Software Ltd 1999-2009
Serial Number: 012-345-678901-2345
Creating database snapshot file '\\SERVER\FOLDER\RedGate_Snaps\SERVER1$INSTANCE1.yourdb.snp' from database 'yourdb' on 'server1\instance1'...


  • One Instance
      1. Open Powershell
      2. Navigate to the scripts folder: issue a "sl C:\Script" (no quotes) command
      3. To run against SERVER1\INSTANCE1, for example, issue a "./Create-RGSnaps.ps1 SERVER1\INSTANCE1" (no quotes) command. Output will look the same as above...

  • Multiple Instances
      1. Open Powershell
      2. Navigate to the scripts folder: issue a "sl C:\Script" (no quotes) command
      3. To run against SERVER1\INSTANCE1 & SERVER2\INSTANCE2, for example, create a file named Instances.txt (or a name of your choice) in "C:\Script" that contains entries for SERVER1\INSTANCE1 & SERVER2\INSTANCE2 (one line per instance)
      4. Issue a "Get-Content Instances.txt | ForEach {./Create-RGSnaps.ps1 $_}" (no quotes) command. Output will look the same as above...

The script that does the work, "Create-RGSnaps.ps1", takes a required parameter of Instance & an optional parameter of Database. If a value for Database is passed the script calls the command line interface for SQL Compare with no further processing. If no value for Database is passed the script parses all DBs on the Instance that are active/accessible & calls SQL Compare for each DB. The code is below...

param($Instance = $(throw 'Instance is required.'), $Database)

$ErrorActionPreference = "SilentlyContinue"

$Path = "\\SERVER\FOLDER\RedGate_Snaps\"
$Query = "SELECT name FROM dbo.sysdatabases (NOLOCK) WHERE dbid > 4 AND name NOT IN ('distribution','DBADatabase') AND has_dbaccess(name) = 1"

#########################
function Get-SQLData
{
    param($Instance = $(throw 'Instance is required.'), $DBName = $(throw 'Database is required.'), $Query = $(throw 'Query is required.'))

    Write-Verbose "Get-SqlData Instance:$Instance Database:$DBName Query:$Query"

    $ConnString = "Server=$Instance;Database=$DBName;Integrated Security=SSPI;"
    $DataAdapter = New-Object "System.Data.SqlClient.SqlDataAdapter" ($Query,$ConnString)
    $DataTable = New-Object "System.Data.DataTable"
    [void]$DataAdapter.Fill($DataTable)
    $DataTable
} #Get-SQLData

#########################
function Get-Databases
{
    param($Instance = $(throw 'Instance is required.'))

    Get-SqlData $Instance 'master' $Query
} #Get-Databases

#########################
function Invoke-SqlCompare
{
    param($Instance = $(throw 'Instance is required.'), $Database = $(throw 'Database is required.'), $Path = $(throw 'Path for Snap files required.'))

    $InstanceStrng = ($Instance -replace '\\','$').ToUpper()
    $Command = '&"C:\Program Files (x86)\Red Gate\SQL Compare 8\sqlcompare.exe" /Server1:' + $Instance + ' /database1:"' + $Database + '" /makesnapshot:"' + $Path + $InstanceStrng + '.' + $Database + '.snp" /force'

    Invoke-Expression $Command
} #Invoke-SqlCompare

#########################
#Main
If ($Database)
    {Invoke-SqlCompare $Instance $Database $Path}
Else
    {Get-Databases $Instance | ForEach {Invoke-SqlCompare $Instance $_.name $Path}}

The script that runs the Create script in parallel, "Run-RGSnaps.ps1", was written by my boss (his blog is here: http://chadwickmiller.spaces.live.com/default.aspx) & was only modified by me to meet my needs. Since in Powershell 1.0 there's no way to spawn background processes, the Run script launches new PWS threads until it reaches the number defined as its max, & then checks every 15 seconds for fewer than the max threads (excluding itself); if there are fewer than the defined max, it spawns a new thread. I take no credit for his excellent work, I just want to include it for completeness. The script takes no parameters; it has a built-in function to get a list of Prod Instances we have defined in a DB, pulls back the ones marked active & passes them to the create script to run in parallel.

I tested running serially and with 4 & 8 iterations in parallel with nothing else running on the host server. The best balance between performance & resource utilization was at 4 parallel iterations (your results will probably vary). Serially, resource usage was minimal: we saw 8% CPU, & 447 DBs across 26 Instances completed in 56 minutes on average. At 4 parallel iterations we saw 33% CPU & completion of the same DBs/Instances took 15 minutes on average; at 8 parallel iterations we saw 85% CPU & completion took 12 minutes on average. The server we're running this on has 4 single core procs; my suspicion is that you can run as many threads as you have processor cores. Code is below...

$ScriptRoot = "C:\<yourdir>\Script\"
$SQLPSXServer = "SERVER3\INSTANCE3"
$SQLPSXDatabase = "DB3"
$Query = "SELECT <column> FROM dbo.<table> (NOLOCK) WHERE IsEnabled = 'true'"
$MaxThread = 4
$ServerList = New-Object System.Collections.ArrayList

#########################
function Get-SQLData
{
    param($Instance = $(throw 'Instance is required.'), $DBName = $(throw 'Database is required.'), $Query = $(throw 'Query is required.'))

    Write-Verbose "Get-SqlData Instance:$Instance Database:$DBName Query:$Query"

    $ConnString = "Server=$Instance;Database=$DBName;Integrated Security=SSPI;"
    $DataAdapter = New-Object "System.Data.SqlClient.SqlDataAdapter" ($Query,$ConnString)
    $DataTable = New-Object "System.Data.DataTable"
    [void]$DataAdapter.Fill($DataTable)
    $DataTable
} #Get-SQLData

########################
function Get-SqlList
{
    param ($SQLPSXServer = $(throw 'SQLPSX Instance is required.'), $SQLPSXDatabase = $(throw 'SQLPSX Database is required.'), $Query = $(throw 'Query is required.'))

    Get-SqlData $SQLPSXServer $SQLPSXDatabase $Query | ForEach {$ServerList.Add($_.Server) > $null}
} #Get-SqlList

########################
function LaunchThread
{
    param($SQLServer)

    $StartInfo = new-object System.Diagnostics.ProcessStartInfo
    $StartInfo.FileName = "$pshome\powershell.exe"
    $StartCommand = $ScriptRoot + "Create-RGSnaps.ps1 " + $SQLServer
    $StartInfo.Arguments = " -NoProfile -Command " + $StartCommand
    $StartInfo.WorkingDirectory = $ScriptRoot
    $StartInfo.LoadUserProfile = $true
    $StartInfo.UseShellExecute = $true
    [System.Diagnostics.Process]::Start($StartInfo) > $null
} #LaunchThread

########################
#Main
Get-SqlList $SQLPSXServer $SQLPSXDatabase $Query

While ($ServerList.Count -gt 0)
{
    If ($(Get-Process | where {$_.ProcessName -eq 'Powershell' -and $_.Id -ne $PID} | Measure-Object).count -lt $MaxThread)
    {
        $Server = $ServerList[0]
        LaunchThread $Server
        $ServerList.Remove("$Server")
    }
    Else
    {
        [System.Threading.Thread]::Sleep(15000)
    }
}

Friday, May 22, 2009

Checking for a Service in Powershell

Recently I had the need to check whether a service was installed across all our SQL servers. In the interests of efficiency I thought it would be fun to write a little powershell script to do the checking for me.

Here's what I came up with. Just pass the service name & the machine(s) you want (either singly or in a *.txt file) & it will spit out a list with the machine & the friendly service name (if installed), or whatever you choose for those where it's not installed (I chose blank).

The cmd would look like...
./Check-ServiceInstalled SERVICENAME "C:\Lists\Servers.txt"

The Code (using Registry)...
$SvcName = $Args[0]
$ServerName = $Args[1]

$ErrorActionPreference = "SilentlyContinue"

$RegPath = "SYSTEM\CurrentControlSet\Services\" + $SvcName

if ($ServerName -like "*.txt*")
    {$Servers = Get-Content $ServerName}
else
    {$Servers = $ServerName}

ForEach ($Computer in $Servers)
{
    $Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine',$Computer)
    $RegKey = $Reg.OpenSubKey($RegPath)
    if ($RegKey -eq $null)
    {
        new-object psobject |
        add-member -pass NoteProperty Computer $Computer.ToUpper() |
        add-member -pass NoteProperty Installed ""
    }
    else
    {
        new-object psobject |
        add-member -pass NoteProperty Computer $Computer.ToUpper() |
        add-member -pass NoteProperty Installed $RegKey.GetValue('DisplayName')
    }
}

The Code (using gwmi & service)...
$SvcName = $Args[0]
$ServerName = $Args[1]

if ($ServerName -like "*.txt*")
    {$Servers = Get-Content $ServerName}
else
    {$Servers = $ServerName}

ForEach ($Computer in $Servers)
{
    $Svc = Get-WmiObject win32_service -ComputerName $Computer -ErrorAction SilentlyContinue | where {$_.name -eq $SvcName}
    if ($Svc -eq $null)
    {
        new-object psobject |
        add-member -pass NoteProperty Computer $Computer.ToUpper() |
        add-member -pass NoteProperty Installed ""
    }
    else
    {
        new-object psobject |
        add-member -pass NoteProperty Computer $Computer.ToUpper() |
        add-member -pass NoteProperty Installed $Svc.DisplayName
    }
}

Thursday, May 14, 2009

Determining Duration

Recently I had the need to determine the duration of a piece of code. After playing around with a few methods I came up with the below. I did a search & didn't see a lot out on the net about determining duration, so I thought I'd post this.

declare @startdt datetime,
@enddt datetime,
@diff int

set @enddt = '2009-05-12 15:27:29.560'
set @startdt = '2009-05-12 15:27:26.560'

set @diff = (datediff(s, @startdt, @enddt))

select right('00' + convert(varchar(30),@diff/3600),2) + ':' + right('00' + convert(varchar(30),(@diff%3600)/60),2) + ':' + right('00' + convert(varchar(30),@diff%60),2) as [Duration]
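The same zero-padded hh:mm:ss arithmetic translates directly into other languages; here's an equivalent sketch in Python for comparison:

```python
def format_duration(seconds):
    """Render an integer number of seconds as zero-padded hh:mm:ss."""
    hrs, rem = divmod(seconds, 3600)
    mins, secs = divmod(rem, 60)
    return f"{hrs:02d}:{mins:02d}:{secs:02d}"

print(format_duration(3))     # 00:00:03 - the 3-second gap in the T-SQL example
print(format_duration(3725))  # 01:02:05
```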

Wednesday, April 29, 2009

More fun with the February 2009 SQL Security Patch

On one of our clustered instances, we had to leave the services running for the February 2009 SQL Security Patch install to initiate successfully. We did not investigate the root cause since it was in the middle of a very busy period of work & we unfortunately didn't have the time. After applying the patch, we started getting complaints from users that their application was unable to login to the DB. Looking at the SQL error logs we saw the following errors...

Login failed for user 'Fake_User'. The user is not associated with a trusted SQL Server connection. [CLIENT:]
Error: 18452, Severity: 14, State: 1.

The login in question is a SQL login, not Windows, so this made no sense to us. On parsing the SQL error logs we noticed this (the instance had been running in MIXED mode)...

Authentication mode is WINDOWS-ONLY.

We changed the instance back to MIXED (SQL Server and Windows Authentication) mode & that resolved the issue. Thought I'd share this since it made for an interesting few minutes.

Friday, April 3, 2009

Searching for a piece of text across a DB

A while back I had the need to search for a specific piece of text across a database. After playing with the code I came up with the below stored proc. I placed the SP in our "Scripting/Administration" DB. It takes params of database & the value to find; you can use wildcards in the value to find. The SP returns the table, column & number of occurrences of the searched value.


CREATE PROC dbo.usp_SearchTextAcrossDB
    @DBToSearch VARCHAR(512),
    @ValToFind VARCHAR(512)
AS
/*  --------------------------------------------------------
    Author  :
    Date    :
    Purpose : Search across all Tables/Columns in any DB for a text value.
              Wildcards may be used in @ValToFind.
    DB(s)   : DBAScriptDB, all DBs
    Modification History
    Date        Programmer      Description
    --------    -------------   ----------------
    --------------------------------------------------------  */
    DECLARE @errTxt VARCHAR(256),
            @objName VARCHAR(192),
            @own VARCHAR(384),
            @tbl VARCHAR(384),
            @col VARCHAR(384),
            @sqlSrch VARCHAR(8000),
            @sqlFld VARCHAR(8000)

    CREATE TABLE #Results (
            Table_Name VARCHAR(384),
            Column_Name VARCHAR(384),
            Occurrence_Count INT)

    CREATE TABLE #TblFlds (
            TblOwner VARCHAR(384),
            TblName VARCHAR(384),
            ColName VARCHAR(384))

    SELECT @objName = 'dbo.usp_SearchTextAcrossDB'

    SELECT @sqlFld =
        'INSERT INTO #TblFlds (TblOwner, TblName, ColName)' + CHAR(10) +
        '    SELECT u.name, o.name, ''['' + c.name + '']''' + CHAR(10) +
        '    FROM ' + @DBToSearch + '.dbo.syscolumns c (NOLOCK)' + CHAR(10) +
        '        INNER JOIN ' + @DBToSearch + '.dbo.sysobjects o (NOLOCK) ON c.id = o.id' + CHAR(10) +
        '        INNER JOIN ' + @DBToSearch + '.dbo.sysusers u (NOLOCK) ON o.uid = u.uid' + CHAR(10) +
        '        INNER JOIN ' + @DBToSearch + '.dbo.systypes t (NOLOCK) ON c.xusertype = t.xusertype' + CHAR(10) +
        '    WHERE o.type = ''U''' + CHAR(10) +
        '        AND t.name IN (''varchar'', ''char'', ''nvarchar'', ''nchar'', ''text'', ''ntext'', ''sysname'')' + CHAR(10) +
        '    ORDER BY 1, 2'

    EXEC (@sqlFld)

    IF @@ERROR <> 0
    BEGIN
        SET @errTxt = 'Exception: ' + @objName + ' occurred in SELECT Statement'
        RAISERROR(@errTxt, 1, 2)
    END

    WHILE (SELECT COUNT(TblOwner) FROM #TblFlds) > 0
    BEGIN
        SELECT @own = (SELECT TOP 1 TblOwner FROM #TblFlds)
        SELECT @tbl = (SELECT TOP 1 TblName FROM #TblFlds WHERE TblOwner = @own)
        SELECT @col = (SELECT TOP 1 ColName FROM #TblFlds WHERE TblOwner = @own AND TblName = @tbl)

        DELETE #TblFlds
        WHERE TblOwner = @own
            AND TblName = @tbl
            AND ColName = @col

        IF @col IS NOT NULL
        BEGIN
            SET @sqlSrch = 'SELECT ''' + @own + '.' + @tbl + ''', ''' + @col + ''', COUNT(*) FROM ' + @DBToSearch + '.' + @own + '.' + @tbl + ' (NOLOCK) WHERE ' + @col + ' LIKE ''' + @ValToFind + ''''
            INSERT INTO #Results (Table_Name, Column_Name, Occurrence_Count)
                EXEC (@sqlSrch)
        END
    END

    SELECT Table_Name, Column_Name, Occurrence_Count
    FROM #Results
    WHERE Occurrence_Count > 0
    ORDER BY 1

    DROP TABLE #TblFlds, #Results
GO


Tuesday, March 10, 2009

Proactive File Deletion

One of the things I've been trying to accomplish lately is automating as much of the repetitive "DBA 101" type work as possible. Among other things, freeing up this time allows me to concentrate my efforts in areas where a DBA's input adds greater value, like performance tuning or security auditing. It also greatly reduces the chance of making mistakes; I'm sure we've all had those "execute... whoops!" moments that just make you feel like a fool, or even worse, make you look like a fool in front of your boss.

Recently I took a look at our drives & noticed 2 main areas where I thought a scripted solution made sense and was safe (this will vary according to your environment's needs): deleting old Red-Gate SQLBackup log files & deleting old backup files. I'm a big fan of Red-Gate, so I don't want this taken as a jab against them in any way. That being said, the Red-Gate GUI places log files in "C:\Documents and Settings\All Users\Application Data\Red Gate\SQL Backup\Log\" by default with no retention policy. You can set a retention on these logs, but you have to do it on a case by case basis & frankly I'm too efficient (read: lazy) to do it this way, plus I tend to forget to change the setting.

We store all our backups in a specific directory named sqlbu (name changed to protect the innocent), so my backup deletion script is geared to look for that directory on all drives & only delete from that location. The backup delete script also has a few filters that will allow backup files to be kept longer than the days to keep indicated. Specifically, it will not delete a file if it has "goldcopy" or "donotdelete" in the name or "do_not_delete" in the path.

I created a SQL Agent job on all our servers that runs 2 powershell scripts that do the actual work; the scripts also return how many files were deleted. Both scripts take a param for the days to keep. SQL Agent Job...
USE [msdb]
GO

EXEC msdb.dbo.sp_add_operator @name=N'DBA', 
		@enabled=1, 
		@weekday_pager_start_time=90000, 
		@weekday_pager_end_time=180000, 
		@saturday_pager_start_time=90000, 
		@saturday_pager_end_time=180000, 
		@sunday_pager_start_time=90000, 
		@sunday_pager_end_time=180000, 
		@pager_days=0, 
		@email_address=N'dba@yourcomp.com', 
		@category_name=N'[Uncategorized]'
GO

BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'00 Old File Cleanup', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=2, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'The job will delete all *.bak & *.sqb files older than 3 days that do not have "goldcopy" or "donotdelete" in the name & do not have "do_not_delete" in the path.  The job will also delete all RG log files that are older than 7 days in the "C:\Documents and Settings\All Users\Application Data\Red Gate\SQL Backup\Log\" folder.', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'sa', 
		@notify_email_operator_name=N'DBA', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Backup Files]    Script Date: 02/25/2009 15:38:25 ******/
/* @on_success_action=3: go to the next step; @on_fail_action=2: quit the job reporting failure */
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Backup Files', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=3, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'CmdExec', 
		@command=N'powershell "& C:\WINDOWS\Script\del-oldbackups-single.ps1 3"', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [RG Logs]    Script Date: 02/25/2009 15:38:25 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'RG Logs', 
		@step_id=2, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'CmdExec', 
		@command=N'powershell "& C:\WINDOWS\Script\del-rglogs-single.ps1 7"', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/* @freq_type=8 with @freq_interval=64: weekly, on Saturday */
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'Sat-Weekly', 
		@enabled=1, 
		@freq_type=8, 
		@freq_interval=64, 
		@freq_subday_type=1, 
		@freq_subday_interval=0, 
		@freq_relative_interval=0, 
		@freq_recurrence_factor=1, 
		@active_start_date=20090225, 
		@active_end_date=99991231, 
		@active_start_time=30000, 
		@active_end_time=235959
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
	IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO
del-oldbackups-single.ps1...

# Days to keep, passed as the first argument
$factor = $Args[0]

# All local fixed disks
$disks = Get-WmiObject Win32_Volume -filter "DriveType = 3"
$cutoff = (Get-Date).AddDays(-$factor)

ForEach ($drv in $disks)
{
    # Win32_Volume.Name is the root, e.g. "E:\", so this builds "E:\sqlbu\"
    $path = $drv.Name + "sqlbu\"

    if (Test-Path $path)
    {
        # Count matching files first; the count is $null when nothing matches
        $count = (Get-ChildItem $path -Include *.sqb,*.bak -Exclude *goldcopy*,*donotdelete* -Recurse |
                    ?{$_.LastWriteTime -lt $cutoff -and !$_.PSIsContainer -and $_.PSPath -notlike "*do_not_delete*"}).Count
        if ($count -eq $null) {$count = 0}

        # Delete backup files older than the cutoff, honoring the name & path exclusions
        Get-ChildItem $path -Include *.sqb,*.bak -Exclude *goldcopy*,*donotdelete* -Recurse |
        ?{$_.LastWriteTime -lt $cutoff -and !$_.PSIsContainer -and $_.PSPath -notlike "*do_not_delete*"} |
        Remove-Item
        Write-Host "There were" $count "backup files deleted from" $path
    }
}
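If you want to sanity-check the retention rules outside of PowerShell, the filter logic above (delete only *.bak/*.sqb files older than the cutoff, skipping anything with "goldcopy" or "donotdelete" in the name or "do_not_delete" in the path) can be sketched like this in Python. The function name and sample paths here are purely illustrative, not part of the actual scripts:

```python
from datetime import datetime, timedelta

def should_delete(name, path, last_write, days_to_keep, now):
    """Mirror the PowerShell filter: only *.bak / *.sqb files older than
    the cutoff are deleted, unless a keep marker appears in name or path."""
    cutoff = now - timedelta(days=days_to_keep)
    if not (name.endswith(".bak") or name.endswith(".sqb")):
        return False                       # wrong extension, never touched
    if "goldcopy" in name or "donotdelete" in name:
        return False                       # keep marker in the file name
    if "do_not_delete" in path:
        return False                       # keep marker in the path
    return last_write < cutoff             # old enough to delete

now = datetime(2009, 3, 1)
old = datetime(2009, 2, 1)
print(should_delete("db.bak", "E:\\sqlbu\\db.bak", old, 3, now))                    # True: old plain backup
print(should_delete("db_goldcopy.bak", "E:\\sqlbu\\db_goldcopy.bak", old, 3, now))  # False: marker in name
print(should_delete("db.bak", "E:\\sqlbu\\do_not_delete\\db.bak", old, 3, now))     # False: marker in path
print(should_delete("db.bak", "E:\\sqlbu\\db.bak", now, 3, now))                    # False: not old enough
```

This keeps the delete decision in one place, which makes it easy to test against a list of file names before pointing anything destructive at a real drive.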
del-rglogs-single.ps1...

# Days to keep, passed as the first argument
$factor = $Args[0]

$cutoff = (Get-Date).AddDays(-$factor)

# Default Red Gate SQL Backup log location
$path = "C:\Documents and Settings\All Users\Application Data\Red Gate\SQL Backup\Log\"

if (Test-Path $path)
{
    # Count matching files first; the count is $null when nothing matches
    $count = (Get-ChildItem $path -Include *.log -Recurse |
                ?{$_.LastWriteTime -lt $cutoff -and !$_.PSIsContainer}).Count
    if ($count -eq $null) {$count = 0}

    # Delete log files older than the cutoff
    Get-ChildItem $path -Include *.log -Recurse |
    ?{$_.LastWriteTime -lt $cutoff -and !$_.PSIsContainer} |
    Remove-Item
    Write-Host "There were" $count "files deleted"
}
else {Write-Host "There are no files to be deleted"}

Monday, February 23, 2009

SQL Agent job fails with xXx subsystem failed to load

We had a SQL Agent job that runs CmdExec fail. Looking at the job history, the following error was present: "The CmdExec subsystem failed to load [see the SQLAGENT.OUT file for details]; The job has been suspended. The step failed." Looking in the SQL Agent log, the below errors were reported at startup:

[125] Subsystem 'SSIS' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'ANALYSISCOMMAND' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'ANALYSISQUERY' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'QueueReader' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'Merge' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'Distribution' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'LogReader' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'Snapshot' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'CmdExec' could not be loaded (reason: The specified module could not be found)
[125] Subsystem 'ActiveScripting' could not be loaded (reason: The specified module could not be found)

After a little research we found that the subsystems (a collection of DLLs) loaded by SQL Agent are stored in a table named dbo.syssubsystems in msdb. The entries in dbo.syssubsystems pointed to a location that did not exist. A search for one of the DLLs quickly revealed the actual location of the "missing" DLLs; in our case it was the same drive, but a different folder. Once we had the true location, we updated dbo.syssubsystems.subsystem_dll to reflect the correct location (see script below). After restarting SQL Agent the previously logged errors were not present & the job was able to complete successfully.
/* Allow updates to system tables */
sp_configure 'allow updates', 1
RECONFIGURE WITH OVERRIDE

/* Update to the correct path */
UPDATE msdb.dbo.syssubsystems
SET subsystem_dll = REPLACE(subsystem_dll,'C:\Program Files\Microsoft SQL Server\MSSQL.1','C:\Program Files\Microsoft SQL Server\MSSQL.2')
FROM msdb.dbo.syssubsystems
WHERE subsystem_dll LIKE 'C:\Program Files\Microsoft SQL Server\MSSQL.1%'

/* Disallow updates to system tables */
sp_configure 'allow updates', 0
RECONFIGURE WITH OVERRIDE

/* Verify update */
SELECT * FROM msdb.dbo.syssubsystems
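To see the rewrite logic in isolation: the WHERE clause guarantees that only rows still pointing at the stale instance directory get touched, and REPLACE swaps the prefix while preserving the rest of the DLL path. A quick Python sketch of that same prefix-swap logic (the DLL file name below is just an illustration, not necessarily one of the actual subsystem DLLs):

```python
OLD_PREFIX = "C:\\Program Files\\Microsoft SQL Server\\MSSQL.1"
NEW_PREFIX = "C:\\Program Files\\Microsoft SQL Server\\MSSQL.2"

def fix_subsystem_dll(dll_path):
    # Mirrors: UPDATE ... SET subsystem_dll = REPLACE(subsystem_dll, OLD, NEW)
    #          WHERE subsystem_dll LIKE 'OLD%'
    if dll_path.startswith(OLD_PREFIX):
        return NEW_PREFIX + dll_path[len(OLD_PREFIX):]
    return dll_path  # rows outside the stale prefix are left alone

print(fix_subsystem_dll("C:\\Program Files\\Microsoft SQL Server\\MSSQL.1\\MSSQL\\binn\\subsystem.dll"))
print(fix_subsystem_dll("D:\\SomeOtherPath\\subsystem.dll"))  # untouched
```

The guard matters: without the WHERE (or the startswith check), re-running the script is still harmless here, but scoping the update makes the intent explicit & keeps the verify SELECT easy to eyeball.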

Friday, February 13, 2009

July 2008/February 2009 SQL Security Patch – Known Issues and Resolutions

After going through the last few SQL patches, I thought it would be useful to compile a list of the top errors encountered and their fixes.
Error # in Summary.txt: 29506 Description: MSP Error: 29506 SQL Server Setup failed to modify security permissions on file xxx.xxx for user xxx. To proceed, verify that the account and domain running SQL Server Setup exist, that the account running SQL Server Setup has administrator privileges, and that exists on the destination drive.
Fix: See the following MS Article KB916766. Notes: check on the file level, look for oddly named or abnormal modified dates as a starting place. Security must be granted at the individual file level.

Error # in Summary.txt: 11032 Description: Unable to install Windows Installer MSP file
Fix: A folder named FTData is not in the expected path. A search of the patch install log for "ERROR: FTE: InstallFTERef: Fail to create FTERef file:" (no quotes) will reveal the expected path. Create the folder in this pattern, “F:\MSSQL.1\MSSQL\FTData”, changing the path to match your install & what shows in the patch install log. When the install completes, copy all files to the existing FTData folder (for example, “F:\Microsoft SQL Server\MSSQL.1\MSSQL\FTData”) and delete the path\folder you created.

Error #: N/A Description: User has lost custom permissions to SSIS after patching, i.e. “Connect to SSIS Service on machine failed: Access is denied”
Fix: The patch breaks the custom perms we have set up for SSIS. Recreate the perms according to the following post: Connecting to a Remote SSIS Server with Access Is Denied Error.

Error #: N/A Description: Patch installs & reports success, but one or more nodes in the cluster are un-patched (i.e. still on the previous build)
Fix: The problem occurs when the "Windows Installer" service is set to Manual & is not running. Start the service & re-run the patch on the un-patched node(s). You should be prompted to reboot; if not, reboot manually. After the reboot, check @@VERSION on the node(s) in question.

Connecting to a Remote SSIS Server with Access Is Denied Error

If you are getting the dreaded “Connect to SSIS Service on machine failed: Access is denied” error when connecting remotely, the following steps will fix it. For Windows 2003 Server (SP1):
  • If the user is running under a non-admin account, it needs to be added to the Distributed COM Users group
  • Run %windir%\system32\Com\comexp.msc to launch Component Services
  • Expand Component Services\Computers\My Computer\DCOM Config
  • Right click on MsDtsServer node and choose properties
  • In MsDtsServer Properties dialog go to Security page
  • Configure your settings as described below
  • Restart SSIS Service
In the Security page we are interested in the “Launch and Activation Permissions” section. Click the Edit button to see the “Launch Permissions” dialog, which allows you to configure SSIS server access per user/group. At the bottom of the dialog you can select:
  • Local & Remote Launch permissions, to allow a user/group to start the service locally and remotely (they will need both).
  • Local & Remote Activation permissions, to allow a user/group to connect to the SSIS server locally and remotely (they will need both).
  • Read permissions on Change Configuration.
Also, add those users to the “Distributed COM Users” group.
If the users continue to experience errors, there is a final set of permissions to check on the DCOM application, controlled by the Access Limits on the "COM Security" tab: both the Access Permissions and the Launch Permissions limits.

Friday, February 6, 2009

Microsoft Best Practices for SQL

Recently we had a Microsoft SQL resource onsite to troubleshoot issues we were having with adding a node to a cluster. While he was onsite we did an assessment of our cluster configurations & came up with several best practices for clustered nodes running SQL.
MSCS Specific
  • Heartbeat NIC: HEARTBEAT should be 1st in Network Priority (Cluster Admin > Properties > Network Priority).
  • Heartbeat NIC: The private Heartbeat address should not be registered in DNS (NIC Properties > Advanced TCP/IP Settings > DNS > "Register this connection's addresses in DNS" checkbox).
  • Heartbeat NIC: NetBIOS over TCP/IP should be disabled (NIC Properties > Advanced TCP/IP Settings > WINS > NetBIOS Setting > "Disable NetBIOS over TCP/IP" checkbox).
  • File/Print Sharing: Change maximize data throughput to Network Applications (NIC Properties > File & Printer Sharing for Microsoft Networks > Properties > "Maximize data throughput for network applications" radio button).
  • Processor Scheduling: Adjust for best performance of Programs (My Computer > Properties > Advanced > Performance > Settings > Advanced > Processor Scheduling > "Programs" radio button).
  • Memory Usage: Adjust for best performance of Programs (My Computer > Properties > Advanced > Performance > Settings > Advanced > Memory Usage > "Programs" radio button).