
SQL Server 2008 – Intellisense – Update Local Cache

Since we recently upgraded to SQL Server 2008, I have some small tidbits to share. First up: IntelliSense updating.

SQL Server 2008 has built-in IntelliSense, which is pretty awesome. Until you add new objects, that is, and then everything is red underlined. What to do?

Well, if you are writing some T-SQL in SQL Server Management Studio (SSMS) and run into this issue with newly created objects, just press

CTRL+SHIFT+R

and you are all set: the local IntelliSense cache is rebuilt, and your new tables, procs, and other objects show up in IntelliSense. (The same command lives under Edit > IntelliSense > Refresh Local Cache.) Cool!

Dev and Prod Systems: Using a HOSTS file to ensure your data source is pointing at the right system

In many shops, I am guessing, there are multiple servers: Development, Production, maybe Staging, and so on.

With SQL Server Analysis Services (SSAS) and SQL Server Integration Services (SSIS), you set up data sources, that is, connections to databases. In SSAS you usually have a connection that you build your data source view off of, and in SSIS you have connections you push data to and pull data from.

Another thing: in SSAS you can “deploy” right from Visual Studio (BIDS). All of these things have a server name, and what we have run into is this:

You develop on your local machine, pointing at the development server. You deploy to development, your connections point to development, and everything works great. Then you deploy to production (usually planned, every 2 weeks, or whatever), and what ends up happening?

In SSIS, your config files should hold the connection string (or however you store it), and it should point to production. But in SSAS, if you deploy from BIDS, your data source has to change, and in the cube project properties you need to change the deployment server as well.

I have seen it countless times: a cube, or an SSIS connection without a config, running in production yet pointing at development. We keep our dev data fresh, or very close to it, so sometimes we don't even notice. But then something weird gets reported, we dig into it, and we find the erroneous connection string.

Here is my solution to the problem:

Developers: go to C:\Windows\System32\Drivers\Etc and open your hosts file with Notepad or another text editor. Then add a couple of entries:

#production
#xxx.xxx.xxx.xxx datawarehouse

#development
yyy.yyy.yyy.yyy datawarehouse

where xxx.xxx.xxx.xxx is the IP of your production system and yyy.yyy.yyy.yyy is the IP of your dev system; the # is the comment symbol. You can see above that I have everything commented out except the line for the dev system, but notice that both entries point to “datawarehouse”. So if I ping or connect to “datawarehouse” from Management Studio, or anywhere else, it resolves to the one IP that is not commented out.

Now go on to each server and add only the line that corresponds to that server in its hosts file, or better yet just:

127.0.0.1 datawarehouse

Now, when you deploy to either server, as long as your connections and deployment targets are set to “datawarehouse”, you ensure they always connect to the local server. Brilliant!
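To make it concrete, every connection string (SSIS, SSAS, wherever) just uses the alias as the server name. A minimal sketch, where the catalog name DW and the provider are only examples:

Data Source=datawarehouse;Initial Catalog=DW;Provider=SQLNCLI10.1;Integrated Security=SSPI;

The connection string never changes between environments; only the hosts file on each machine decides where “datawarehouse” actually points.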

SQL Server Master Data Services (Nov CTP)

Last night I set up SQL Server Master Data Services (MDS) on a test box, and it looks good so far. I ran into a few issues with the box/setup that I had to tweak to get it working: I had to allow “handlers” and “modules” overrides in the applicationHost config on the machine, and IIS was also inadvertently set up with only anonymous access, which was a problem. After I got Windows authentication installed and turned on, everything seemed to work.

The app/system itself is pretty slick. Very basic, but it lets you do complex things. Once you get some users set up and a few models created (think: Product, Customer), you can add entities (think: Category1, Category2, etc.), and set up hierarchies, business rules, and more.

I haven't played with it much more, but it seems like it could get the job done. I would say some things aren't intuitive enough; for example, it could say “drag this over to this area”, but there is nothing telling you what to do, so it's kind of guess and check.

I’m excited to see where MDS goes.


SQL Server Schema Automatic Revision History using DDL Triggers and SVN

SQL Server 2005 introduced DDL triggers, which let you catch events on your DDL statements. Hopefully most DBAs are capturing them, at least to a table, with some kind of report on who is adding, changing, and deleting which objects.

What I wanted to do was capture that, but also keep an automatic running log in SVN (Subversion) source control. Here is how I went about it.

First, you need SVN set up somewhere. We use unfuddle (http://unfuddle.com/), which we also use for Agile BI stuff, but that is another post :) Unfuddle lets us do SVN in the cloud, but you could host it locally or internally or wherever you'd like. I had to create a user for this to do the automatic commits, and give it rights to commit to the repo.

Second, you probably already have DDL triggers set up to capture events on your databases and write them to a table; writing to a central database on one server is probably a good idea. Once you have that, or something similar, you create a trigger on that log table to capture the INSERT, parse what you need from the record that was just inserted, and automatically commit to SVN.
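If you don't have that piece yet, here is a minimal sketch of what I mean, assuming a log table named dbo.DDLEventLog with an XML column named EventInstance (the names the trigger later in this post expects):

CREATE TABLE dbo.DDLEventLog
(
	LogId INT IDENTITY(1,1) PRIMARY KEY,
	LoggedAt DATETIME NOT NULL DEFAULT GETDATE(),
	EventInstance XML NOT NULL
)
GO

-- database-level DDL trigger that logs every DDL event as raw XML
CREATE TRIGGER CaptureDDLEvents ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
	SET NOCOUNT ON;
	INSERT INTO dbo.DDLEventLog (EventInstance)
	VALUES (EVENTDATA());
END
GO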

But first, you want to script out all your objects for a database to a folder, say C:\DBSchemas

I created a folder structure to help me

DBSchemas\ServerName\DatabaseName\ObjectType

and then scripted out all the objects to each folder for ObjectType (Tables, StoredProcedures, Functions, and Views)

Once that was done, I did an initial import into SVN and made the folder I was working with a working copy. Then the fun starts.
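Something like this from a command prompt handles the import and checkout; a hedged sketch, where the unfuddle repo URL is just a stand-in for yours (--force lets the checkout lay a working copy down over the files that are already there):

svn import C:\DBSchemas http://youraccount.unfuddle.com/svn/dbschemas --message "initial schema import" --username dbschema --password yourpassword
svn checkout --force http://youraccount.unfuddle.com/svn/dbschemas C:\DBSchemas --username dbschema --password yourpassword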

I created a couple of procs (which I found online, links to the blogs are below) to CREATE and DELETE files from T-SQL using OLE automation

CREATE/APPEND (http://sqlsolace.blogspot.com/2009/01/ole-automation-write-text-file-from.html)

CREATE PROCEDURE [dbo].[usp_OLEWriteFile] (@FileName VARCHAR(1000), @TextData NVARCHAR(MAX), @FileAction VARCHAR(12)) AS

BEGIN
DECLARE @OLEFileSystemObject INT
DECLARE @OLEResult INT
DECLARE @FileID INT

EXECUTE @OLEResult = 
 sp_OACreate 'Scripting.FileSystemObject', @OLEFileSystemObject OUT
IF @OLEResult <> 0 
  PRINT 'Error: Scripting.FileSystemObject'

-- check if the file exists
EXEC sp_OAMethod @OLEFileSystemObject, 'FileExists', @OLEResult OUT, @FileName 
-- if the file exists and we are appending, or it doesn't exist yet
IF (@OLEResult = 1 AND @FileAction = 'APPEND') OR (@OLEResult = 0) 
BEGIN   

IF (@FileAction = 'CREATENEW')
 PRINT 'New file specified, creating...'
IF (@OLEResult = 1 AND @FileAction = 'APPEND') 
 PRINT 'File exists, appending...'
IF (@OLEResult = 0 AND @FileAction = 'APPEND') 
 PRINT 'File doesn''t exist, creating...' 

 -- open the file (8 = ForAppending; the trailing 1 = create the file if missing)
 EXECUTE @OLEResult = sp_OAMethod @OLEFileSystemObject, 'OpenTextFile', @FileID OUT,
 @FileName, 8, 1
 IF @OLEResult <> 0 PRINT 'Error: OpenTextFile'

 -- write the text to the file
 EXECUTE @OLEResult = sp_OAMethod @FileID, 'WriteLine', NULL, @TextData
 IF @OLEResult <> 0 
  PRINT 'Error : WriteLine'
 ELSE
  PRINT 'Success' 
END
IF (@OLEResult = 1 AND @FileAction = 'CREATENEW')
 PRINT 'File exists, specify APPEND if this is the desired action'

EXECUTE @OLEResult = sp_OADestroy @FileID
EXECUTE @OLEResult = sp_OADestroy @OLEFileSystemObject

END
GO
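
A quick smoke test of the proc before wiring up the trigger (the path here is just an example):

EXEC dbo.usp_OLEWriteFile 'C:\DBSchemas\test.sql', N'SELECT 1', 'CREATENEW'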

DELETE (http://www.kodyaz.com/articles/delete-file-from-sql-server-xp-cmdshell-ole-automation-procedures.aspx), which I wrapped in a proc so the trigger can call it:

CREATE PROCEDURE [dbo].[usp_OLEDeleteFile] (@FileName VARCHAR(1000)) AS
BEGIN
DECLARE @Result INT
DECLARE @FSO_Token INT

EXEC @Result = sp_OACreate 'Scripting.FileSystemObject', @FSO_Token OUTPUT
EXEC @Result = sp_OAMethod @FSO_Token, 'DeleteFile', NULL, @FileName
EXEC @Result = sp_OADestroy @FSO_Token
END
GO

You need to make sure OLE Automation is on (and xp_cmdshell too, since the trigger shells out to SVN with it). You also need to make sure that the account you are running SQL Server as has modify rights to your DBSchemas folder.
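Turning both on with sp_configure looks like this; they are off by default for good reason, so weigh the security tradeoff:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ole Automation Procedures', 1;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;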

But the crux of the solution is the trigger that grabs the DDL info, writes or deletes the files, and does the SVN add/delete/commit. This is some ugly one-hour SQL script craziness, tons of IF statements, etc. It could be improved, but it works, it is a start, and it can be modified and tweaked to do whatever you want. Note: if your SVN repo isn't authenticated, you don't need the username/password on the SVN commands.

You can see it gets the DDL, checks the event (I have it limited to one database) and what type of object and operation it is. For an add, it adds the file and commits; for an alter, it deletes the file, recreates it, and commits; and for a drop, it does an svn delete and commits. Pretty easy :)




CREATE TRIGGER DDLRevisionHistory
	ON dbo.DDLEventLog
	AFTER INSERT
AS

BEGIN
SET NOCOUNT ON;

DECLARE @EventType VARCHAR(50)
DECLARE @DatabaseName VARCHAR(50)
DECLARE @ServerName VARCHAR(50)
DECLARE @ObjectName VARCHAR(100)
DECLARE @SchemaName VARCHAR(50)
DECLARE @CommandText VARCHAR(MAX)
DECLARE @LoginName VARCHAR(50)

SELECT 
	@EventType = EventInstance.value('(//EventType)[1]', 'varchar(50)'),
	@DatabaseName = EventInstance.value('(//DatabaseName)[1]', 'varchar(50)'),
	@ServerName = EventInstance.value('(//ServerName)[1]', 'varchar(50)'),
	@ObjectName = EventInstance.value('(//ObjectName)[1]', 'varchar(100)'),
	@SchemaName = EventInstance.value('(//SchemaName)[1]', 'varchar(50)'),
	@CommandText = EventInstance.value('(//TSQLCommand//CommandText)[1]', 'varchar(max)'),
	@LoginName = EventInstance.value('(//LoginName)[1]', 'varchar(50)')
	FROM inserted

DECLARE @filepath VARCHAR(8000)
	
SET @filepath = 'C:\DBSchemas\' + @ServerName + '\' + @DatabaseName + '\' 

	IF (
		@EventType = 'CREATE_VIEW' OR @EventType = 'ALTER_VIEW' OR @EventType = 'DROP_VIEW'
		OR @EventType = 'CREATE_TABLE' OR @EventType = 'ALTER_TABLE' OR @EventType = 'DROP_TABLE'
		OR @EventType = 'CREATE_PROCEDURE' OR @EventType = 'ALTER_PROCEDURE' OR @EventType = 'DROP_PROCEDURE'
		OR @EventType = 'CREATE_FUNCTION' OR @EventType = 'ALTER_FUNCTION' OR @EventType = 'DROP_FUNCTION'
		) 

		AND @DatabaseName = 'YourDatabase' BEGIN


		-- write out new file to correct folder
		IF CHARINDEX('VIEW',@EventType) > 0 BEGIN 
			SET @filepath = @filepath + 'Views' + @SchemaName + '.' + @ObjectName + '.View.sql'
		END
		
		IF CHARINDEX('TABLE',@EventType) > 0 BEGIN 
			SET @filepath = @filepath + 'Tables' + @SchemaName + '.' + @ObjectName + '.Table.sql'
		END

		IF CHARINDEX('PROCEDURE',@EventType) > 0 BEGIN 
			SET @filepath = @filepath + 'StoredProcedures' + @SchemaName + '.' + @ObjectName + '.StoredProcedure.sql'
		END

		IF CHARINDEX('FUNCTION',@EventType) > 0 BEGIN 
			SET @filepath = @filepath + 'Views' + @SchemaName + '.' + @ObjectName + '.UserDefinedFunction.sql'
		END

		IF CHARINDEX('CREATE',@EventType) > 0 BEGIN
			
			-- create file
			EXEC dbo.usp_OLEWriteFile @filepath,@CommandText,'CREATENEW'
			
			-- svn add
			DECLARE @instrAdd VARCHAR(4000)
			SET @instrAdd='svn add ' + @filepath + ' --username dbschema --password yourpassword'
			EXEC xp_cmdshell @instrAdd

			-- svn commit
			DECLARE @instrCommitAdd VARCHAR(4000)
			SET @instrCommitAdd='svn commit ' + @filepath + ' --message "added by '+ @LoginName +'" --username dbschema --password yourpassword'
			EXEC xp_cmdshell @instrCommitAdd

		END
		
		IF CHARINDEX('ALTER',@EventType) > 0 BEGIN
			
			--delete and readd file
			EXEC dbo.usp_OLEDeleteFile @filepath
			EXEC dbo.usp_OLEWriteFile @filepath,@CommandText,'CREATENEW'
			
			-- svn commit
			DECLARE @instrCommitChange VARCHAR(4000)
			SET @instrCommitChange='svn commit ' + @filepath + ' --message "changed by '+ @LoginName + '" --username dbschema --password yourpassword'
			--PRINT @instrCommitChange
			EXEC xp_cmdshell @instrCommitChange
		END

		IF CHARINDEX('DROP',@EventType) > 0 BEGIN
			-- svn delete
			DECLARE @instrDel VARCHAR(4000)
			SET @instrDel='svn delete ' + @filepath + ' --username dbschema --password yourpassword'
			EXEC xp_cmdshell @instrDel

			-- svn commit
			DECLARE @instrCommitDel VARCHAR(4000)
			SET @instrCommitDel='svn commit ' + @filepath + ' --message "deleted by '+ @LoginName +'" --username dbschema --password yourpassword'
			EXEC xp_cmdshell @instrCommitDel
		END

	END

END

As you can see, you can create a homegrown revision history of your DDL objects in SQL Server. I have tested this on the basic operations (no renames, etc.) using the GUI, but if you do use it, you might want to wrap it all in exception handling just to be on the safe side.
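The pattern would be something like this around the file and SVN work; a minimal sketch wrapping one of the svn commits, so a failure just prints instead of rolling back the DDL statement that fired the trigger:

DECLARE @cmd VARCHAR(4000)
SET @cmd = 'svn commit C:\DBSchemas --message "auto commit" --username dbschema --password yourpassword'
BEGIN TRY
	EXEC xp_cmdshell @cmd
END TRY
BEGIN CATCH
	-- swallow the error so the original DDL statement isn't rolled back
	PRINT 'DDLRevisionHistory error: ' + ERROR_MESSAGE()
END CATCH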

Happy DBA’ing :)

SQL Server Agent – Query To Find Long Running Jobs

I use SQL Agent a ton; it is my go-to place to schedule jobs and tasks. Problem is, there are pieces “missing”: if a job hangs, you can't have it auto-stop after a given amount of time; it doesn't alert on changes to a job; etc., etc.

I asked on Twitter what the DBAs think. @SQLRockstar thinks OpsMgr is the way to go, which probably would be great, if I had it. I needed a quick and dirty solution.

What I did was query the sysjobs, sysjobactivity, and sysjobhistory tables to get some kind of “look” into the running jobs. That way, if a job that normally takes 45 minutes has been running for 5 hours, I should be able to catch it early on; just trying to be more proactive instead of reactive.

SELECT 
	j.job_id AS 'JobId',
	name AS 'JobName',
	start_execution_date AS 'StartTime',
	stop_execution_date AS 'StopTime',
	avgruntimeonsucceed,
	DATEDIFF(s,start_execution_date,GETDATE()) AS 'CurrentRunTime',
	CASE WHEN stop_execution_date IS NOT NULL THEN 
			DATEDIFF(ss,start_execution_date,stop_execution_date) ELSE 0 END 'ActualRunTime',
	CASE 
		WHEN stop_execution_date IS NULL THEN 'JobRunning'
		WHEN DATEDIFF(ss,start_execution_date,stop_execution_date) 
			> (AvgRunTimeOnSucceed + AvgRunTimeOnSucceed * .05) THEN 'LongRunning-History'
		ELSE 'NormalRunning-History'
	END 'JobRun',
	CASE 
		WHEN stop_execution_date IS NULL THEN
			CASE WHEN DATEDIFF(ss,start_execution_date,GETDATE())
						> (AvgRunTimeOnSucceed + AvgRunTimeOnSucceed * .05) THEN 'LongRunning-NOW'
			ELSE 'NormalRunning-NOW'
		END
		ELSE 'JobAlreadyDone'
	END AS 'JobRunning'
 FROM msdb.dbo.sysjobactivity ja
	INNER JOIN msdb.dbo.sysjobs j ON ja.job_id = j.job_id
INNER JOIN (
	SELECT job_id,
	AVG
	((run_duration/10000 * 3600) + ((run_duration%10000)/100*60) + (run_duration%10000)%100)
	+
	STDEV
	((run_duration/10000 * 3600) + ((run_duration%10000)/100*60) + (run_duration%10000)%100) AS 'AvgRuntimeOnSucceed'
	 FROM msdb.dbo.sysjobhistory
	WHERE step_id = 0 AND run_status = 1
	GROUP BY job_id) art 
	ON j.job_id = art.job_id
WHERE 
(stop_execution_date IS NULL) OR 
	(DATEDIFF(ss,start_execution_date,stop_execution_date) > 60
	AND 
	CAST(LEFT(start_execution_date,11) AS DATETIME) = CAST(LEFT(GETDATE(),11) AS DATETIME))
ORDER BY start_execution_date DESC

The inner query looks at the history of successful runs to get the average runtime (run_duration is stored as an HHMMSS integer, hence the conversion math), and I added the standard deviation to that to make it more realistic, in case you have a one-off that throws the average off. Of course, this all depends on you having more than 3 entries in your job history as well.
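To see that conversion math at work, a quick worked example:

-- run_duration is an integer in HHMMSS form, so 13045 means 1:30:45
DECLARE @run_duration INT
SET @run_duration = 13045
SELECT (@run_duration/10000 * 3600)           -- 1 hour  = 3600 seconds
	+ ((@run_duration%10000)/100 * 60)        -- 30 mins = 1800 seconds
	+ (@run_duration%10000)%100 AS 'Seconds'  -- 45 secs, 5445 total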

I then just compare the average to the current runtime (if the job is executing) and to the actual (if it is complete).

You could use this query in a report you check throughout the day to catch long-running jobs, or you could filter it to just the ones that are currently running and over the time threshold and have it alert you, so you can catch them right away.
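For the alert flavor, one hedged sketch: schedule something like this in its own Agent job, assuming Database Mail is set up with a profile named 'DBA', and assuming you have wrapped the query above in a hypothetical proc, dbo.usp_LongRunningJobs, that returns only the 'LongRunning-NOW' rows:

EXEC msdb.dbo.sp_send_dbmail
	@profile_name = 'DBA',
	@recipients = 'dba@yourcompany.com',
	@subject = 'Long running SQL Agent jobs',
	@query = 'EXEC dbo.usp_LongRunningJobs'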

I tested this on a few servers running SQL 2005, but I am guessing it will work on SQL 2008 as well. Find and kill those long running jobs!!

SSAS: Errors in the metadata manager. The dimension with ID of…

Crazy error today. In Analysis Services (SSAS), some jobs got hung, a sync command got hung, and what ended up happening is that a cube on the default instance was totally unusable.

I would try to delete, rename, resync, redeploy, anything, and I would get the error:

Errors in the metadata manager. The dimension with ID of..

Couldn't do anything. The solution? I had to stop the SSAS service, go to the OLAP data directory under my SQL Server install, and remove the folder for that cube. I then started the SSAS service, and the SSAS database was gone (finally!). I redeployed and reprocessed, and everything is working again. Whew :)
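For reference, the steps boil down to something like this from a command prompt; a hedged sketch, since the service name, the data path, and the database folder name (usually DatabaseName.N.db, where N is a version number) all depend on your instance:

net stop MSSQLServerOLAPService
rmdir /s /q "C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Data\YourDatabase.0.db"
net start MSSQLServerOLAPService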