The topic for the golden jubilee edition is “Automation”.
What is T-SQL Tuesday?
T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by publishing a post that fits the topic of the month and follows the requirements (click the adjacent image to learn more). Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.
A few years ago, when I started off on a large project, the sheer number of instances we were supporting boggled my mind. What was even more interesting was that the installation of SQL Server was semi-automated (it involved a configuration file and a Perl script, and we had some interesting calls with the good folks at MS support because of this 😀 ). This whole setup really got me thinking about how and what we could automate. Until then, I had just been writing scripts and saving them in a folder to use later, but nothing at this scale. While I was still in awe at the setup and architecture of the solution, I started to realize a few things:
- You may not be able to automate some things.
- You need a broad understanding of the shop you are running.
- You need a deeper understanding of your own technical skills.
- There should be an opportunity. This is tied to the two points above: if you look closely enough, you will find opportunities to automate.
So, I started looking at Perl and got so engrossed in it that I began automating the summarization of perfmon logs using a couple of modules from CPAN. After spending a week of sleepless days and nights on it, I had a working version, but only two people on our team were interested enough to even use it. That was my second set of lessons:
- Draw the line when the effort you put in outweighs the benefits (for you or the team).
- Packaging and presentation of your solution make a world of difference.
Since I did not see any point in developing the module further, I left it as is. But I was not done with the business of summarizing CSV files; I started looking at what other tools were available for this kind of work, and that is when I found PowerShell. The very first version was a little uncomfortable to work with, but I found what I was looking for: “Import-Csv” 😀 .
I went through the first four steps, essentially a feasibility analysis, and then the last two steps of implementation and introduction. Life was right again.
I was chatting with a colleague about why it worked the second time around, and that too with a scripting language hardly anyone knew at that time. His response was that the second solution was more innovative. That was another important lesson I learnt about automation: innovation.
Innovation is all about doing “stuff” differently from the usual. It could be as simple as taking a new route to work or as complex as rethinking the way you commute. These may be bad examples, but the gist is that not all innovations need to be discoveries.
I mostly work on SAP on SQL systems. Recently, we had a requirement at one of our clients to track database growth on a daily basis. The initial thought was to use one of the canned scripts that I had written a while back. Something as elementary as the script below would’ve worked just fine.
DECLARE @dbDataSize INT, @dbDataUsed INT, @dbLogSize INT, @dbLogUsed INT;

SELECT @dbDataSize = SUM(size)
FROM sys.database_files
WHERE [type] = 0;

SELECT @dbDataUsed = SUM(FILEPROPERTY(name, 'SpaceUsed'))
FROM sys.database_files
WHERE [type] = 0;

SELECT @dbLogSize = SUM(size)
FROM sys.database_files
WHERE [type] = 1;

SELECT @dbLogUsed = SUM(FILEPROPERTY(name, 'SpaceUsed'))
FROM sys.database_files
WHERE [type] = 1;

SELECT DBName      = DB_NAME(),
       DataSize_MB = @dbDataSize / 128,
       DataFree_MB = (@dbDataSize - @dbDataUsed) / 128,
       DataFreePct = CAST((CAST((@dbDataSize - @dbDataUsed) AS DECIMAL(15,2))
                     / CAST(@dbDataSize AS DECIMAL(15,2))) * 100 AS DECIMAL(4,2)),
       LogSize_MB  = @dbLogSize / 128,
       LogFree_MB  = (@dbLogSize - @dbLogUsed) / 128,
       LogFreePct  = CAST((CAST((@dbLogSize - @dbLogUsed) AS DECIMAL(15,2))
                     / CAST(@dbLogSize AS DECIMAL(15,2))) * 100 AS DECIMAL(4,2));
But if you have worked with SAP systems long enough, you know the BASIS admins get a lot of cool reports in DB02, one of which, incidentally, is a DB growth report. So, I started looking through the stored procedures in the database and found this beauty called ‘sap_dbszhist’. This procedure takes four parameters: ‘db_name’, ‘obj_schema’, ‘from_dt’ and ‘to_dt’. But we do not need to pass any of them. The default behavior is:
- If db_name is null, the current database context is assumed.
- If obj_schema is null, the schema details are derived from sys.objects using @@PROCID.
- If from_dt is null, it defaults to the date three years before today, and to_dt defaults to today’s date.
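Given those defaults, invoking the procedure is trivial. The sketch below is how I would call it; sap_dbszhist ships with SAP installations rather than with SQL Server itself, so verify the signature on your own system before relying on it:

```sql
-- Minimal sketch: pass all four parameters as NULL so the defaults
-- described above kick in (current database, schema via @@PROCID,
-- and a date window covering the last three years up to today).
EXEC sap_dbszhist NULL, NULL, NULL, NULL;

-- Since every parameter has a default, the bare call works too:
EXEC sap_dbszhist;
```

The same call can be wrapped in a scheduled job on each instance, which is what makes it a natural feed for the central repository mentioned below.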
So, without a lot of effort, I now have a historical record of DB growth by day. All we need now is a process to push this into a central repository for some SSRS fun.
What I love about this is that the approach combined all the lessons learned over a period of time and gave me a chance to think outside the norm. That is the final lesson I learnt:
Automation, it’s all in the mind.