Tag Archives: SQL Server

Meme Monday – Backup Fun

Thomas LaRock (blog|twitter) has started Meme Monday and challenged folks to write a blog post in 11 words or less.

David Howard (blog|twitter) has tagged me, so here goes.

“SQL Backup you’re killing me, please stop failing, just work please”

Bottom line: I have been working with the team on getting our backups into great shape. Not sure there is such a thing. Backups are one of the first things every DBA should be doing well, yet they seem to be the hardest to get right. I've been working on this quite a bit lately, so it seemed appropriate for this fun blogging tag game.

Tagging:
Dave Levy has been quiet so I’m tagging him.

Enjoy!

PowerShell Baby Steps

T-SQL Tuesday

This is my first time participating in T-SQL Tuesday, so I'm not even sure I'm doing this right. I was going to write this post at some point but kept putting it off. However, reading all the great posts today, combined with my passion for automation, has driven me to share.

Old Backups
The way we have our SQL backups set up, we have a few servers with numerous shares on them, and each server's backups are mapped to a specific share on the backup server(s). Our backup jobs are set to remove backups over x hours old when a new one is created, so we should never have BAK files older than x hours on the backup servers.

In a perfect world we would never have old files, but the world is not perfect. The problem starts when servers are decommissioned or files are put in folders for a restore and forgotten. Over time we end up with hundreds of GBs of old BAK files on our backup servers. Eventually they get noticed and cleaned up when we get low disk space alerts or someone stumbles upon an old file. Not a huge problem, but there is no reason to have these old files out there, so why waste the space?

So one day I was frustrated by the amount of old files out there while arguing with someone over needing more disk space for another project. I was having an angry day and started looking through the folders manually, one at a time. I decided that was a colossal waste of time, so I ran some Windows searches for *.BAK files older than 60 days, and so on. After asking Dave Levy (blog | twitter) how to copy my Windows Explorer search results into Excel to share with the team, he challenged me to just write a PowerShell script to dump the results to a CSV file.

I don't get to code much these days in my role, so I took the challenge. Well, actually, Dave sent me a script, I made 2 small changes to it, and ran it. Magic! In seconds I had a file I could send out to the team so they could go clean up the files. I'm sure you POSH experts are saying, "Duh, that's a no-brainer." Keep in mind, I turned in my coding badge years ago and struggle at times to grasp new coding concepts.

My Script

# Find BAK files on the backup server last written on or before 1/1/2011 and dump the list to CSV
Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak |
    Where-Object { $_.LastWriteTime -le '1/1/2011' } |
    Export-Csv OldBackupReport.csv

Within 1 day we regained 1 TB on our backup server drive, with no need to buy more disk!

Can you guess what day I sent out the report?

Look at all that disk we saved!

Conclusion
Calling this automation is a bit of a stretch, as I'm pretty much dumping info to a file so people can go manually delete the files. I'm sure I could have the script delete them, but I wanted to make sure we were not removing files that might be in process for a restore. I also considered scheduling this to run monthly and email the team, which is something I might do in the future and post about at a later date.
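
For the curious, here is a rough, untested sketch of what that scheduled monthly version might look like. The cutoff, report file name, email addresses, and SMTP host are all placeholders, not our actual setup:

# Hypothetical monthly report: list old BAK files and mail the CSV to the team.
# All names and addresses below are placeholders.
$cutoff = (Get-Date).AddDays(-60)
$report = 'OldBackupReport.csv'

Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak |
    Where-Object { $_.LastWriteTime -le $cutoff } |
    Select-Object FullName, Length, LastWriteTime |
    Export-Csv -Path $report -NoTypeInformation

# Send-MailMessage ships with Windows PowerShell; swap in real SMTP details.
Send-MailMessage -To 'team@example.com' -From 'sql@example.com' `
    -Subject 'Old backup files report' -Attachments $report `
    -SmtpServer 'smtp.example.com'

Schedule something like that with SQL Agent or Task Scheduler for the first of the month, and the report would take care of itself.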

SQLShare Progress

In December I shared a post about how I was going to use SQLShare to do some continuous learning. It's so cool to get an email every morning showing how well I'm tracking toward my goal. As you can see below, I have been tracking pretty well in January. I did adjust my goal down a bit to 60 minutes per month, and as of today I'm at 35 minutes out of 60. Another neat thing is that I'm starting to see some videos done by folks I follow on Twitter; it's nice to have a voice to go along with the profile pics and tweets.

Check it out; it can't hurt to give it a try.

SQLShare Progress

More than halfway to my goal!

Learn Something Every Day

Something I'm always trying to drive others toward is continued learning. I'm all about career development and taking control of my future via continued learning; to a fault, it's sort of a passion of mine. There are so many ways to do this it's mind-boggling: reading books, blogs, online training, classroom training, #sqlhelp on Twitter, etc. It makes me wonder why some folks don't do anything at all.

I was in a meeting the other day and was told by someone that they simply don't have time to learn anything new or to get better at what they do today; they are simply too busy. This totally shocked me, and I really tried to reach out and help guide this person to look inward and focus on themselves. One of my points was: how will you ever get more time unless you optimize how you are working today? Sure, the "we need more help" argument is always there, but that is not in your control most of the time. So how can you get more time? Well, how about learning how to do something better?

And that’s what I’m going to share today.

Zero! Better get learning

One learning opportunity that I have been tracking lately is SQLShare. I get an email every day with a short 1-5 minute training video. If the content interests me, I watch it and learn something new. If not, I just delete the email and move on. What is great is that it keeps track for me, so every day I get a reminder of how much time I have spent. It's totaled by month, and there is a nice track record on the site for me to review what I have watched. My kids have reading logs at home they need to complete each month for school, and they are crazy about beating their total hours from one month to the next. Well, this is my log. It drives me nuts when I start to lag behind; when I watch less one month than the last, it irritates me, and I'll hammer off 3 or 4 videos in a row to get that number back up. This type of daily learning via short videos really appeals to me and my way of learning.

I'm not big on reading technical manuals or books. I like blogs, but if a blog post gets too long and detailed I quickly move on to something else. When reading BOL like Jen McCown is planning to do, as described in her blog, my eyes start to go blurry after about 2 minutes. That's just not my style. Everyone has their own way of learning. I am really looking forward to following along as Jen reads BOL and blogs about it, as I hope to let her do all the hard work and get what I need from her posts.

In summary, I have found something here that works for me and wanted to share. So if you are interested, check it out and let me know if it works for you.

Thanks Andy Warren, Brian Knight, and Steve Jones for yet another great learning opportunity.
About SQLShare

Head in the Clouds

My last post was quite some time ago. I explained that I'd be away for a while, as I had committed myself to what ended up being almost another full-time job. It was a great experience and is now winding to a close. The last game is this Saturday. I have one more film night and practice, and then the game. We will have some post-season parties and such, but I should have some more time on my hands.

My Other Team

Me mentoring my other team!

So what's next... Well, I just returned from a 3-day trip to Redmond to visit the Microsoft campus and learn about the technical roadmaps of some very interesting technologies. Being in a role where SQL is my blood, I was excited about what is coming. The thing that really stuck with me, however, is this idea of the cloud. I have been hearing about it but not quite grasping what it really means. I have seen demos of SQL Azure at SQL Saturdays and such, but I couldn't correlate it to my day-to-day activities. Seeing it explained by the people who are paid to explain it shed some light and really made the idea sink in.

Essentially, the cloud ends up being different for everyone. There are probably pieces of it in your infrastructure today, and you may not even realize it. I drew some parallels to the talent acquisition and performance management solution we use at my company. We are using it as a cloud app; essentially, it is SaaS, or Software as a Service. We don't have servers running the app here in our data center, and I'm not supporting this tool's SQL database. The app is simply a website, and it's all hosted off site. This is essentially an application running in the cloud that is critical to the management of our talent.

Additionally, we have been doing more and more with virtual servers via various technologies. We often look to go virtual first on new installations, as the flexibility and built-in support and recoverability are such a win. In essence, this becomes a mini private cloud that our internal IT staff supports.

So those are two small, real examples of what the cloud is to me at this time. I hope to keep researching and learning more about how others are using it. Additionally, I have a personal goal to find a real project here that I can implement on the cloud.

It’s starting to make sense and I see the road ahead and I’m excited to jump on the puffy white clouds!

SQL 2008 Sparse Columns

Sparse Columns
I had my first exposure to one of the new SQL 2008 features that I frankly didn’t even know existed. The feature is the ability to turn on an option to make a column sparse. I won’t go into the details of why to use this as others have covered it pretty well here and here.

Job Failed
I was looking into some job failures this morning, and one in particular stuck out at me. Essentially, we have a new job set up that checks a few databases on a SQL 2008 server, looking for tables and indexes that were created without page compression. The job then attempts to alter these tables or indexes with the page compression option turned on. The reasoning behind when to use compression is a topic for another day and varies based on many factors. In this case we have determined that any new objects in these databases should have page compression turned on.
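
To give a sense of what that alter looks like (the job's exact statements aren't shown here, and these object names are made up), enabling page compression is just a rebuild with the compression option set:

-- Hypothetical examples; dbo.YourTable and IX_YourIndex are placeholder names.
-- Rebuild a table with page compression:
ALTER TABLE dbo.YourTable REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Rebuild a single index with page compression:
ALTER INDEX IX_YourIndex ON dbo.YourTable REBUILD WITH (DATA_COMPRESSION = PAGE);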

What Happened?
The job is set up to run once a week on Sundays, and this morning the server was red in my Quest Spotlight console, so I started to dig into why. Here is the error from the job, with some names changed to protect the innocent.

Executed as user: XXXXXXX. Cannot alter table 'yourtablenamehere' because the table either contains sparse columns or a column set column which are incompatible with compression. [SQLSTATE 42000] (Error 11418). The step failed.

First off, what a great, informative error message; kudos, Microsoft. I knew immediately why it failed but was not sure what sparse columns were. So I used my trusty search engine and found some info on sparse columns in SQL 2008. Within 2 minutes I knew what the issue was and that I needed to exclude tables with sparse columns from this process, essentially leaving those tables without compression. Here is the query in the job that looks for uncompressed objects.

SELECT DISTINCT
    OBJECT_NAME(A.object_id) AS [ObjectName]
FROM sys.partitions A WITH (NOLOCK)
INNER JOIN sys.objects B
    ON A.object_id = B.object_id
WHERE B.type = 'U'
    AND A.data_compression = 0
ORDER BY 1;

To quickly fix the problem, I added the following line to the query's WHERE clause, which filters out any objects with sparse columns.

AND A.object_id NOT IN (SELECT DISTINCT object_id FROM sys.columns WHERE is_sparse = 1)

Green Again
There is probably a much more elegant way of doing this, and I'll continue to look for it, but for now the job runs, skips the one table with the sparse columns in it, and does what I want. This server is no longer red on my Spotlight console, and all is good, for now anyway.
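
One candidate for that more elegant version, sketched here untested, is to fold the sparse-column check into the main query with NOT EXISTS instead of NOT IN:

-- Untested sketch; same intent as the NOT IN filter above.
SELECT DISTINCT
    OBJECT_NAME(A.object_id) AS [ObjectName]
FROM sys.partitions A WITH (NOLOCK)
INNER JOIN sys.objects B
    ON A.object_id = B.object_id
WHERE B.type = 'U'
    AND A.data_compression = 0
    AND NOT EXISTS (SELECT 1
                    FROM sys.columns C
                    WHERE C.object_id = A.object_id
                      AND C.is_sparse = 1)
ORDER BY 1;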

SQL Saturday #31 Chicago

The Event
Yes, this was a month ago, but I wanted to get my thoughts down on this event and share the experience from my standpoint. First off, thanks to the folks who spent all their time putting this great event together. Aaron, Wendy, and Ted did a great job of planning and running the show.

Me Volunteer?
I showed up extra early, as I had volunteered to assist. I wasn't sure what I was in for, but I'm easy-going, so I told them I'd do whatever they needed. I ended up working the registration table first thing: jumped right in putting the badge holders together, prepping the tables, then handing out packets, etc. I met some very nice people, and it's funny how, with common interests, you make instant friends. The first part was a bit hectic, and I learned a lot that will help if I'm ever in that situation again. We got through it, and I don't think anyone was disappointed with the process, so that is good.

I was pretty open about helping out wherever needed for the rest of the day. I would have liked to sit in on various sessions, but frankly I was there more to provide help and meet some of the speakers and Twitter folks. To that end, I was assigned as a monitor in one of the rooms. This actually worked out pretty well, as I was able to see some really good sessions and didn't have to jump around and make difficult decisions about where to go. I knew where I had to be and just took in the show.

A New Speaker Is Born
One great moment early in the day was seeing my coworker Dave Levy (Blog|Twitter) present his first SQL community session to a packed house. He had practiced the content with me prior to the event, so I knew what it was about. The great part was seeing him spend so much time preparing for something and just nailing it. It's nice to see people work so hard for something and then do a good job. It was a real feel-good moment for me, and I think it has solidified Dave's desire to keep sharing his wealth of knowledge.

The Real Benefit
The day did get a bit long toward the end, but the best part didn't come until after the sessions were officially over. At the end of the day it was nice to put faces and real-life personalities with Twitter and blog personas. I was able to share a few words with most of them, and in some cases dinner and drinks afterwards. The networking was the best part of the event, and taking advantage of the after-events was the best time spent in my case.

I urge anyone who is interested in getting more involved to volunteer for things like this. Being in the mix is the best way to make good contacts, learn a lot of great things, and meet some really cool people. I plan on being an active member of the SQL community and will look to take a greater role in the local PASS group, as well as in planning and volunteering for the next SQL Saturday in the area.

Bottom Line
Best investment to date in my professional career. It cost nothing but a Saturday and time away from family. I learned a ton, made 20 new contacts, and felt good about giving back to all the people in attendance.