Category Archives: SQL Server

Using Multi Select with SSRS and FetchXML in CRM 2011

I have been doing quite a bit of report development lately with SSRS against CRM 2011 Online for various client projects.

CRM 2011 Online limits you to FetchXML queries only (no SQL!). You can read more about setting up a basic report using FetchXML on TechNet; I'll assume you have already read that and can handle the basics.

What I'd like to describe is how to make a parameter allow multiple selections. I run into this all the time: the client wants to pick one employee or all of them.

In SQL you would use an IN clause, and you can do the same in FetchXML: there is an "in" operator.

So all you need to do is turn on the “Allow Multiple Values” option in the parameter as shown here.

Select Allow Multiple Values

Then change your FetchXML condition operator from “eq” to “in” as shown below.

The parameter selection will be sent to the query as a comma-separated list of values, and the proper data will be filtered as expected.

Change FetchXML operator
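For reference, the change amounts to something like this in the query (the "ownerid" attribute and the @employee parameter are just placeholder names for illustration; use whatever attribute and parameter your report actually filters on):

<condition attribute="ownerid" operator="eq" value="@employee" />

becomes

<condition attribute="ownerid" operator="in" value="@employee" />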

Good luck!

Meme Monday – Backup Fun

Thomas LaRock (blog|twitter) has started Meme Monday and challenged folks to write a blog post in 11 words or less.

David Howard (blog|twitter) has tagged me, so here it goes.

“SQL Backup you’re killing me, please stop failing, just work please”

Bottom line: I have been working along with the team on getting backups right. Not sure there is such a thing. Backups are one of the first things every DBA should be doing, yet they seem to be the hardest to get right. I have been working on this quite a bit lately, so it seemed appropriate for this fun blogging tag game.

Tagging:
Dave Levy has been quiet, so I'm tagging him.

Enjoy!

PowerShell Baby Steps

T-SQL Tuesday

This is my first time participating in T-SQL Tuesday, so I'm not even sure I'm doing this right. I was going to write this post at some point but kept putting it off. However, reading all the great posts today, along with my passion for automation, drove me to share.

Old Backups
The way we have our SQL backups set up, we have a few servers with numerous shares on them, and each server's backups are mapped to a specific share on the backup server(s). Our backup jobs are set to remove backups over x hours old when a new one is created, so we should never have BAK files older than x hours on the backup servers.

In a perfect world we would never have old files, but the world is not perfect. The problem starts when servers are decommissioned or files are put in folders for a restore and forgotten. Essentially, over time we end up with hundreds of GBs worth of old BAK files out on our backup servers. Eventually they get noticed and cleaned up when we get low disk space alerts or someone stumbles upon an old file. Not a huge problem, but there is no reason to have these old files out there, so why waste the space?

So one day I was frustrated by the number of old files out there while arguing with someone over needing more disk space for another project. I was having an angry day and started looking through the folders manually, one at a time. I decided that was a colossal waste of time, so I did some Windows searches for *.BAK files older than 60 days, etc. After asking Dave Levy (blog | twitter) how to copy my Windows Explorer search results into Excel to share with the team, he challenged me to just write a PowerShell script to dump the results to a CSV file.

I don't get to code much these days in my role, so I took the challenge. Well, actually, Dave sent me a script, and I made two small changes to it and ran it. Magic! I had a file in seconds that I could send out to the team so they could go clean up the files. I'm sure you POSH experts are saying, "Duh, that's a no-brainer." Keep this in mind: I turned in my coding badge years ago and struggle at times to grasp new coding concepts.

My Script

# Find all .bak files on the backup server older than the cutoff date and dump the list to a CSV
Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak | Where-Object { $_.LastWriteTime -le '1/1/2011' } | Export-Csv OldBackupReport.csv

Within one day we regained 1TB on our backup server drive, and there was no need to buy more disk!

Can you guess what day I sent out the report?

Look at all that disk we saved!

Conclusion
This could be an automation stretch, as I'm pretty much dumping info out to a file to have people go manually delete the files. I'm sure I could have the script delete them, but I wanted to make sure we were not removing files that might be in process for a restore. I also considered scheduling this to run monthly and email the team, which is something I might do in the future and post about at a later date.
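If I do get around to that, a minimal sketch might look something like this (the 60-day cutoff, the email addresses, and the SMTP server are all placeholders I made up for illustration):

# Regenerate the report of old backup files, then email it to the team as an attachment
Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak | Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-60) } | Export-Csv OldBackupReport.csv
Send-MailMessage -To 'dbateam@example.com' -From 'sqljobs@example.com' -Subject 'Old Backup Files Report' -Attachments OldBackupReport.csv -SmtpServer 'smtp.example.com'

Schedule that as a monthly SQL Agent job or Windows Scheduled Task and the team gets its reminder automatically.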

SQLShare Progress

In December I shared a post about how I was going to use SQLShare to do some continuous learning. It's so cool to get an email every morning showing how well I'm tracking to my goal. As you can see below, I have been tracking pretty well in January. I did adjust my goal down a bit to 60 minutes per month, and as of today I'm at 35 minutes out of 60. Another neat thing is that I'm starting to see some videos done by folks I follow on Twitter; it's nice to hear a voice to go along with the profile pics and tweets.

Check it out; it can't hurt to give it a try.

SQLShare Progress

More than halfway to my goal!

Learn Something Every Day

Something I'm always trying to drive others towards is continuing to learn. I'm all about career development and taking control of my future via continued learning; to a fault, it's sort of a passion of mine. There are so many ways to do this it's mind-boggling: reading books, blogs, online training, classroom training, #sqlhelp on Twitter, etc. It makes me wonder why some folks don't do anything at all.

I was in a meeting the other day and was told by someone that they simply don't have time to learn anything new or to get better at what they do today; they are simply too busy. This totally shocked me, and I really tried to reach out and help guide this person to look inward and focus on themselves. One of my points was: how will you ever get more time unless you optimize how you are working today? Sure, the "we need more help" argument is always there, but that is not in your control most of the time. So how can you get more time? Well, how about learning how to do something better?

And that’s what I’m going to share today.

Zero! Better get learning

One learning opportunity that I have been tracking lately is SQLShare. I get an email every day with a short 1-5 minute training video. If the content interests me, I watch it and learn something new. If not, I just delete the email and move on. What is great is it keeps track for me, so every day I get a reminder of how much time I have spent. It's totaled by month, and there is a nice track record on the site for me to review what I have watched. My kids have reading logs at home they need to complete each month for school, and they are crazy about beating their total hours from one month to the next. Well, this is my log. It drives me nuts when I start to lag behind. When I watch less one month than the previous, it irritates me, and I'll hammer off 3 or 4 videos in a row to get that number back up. This type of daily learning via short videos really appeals to me and my way of learning.

I'm not big on reading technical manuals or books. I like blogs, but if a blog post gets too long and detailed I quickly move on to something else. When reading BOL like Jen McCown is planning to do, as described in her blog, my eyes go blurry after about 2 minutes. That's just not my style. Everyone has their own way of learning. I am really looking forward to following along as Jen reads BOL and blogs on it, as I hope to let her do all the hard work and get what I need from her posts.

In summary, I have found something here that works for me, and I wanted to share it. So if you are interested, check it out and let me know if it works for you.

Thanks Andy Warren, Brian Knight, and Steve Jones for yet another great learning opportunity.
About SQL Share

SQL Denali

I have been tuned in to both keynotes this week at the SQL PASS Summit, really wishing I was there. I was just in Seattle two weeks ago as a Microsoft customer, visiting and learning about some neat new things coming, some of which are being discussed in these keynotes. Seattle is a great place, and I can't wait to get back and have more time to explore.

There are a lot of new things being presented as coming in the next release of SQL Server, code-named Denali. Some of it looks promising in many respects. Here are the things I am very interested in exploring more over the next 6-12 months.

Columnstore Indexes
This is what I'm most excited about, mainly because I come from a BI background and have spent years trying to make DW queries run faster. It's almost a passion of mine, so I'm excited to take this for a test drive in a real situation. There is a very nice whitepaper linked on Simon's SQL Blog that explains this way better than I ever could.

SSIS Integration within SSMS (or, as it will be, Visual Studio)
The current way that SSIS is supported in SSMS is clunky. Some questions I have right off the bat: Can you manage SSIS packages stored on the file system, or only within MSDB? Can you manage packages across versions, which is a problem now? I have SSIS 2005 running and can't manage it from my SSMS 2008. I will be interested to see how this is all going to work.

SSIS Enhancements
The undo feature seemed to be what most folks in attendance were excited about in the demo I saw. What I'm more interested in is the performance gains, if any, that this version will bring; I don't recall hearing that mentioned, or at least it was not a focus of the demo. The data cleansing piece seemed promising, but I have a feeling that the setup and maintenance of the rules will make it a feature that is not well adopted. Every version of SSIS has been a vast improvement, so I'm sure this one will just be more solid than the last. SSIS Junkie has some great links to explore more of what's in store.

Visual Studio vs SSMS
I'm excited to see the movement towards integrating the DBAs' and the developers' tool sets. However, I still struggle with how this will all work and what the adoption rate will be. I still have folks using Query Analyzer; old habits are hard to break, and yes, we still have SQL 2000 instances running production apps. The big win here for me is source control integration with TFS and working in a common tool set. A rough spot at our shop is moving code from Dev to Test to Prod, and I hope this will improve that process over what we have today. The new features in the demo seemed pretty good, and much of what you could previously only get with add-on tools from major vendors is built in, which was nice to see. I'll be going there, but I'm not sure how this will change the daily lives of the team. Brent Ozar did a nice write-up on some of the features here.

Let’s Go
So at this point I'm excited to get a system up with this on it and touch and feel it for myself. In the meantime I'll keep reading and learning from everyone out there rapidly blogging and documenting what they are finding. Enjoy, and let me know which features interest you the most.

Head in the Clouds

My last post was quite some time ago. I explained that I'd be away for a while, as I had committed myself to what ended up being almost another full-time job. It was a great experience and is now winding to a close; the last game is this Saturday. I have one more film night and practice, and then the game. We will have some post-season parties and such, but I should have some more time on my hands.

My Other Team

Me mentoring my other Team!

So what's next… Well, I just returned from a 3-day trip to Redmond to visit the Microsoft campus and learn about the technical road maps of some very interesting technologies. Being currently in a role where SQL is my blood, I was excited about what is coming. The thing that really stuck with me, however, is this idea of the cloud. I had been hearing about it but not quite grasping what it really means. I have seen demos of SQL Azure at SQL Saturdays and such, but I couldn't correlate it to my day-to-day activities. Seeing it explained by the people that are paid to explain it shed some light and really made the idea sink in.

Essentially, the cloud ends up being different for everyone. There are probably pieces of it in your infrastructure today, and you probably don't even realize it. I drew some parallels to the talent acquisition and performance management solution we use at my company. We are using it as a cloud app; essentially, it is SaaS, or Software as a Service. We don't have servers running the apps here in our data center, and I'm not supporting this tool's SQL database. The app is simply a website, and it's all hosted off site. This is an application running on the cloud that is critical to the management of our talent.

Additionally, we have been doing more and more integration with virtual servers via various technologies. We often look to go virtual first on new installations, as the flexibility and built-in support and recoverability are such a win. In essence, this becomes a mini private cloud that our internal IT staff supports.

So those are two small, real examples of what the cloud is to me at this time. I hope to keep researching and learning more about how others are using it. Additionally, I have a personal goal to find a real project here that I can implement on the cloud.

It's starting to make sense, I can see the road ahead, and I'm excited to jump on those puffy white clouds!