Category Archives: Technical

Vista Boot Stuck on a Black Screen with a Mouse Cursor

I’m not your help desk

Ok so a totally crazy name for a post. Here is the deal. I tend to be the tech support for much of my friends and family. My most recent conquest was a very challenging one. It took me a few nights and a lot of reading through forum comments to find the right answer for my case, and searching on the title above is what finally got me there. So I wanted to write a quick post on how I solved it and hopefully help someone out in the future.

So a friend calls up and says, Geek Squad wants 130 bucks to upgrade my laptop to Windows 7. Can you do it? As usual I tell them to go to Geek Squad or use this other friendly company I know that does this on a regular basis. Essentially I don’t want to become their IT support for life. But an hour later he calls back and says, now this laptop won’t even boot. It just comes to a black screen with the mouse after I log in at the Vista login screen. Well, I had some time that night, so I said drop it off and I’ll take a quick look.

The Battle

So I tried the basics. If you search “Vista Black Screen Mouse Only” you will get long forum threads like this with various solutions and things to try, and mostly just people complaining about Microsoft. In any case, this particular friend is not very technical, so he would never have been able to figure this out.

I finally stumbled upon this thread which led me to the answer. It’s a long read and has lots of different answers but let me summarize.

1) Get to a command prompt using any of the ways described on the net. Pressing F8 on boot (or powering off and on the hard way if you have to) should bring up the startup menu with a command prompt option.

2) Get to C:\Windows\System32 and run MSCONFIG.EXE as shown below.


Here is what to type in

3) This should pop up the GUI, where you can choose the selective startup options as shown below.

System Configuration Screen

Change to what's highlighted here!

4) Choose Ok and then restart.
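In case the screenshots don’t come through, here is roughly what the command prompt session looks like for steps 1–3. This is a sketch assuming a default Vista install with Windows on C:; the drive letter at the recovery prompt can differ on your machine.

```
cd C:\Windows\System32
MSCONFIG.EXE
```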

Next Steps

At this point I was able to log in and get the desktop back. I still assumed other things were wrong, so I started cleaning the system with the typical spyware and virus tools. In this case the problem continued to exist, so I used the Windows Easy Transfer tool described here to back up all his data. I then installed Windows 7 and recovered his data from the backup.

At this point his response is that the laptop has never worked better.

Hope this helps someone.


Powershell Baby Steps

T-SQL Tuesday

First time doing T-SQL Tuesday, so I’m not even sure I’m doing this right. I was going to write this post at some point but kept putting it off. However, reading all the great posts today, along with my passion for automation, has driven me to share.

Old Backups
The way our SQL backups are set up, we have a few servers with numerous shares on them, and each server’s backups are mapped to a specific share on the backup server(s). Our backup jobs are set to remove backups over x hours old when a new one is created, so we should never have BAK files older than x hours on the backup servers.

In a perfect world we would never have old files, but the world is not perfect. The problem starts to occur when servers are decommissioned or files are put in folders for a restore and forgotten. Essentially, over time we end up with hundreds of GB worth of old BAK files out on our backup servers. Eventually they get noticed and cleaned up when we get low disk space alerts or someone stumbles upon an old file. Not a huge problem, but there is no reason to have these old files out there, so why waste the space.

So one day I was frustrated by the number of old files out there while arguing with someone over needing more disk space for another project. I was having an angry day and started looking through the folders manually, one at a time. I decided that was a colossal waste of time, so I did some Windows searches for *.BAK files older than 60 days, etc. After I asked Dave Levy (blog | twitter) how to copy my Windows Explorer search results to Excel to share with the team, he challenged me to just write a PowerShell script to dump the list to a CSV file.

I don’t get to code much these days in my role, so I took the challenge. Well, actually Dave sent me a script, and I made 2 small changes to it and ran it. Magic! In seconds I had a file I could send out to the team so they could go clean up the files. I’m sure you POSH experts are saying, duh, that’s a no-brainer. Keep in mind, I turned in my coding badge years ago and struggle at times to grasp new coding concepts.

My Script

Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak | Where-Object {$_.LastWriteTime -le '1/1/2011'} | Export-Csv OldBackupReport.csv

Within 1 day we regained 1TB on our backup server drive, and no need to buy more disk!

Can you guess what day I sent out the report?

Look at all that disk we saved!

This could be a stretch to call automation, since I’m pretty much dumping info to a file and having people go manually delete the files. I’m sure I could have the script delete them, but I wanted to make sure we were not removing files that might be in process for a restore. I also considered scheduling this to run monthly and email the team, which is something I might do in the future and post about at a later date.
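For anyone who wants to take the idea further, here is a rough sketch of what a scheduled version might look like. The server path is the same placeholder as in my script, and the relative date cutoff and the -WhatIf delete preview are my own additions, not something we actually run today:

```powershell
# Sketch only: use a relative cutoff instead of a hard-coded date (60 days here)
$cutoff = (Get-Date).AddDays(-60)

# Same search as the one-liner, just broken onto multiple lines
$old = Get-ChildItem -Path \\YOURBACKUPSERVERNAME\d$\ -Recurse -Include *.bak |
    Where-Object { $_.LastWriteTime -le $cutoff }

# Dump the report for the team
$old | Export-Csv OldBackupReport.csv -NoTypeInformation

# If you ever trust it enough to delete automatically, preview first:
# $old | Remove-Item -WhatIf
```

Scheduling it monthly would just be a matter of putting this in a SQL Agent or Task Scheduler job.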

SQLShare Progress

In December I shared a post about how I was going to use SQLShare to do some continuous learning. It’s so cool to get an email every morning showing how well I’m tracking to my goal. As you can see below, I have been tracking pretty well in January. I did adjust my goal down a bit to 60 minutes per month, and as of today I’m at 35 minutes out of 60. Another neat thing is that I’m starting to see some videos done by folks I follow on Twitter; it’s nice to have a voice to go along with the profile pics and tweets.

Check it out; it can’t hurt to give it a try.

SQLShare Progress

More than half way to my goal!

SQL Denali

I have been tuned in to both keynotes this week at the SQLPass Summit, really wishing I was there. I was just in Seattle two weeks ago as a Microsoft customer, visiting and learning about some neat new things coming, some of which are being discussed in these keynotes. Seattle is a great place and I can’t wait to get back and have more time to explore.

There are a lot of new things being presented as part of the next release of SQL, code named Denali. Much of it looks promising on many fronts. Here are the things I am most interested in exploring over the next 6-12 months.

Columnstore Indexes
This is what I’m most excited about, mainly because I come from a BI background and have spent years trying to make DW queries run faster. It’s almost a passion of mine, so I’m excited to take this for a test drive in a real situation. There is a very nice whitepaper linked on Simon’s SQL Blog that explains this way better than I ever could.

SSIS Integration within SSMS or as it would be Visual Studio
The current way that SSIS is supported in SSMS is clunky. Some questions I have right off the bat: Can you manage SSIS packages stored on the file system, or only within MSDB? Can you manage packages across versions, which is a problem now? I have SSIS 2005 running and can’t manage it from my SSMS 2008. I will be interested to see how this is all going to work.

SSIS Enhancements
The undo feature seemed to be what most folks in attendance were excited about in the demo I saw. What I’m more interested in is the performance gains, if any, that this version will bring. I don’t recall hearing that mentioned, or at least it was not a focus of the demo. The data cleansing piece seemed promising, but I have a feeling that the setup and maintenance of the rules will make it a feature that is not well adopted. Every version of this has been a vast improvement, so I’m sure this will just be more solid than the last. SSIS Junkie has some great links to explore more of what’s in store.

Visual Studio vs SSMS
I’m excited to see the movement toward integrating the DBAs’ and the developers’ tool sets. However, I still struggle with how this will all work and what the adoption rate will be. I still have folks using Query Analyzer; old habits are hard to break, and yes, we still have SQL 2000 instances running production apps. The big win here for me is source control integration with TFS and working in a common tool set. A rough spot at our shop is moving code from Dev/Test/Prod, and I hope this will improve that process over what we have today. The new features in the demo seemed pretty good, with much of what you could get from major vendors’ add-on tools built in, which was nice to see. I’ll be going there, but I’m not sure how this will change the daily lives of the team. Brent Ozar did a nice write-up on some of the features here.

Let’s Go
So at this point I’m excited to get a system up with this installed and touch and feel it for myself. In the meantime I’ll keep reading and learning from everyone out there rapidly blogging and documenting what they are finding. Enjoy, and let me know what features interest you the most.

SQL 2008 Sparse Columns

Sparse Columns
I had my first exposure to one of the new SQL 2008 features that I frankly didn’t even know existed: the ability to turn on an option to make a column sparse. I won’t go into the details of why to use this, as others have covered it pretty well here and here.
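For anyone who, like me, had never seen the feature, declaring a sparse column is just a keyword on the column definition. This is a made-up example table, not one of ours:

```sql
-- Hypothetical table: FaxNumber is NULL for most rows, so store it sparse
CREATE TABLE dbo.Contacts
(
    ContactID   INT          NOT NULL PRIMARY KEY,
    ContactName VARCHAR(100) NOT NULL,
    FaxNumber   VARCHAR(20)  SPARSE NULL
);
```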

Job Failed
I was looking into some job failures this morning, and one in particular stuck out at me. Essentially we have a new job set up that checks a few databases on a SQL 2008 server, looking for tables and indexes that were created without page compression. The job then attempts to alter these tables or indexes with the page compression option turned on. The reasoning behind when to use compression is a topic for another day and varies based on many factors. In this case we have determined that any new objects in these databases should have page compression turned on.

What Happened?
The job is set up to run once a week on Sundays, and this morning the server was red in my Quest Spotlight console, so I started to dig into why. Here is the error from the job, with some names changed to protect the innocent.

Executed as user: XXXXXXX. Cannot alter table 'yourtablenamehere' because the table either contains sparse columns or a column set column which are incompatible with compression. [SQLSTATE 42000] (Error 11418). The step failed.

First off, what a great, informative error message; kudos, Microsoft. I knew immediately why it failed but was not sure what sparse columns were. So I used my trusty search engine and found some info on sparse columns in SQL 2008. Within 2 minutes I knew what the issue was and that I needed to exclude tables with sparse columns from this process, essentially leaving those tables without compression. Here is the query in the job that looks for uncompressed objects.

SELECT DISTINCT OBJECT_NAME(A.object_id) AS [ObjectName]
FROM sys.partitions A WITH (NOLOCK)
INNER JOIN sys.objects B
ON A.object_id = B.object_id
WHERE B.type = 'U'
AND A.data_compression = 0

To quickly fix this problem, I added the following line to the WHERE clause of the query above, filtering out any objects with sparse columns.

AND A.object_id NOT IN (SELECT DISTINCT object_id FROM sys.columns WHERE is_sparse = 1)

Green Again
There is probably a more elegant way of doing this, and I’ll continue to look for it, but for now the job runs, skips the one table with the sparse columns in it, and does what I want. This server is no longer red on my Spotlight console, and all is good, for now anyway.

Bing, Bing, Bing you just lost me!

Quick post; this is just utterly annoying. I find it very hard to believe that I searched so much that I triggered a search engine to prompt me to determine whether I am a program. Maybe I am a program and part of the Matrix?

Gotta love Mondays. Anyway, it looks like my default search engine is going back to the G guys. Bye bye for now, Bing!

Bing Verification

My Browser has Grown Up

I have been a big fan of the Google Chrome browser and have been using it for about 6 months now. It is, in my opinion, way faster than IE and just seems less bulky. I was never a big user of Firefox, but I do know that many IT folks tend to gravitate toward that browser. I’m just a Google fan, I guess. I use the photo editing software Picasa that they provide, and it works for my needs.

I was pretty frustrated by one main thing. I have become a big fan of Delicious recently, and there was no tagging applet for Google Chrome. There was a workaround they had posted for a bit, but it never seemed to work for me. Well, I just happened to go look again the other day and they now have one. You can find it here. If you are not using this tool, I highly recommend it. It essentially becomes my list of articles to go back and read. I often get a fire hose of good stuff and not enough time to read it all when I get it. I end up with 30 tabs open in my browser, hoping to read them all that day. Sometimes this just doesn’t happen and I need to bookmark them for a better time. Today is one of those days: Friday before a holiday weekend, the parking lot is half empty, and not one email in the past hour. A perfect day to catch up on the reading and learning that I say I’m always going to do.

I simply tag the page, and then every so often I go back into Delicious and look at the last few things I have bookmarked. I read what I want, add comments, and tag appropriately so I can find them again when needed. I can access it from any computer, so this works well for things I find interesting at work that I want to get back to, maybe at night.

I know that Delicious can do so much more, but for me it’s simply a tickler file of things to get back to. It’s working well, and I’m glad that Brent Ozar turned me on to this great app in this article. If you care to follow what I’m tagging, you can find my list here.

If you have not tried this out, I suggest you give it a try.