Posted on: Sunday, May 8, 2011 by Rajiv Popat

Thoughts On Tweaking Your Blogging Routine.

Blogs are cool. Blogs can be awesome. Blogs allow you to participate in this amazing thing we otherwise refer to as the internet. The content from Blogs feeds Google, keeps it alive and helps it grow.

From a personal standpoint, blogs are important because they allow you to continue jabbing and help you hone your writing talent; and we all know how important writing is, even if no-one reads what you write.

Seek blogging advice from anyone who has blogged for more than ten posts and the advice you’re going to get is: pick a schedule and live by that schedule.

It is the single best piece of advice anyone can give a young and budding blogger.

It works; till the time it doesn’t work and then you need to tweak it.

Think about this advice in terms of plain old mathematics. It’s like this: you become a better writer by reading more and writing more; and given that you are reading as much as the next person, if you are cranking out four articles a week your chances of getting better at the craft of writing are four times higher than someone who cranks out one article a week. Right!?

Well, the statement is moooostly right.... for you.... if.... you are starting out a blog or want to get into the flow of writing consistently. A regular stream of blog posts on a well-established schedule gets you in the flow for writing.

Besides, it makes life simple for the Google crawler and your readers, because they know exactly how much content to expect from your blog and when to expect it.

It forces you to show up even on the most depressing of days.

Like I said, the advice of writing regular blog posts works.

At-least till the time it works.

And then comes a point of time in your life when the advice stops working and you need to tweak your schedule.

Here are some reasons why you might end up tweaking your publishing schedule:

  1. You’ve done enough jabbing for a couple of years and now you want to move into deliberate practice of writing by producing articles, books or relatively longer essays which need your concentrated effort for a week, sometimes more than a week, sometimes a month and sometimes even multiple months before you can publish them to the world. Posting four posts every week might not be possible here.
  2. You’ve done enough writing about code and now you’re going to be writing even more awesome code or doing something life changing. A classic example of this is Jeff Atwood, who was the primary proponent of the “one step to a successful blog”, which was blogging regularly. Jeff started Stack Overflow (now called Stack Exchange) and slowed down publishing posts on his own blog.

Like I said, the advice works and it has its own benefits while it works.

Then you reach a point in your life when you realize that just jabbing is not taking you to the next level in practicing your craft. You realize that just doing a given number of posts a week isn't enough deliberate practice of your craft.

When you have that realization it is time for you to slow down and focus on what is most important to you.

I’ve been publishing about three posts a week for months now. I've been contemplating the idea of longer articles on topics I feel strongly about, working on the book I said I would be working on, trying out some serious humor and doing some serious bullshit busting.

With those intentions in mind I am going to relax my publishing schedule from three posts a week to sometimes two and sometimes even just one post a week. On any given day the writing I do is probably going to increase. The frequency of publishing, however, might slow down a little in the weeks to come.

What that means is that while the quantity of posts might go down, the quality of the posts that you see here might shoot up.

These posts will be edited much more meticulously. Some of them might be long enough to warrant turning them into articles that you can download in PDF or Kindle formats. You might also continue to get a full blown eBook or Kindle book every few months.

In the fitness world they say that nobody ever gets stronger by doing the same exercise again and again.

In the world of neuroscience they say that nobody gets smarter by solving the same kind of math problems again and again.

Continuously publishing three posts a week was a commitment I made to myself for months and it was a commitment that taught me a lot of things. It has now become a part of my life.

Having said that however, I feel I have grown out of it and it is now time to master other aspects of writing; even if that means reducing the number of posts I publish every week.

Long story short, the blogging frequency of this blog ‘might’ go down from three posts a week to two and sometimes even one a week. But I will hopefully continue to show up without fail. Every week! Consistently. And with fewer posts and more effort, the content is expected to get better.

Expect to see posts with more content, more research, more fun and more takeaways. Expect to see PDF or Kindle versions of articles and occasionally also expect to see some eBooks once or twice each year.

Now, if you are a young and budding blogger seeking advice on how you can become a better blogger, here’s my advice:

  1. Pick a schedule that you are comfortable with and stick to it!
  2. And do NOT change your schedule frequently.

More often than not any temptation to change the schedule is hidden laziness and your lizard brain playing tricks on you, so be very careful before you decide to change yours. And when you do reduce your frequency, make sure you double your efforts.

That, by the way, is EXACTLY what I intend to do on this blog; so do keep reading.

posted on Sunday, May 8, 2011 8:29:27 PM UTC by Rajiv Popat  #    Comments [0]
Posted on: Saturday, November 21, 2009 by Rajiv Popat

User Interface Design Is Not About Lorem Ipsum And Pretty Boxes.

Wikipedia describes Lorem Ipsum in a website design proposal as generally incomprehensible placeholder text that allows viewers to focus on the visual elements, rather than the content.

Lipsum - the online Lorem Ipsum generator - is bullish about Lorem Ipsum's history and future:

Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.

It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged.

It was popularized in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.

To be honest; Lipsum has a genuine reason to be bullish about Lorem Ipsum.

Graphic designers around the world have loved Lorem Ipsum for more than one generation.

Even today; I see countless graphic designers designing their site layouts using Lorem Ipsum; sending the designs out for review and then replacing the placeholder text with real content as the project moves ahead.

Most graphic designers love Lorem Ipsum because it keeps them from having to work with those nasty business guys who are way too lazy to think about or give them genuine content upfront. Lorem Ipsum also shields the designers from those pesky programmers who make them fix their nasty HTMLs every time the designers try to work hand-in-hand with them.

The business folks love it because it allows them to walk up to a creative designer and say - 'I want a marketing website' - without having to worry about giving them any content or specifics before they start designing the website.

The programmers love it; because they can now develop their system without having to worry about synchronizing their development with the pretty-looking-screens.

Throw in a few stock photos; make pretty boxes; fill them up with Lorem Ipsum; and you are good to go. Life; for a designer; was never so beautiful before Lorem Ipsum showed up.

Put simply; at the end of the day; programmers love building ugly systems; graphic designers love building pretty boxes and business users love telling both programmers and graphic designers what they should do without feeling the need to take any real responsibility for the content or the message upfront. Lorem Ipsum is the universal glue that holds this entire ecosystem together and makes all this possible. No wonder; we all love Lorem Ipsum.

Jason Fried at 37signals, however, has the courage to walk a different path and say no to Lorem Ipsum. He advises developers and designers to consider Lorem Ipsum their enemy, not their friend. He explains:

Lorem ipsum dolor has long been known as the designer’s best friend. We think it should be your enemy. Using lorem ipsum dolor reduces text-based content to a visual design element (a “shape” of text) instead of valuable information someone is going to have to enter and/or read.

We recommend that when you build out interfaces you use real and relevant words not “lorem ipsum” representative text. If your site or application requires data input, enter real and relevant words and type the text, don’t just paste it in from another source. If it’s a name, type a real name. If it’s a city, type a real city. If it’s a password, and it’s repeated twice, type it twice.

The goal here is to get as close to the real customer experience as possible. Don’t abstract yourself from the real experience. Every layer removed pushes you further and further away from the actual customer experience.

Ben Hunt, in his book Save the Pixel, explains the same thing from the perspective of a veteran web designer:

Design the content, not the box it comes in.

Use your pixels on things that communicate meaning. It used to be very common for web designers to make just templates – attractive or jazzy containers which would have “content” added at a later time.

This is a fundamentally wrong approach, because it doesn't fulfill the designer's mission - facilitating communication. If you find yourself decorating the package, rather than crafting real, meaningful content, stop & ask: “Are these pixels best used here?”

You want the visitor to focus on the navigation & content as that's where the signposts are that point to the goals.

When you are designing the layout of a website; it is easy for you to get consumed by just the layout and not even think about the content or what the design is trying to communicate in the first place. Ben describes this concept rather articulately in Save the Pixel:

Design isn't Art. It's not about creating beautiful or thought-provoking things for the sake of it. Design is a discipline – creating communication with a purpose.

So the next time you head over to a Lorem Ipsum generator; try spending some more time on your content or your system and try to figure out the message that your application or website is trying to convey through your design. If you don't have a strong message your pretty boxes will not mean a thing.

Now; go think of a concrete message and a decent amount of content for your system before you even open Photoshop.

This, dear reader, is your chance to design something meaningful.

Don't build pretty boxes with Lorem ipsum. Let your design be a part of your communication; what you want to say and what it is that you stand for.

I wish you good luck.

posted on Saturday, November 21, 2009 5:57:57 PM UTC by Rajiv Popat  #    Comments [6]
Posted on: Tuesday, September 18, 2007 by Rajiv Popat

Did You Lose Your Visual Studio Intellisense?

In one of my older posts I announced out loud and shamelessly that I find it really difficult to code without Intellisense. Besides Intellisense, I’ve always been all for tools which increase developer productivity. A couple of months ago I played around with a Resharper 3.0 trial to see what the interesting new features were. I have all good things to say about Resharper 3.0, but this post is not about any of those good things. It’s about one tiny little complaint I have about the Resharper trial.

It killed my Visual Studio Intellisense.

After I had installed it and played around with it for around a week I lost Intellisense from my Visual Studio – suddenly and completely. I could use the Intellisense Resharper offered me, but Visual Studio Intellisense wouldn’t work while the trial was active. Soon the Resharper trial was over and the Resharper Intellisense died as well. Another colleague had a similar problem: she had tried out Resharper and had lost her Intellisense as well. Google around and you'll see a few others running into similar problems.

As soon as I lost intellisense, I hit Resharper Options and told it not to use Resharper Intellisense and to use Visual Studio Intellisense instead – restarted my Visual Studio - didn’t help.

Turns out, whatever Resharper had done to turn Visual Studio Intellisense off is very easy to undo. Turning Intellisense on or off in Visual Studio is a simple option I didn’t know about. In fact, it is one of those thousands of IDE options that a lot of us don’t even know exist.

If your Visual Studio Intellisense does not work because you installed Resharper and it killed your Visual Studio Intellisense (or if your Intellisense doesn’t work for any other reason), here’s the process to get your Intellisense back:

The picture pretty much sums up all you need to do to get your intellisense back in Visual Studio but if you’re a “Follow-the-instructions” kind of guy here are the instructions:

  1. Hit Tools / Options in your Visual Studio.
  2. On The Left Tree click Text Editor / All Languages.
  3. Under Statement completion ensure that “Auto list members” is checked with a tick mark. (Make sure that this option is fully selected and not partly selected)
  4. Under Statement completion ensure that “Parameter information” is also checked with a tick mark. (Make sure that this option is fully selected and not partly selected)

If you’ve downloaded the Visual Studio 2008 / Orcas beta releases and have observed that you do not get Intellisense (for whatever mysterious reasons), the above procedure works with Orcas too and will help you get your Intellisense back. And if you're like me, hopefully with your Intellisense back, the world will seem like a slightly better place to live in. :)

posted on Tuesday, September 18, 2007 8:28:13 AM UTC by Rajiv Popat  #    Comments [11]
Posted on: Wednesday, June 20, 2007 by Rajiv Popat

Download Multiple Files In One Shot With FlashGot.

Ever felt the “need” to go grab all those files you are seeing linked to the current page? Deep-copy type downloading of a website to your box for offline viewing isn’t new. Wget did quite a good job at it back in the day. In fact, even today it does a pretty good job. You give it a URL, you specify a recursion level of “N” and then it goes and fetches all the URLs which are connected to the specified URL and are “N” levels deep.
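
For what it's worth, here is a minimal sketch of that kind of recursive grab; the URL is hypothetical and the exact flags may vary with your wget build:

    # -r / --recursive follows links, -l / --level controls how deep "N" goes,
    # and --no-parent keeps wget from wandering above the starting URL.
    wget --recursive --level=3 --no-parent http://example.com/docs/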

Recently however, I was faced with a slightly complicated problem. What I really wanted to do, was to download a set of 500+ word documents from our work intranet (which runs on SharePoint 2003). The word documents were a part of a SharePoint Document Repository that has a gazillion other Word documents. I was faced with the default SharePoint Document Repository View that has Filter options. After I had set the appropriate filters and was able to see some 500 odd documents that I wanted to download on screen, the million dollar question surfaced in my little mind:

“Ok, now I can see the documents I want. How do I click on all these 500 document links and get them to download at-once, in a single shot, without having to go through the download / save / open dialog box for each and every one of those documents?”

I’m pretty much an Internet Explorer guy because I think it starts up faster than Firefox (and for the Firefox lovers, that’s just a personal feeling so no hate mail please :)) – but these are exactly the scenarios where the power of community plug-ins in Firefox leaves Internet Explorer cold.

FlashGot is a cool little Firefox plug-in that can do some serious damage when it comes to downloading multiple files!

Let’s try and juggle around with some hypothetical situations to see where you would need this tool:

  1. You’re looking at a SharePoint document repository and you want all 100 documents you see on screen downloaded to your box, right now!
  2. You’re looking at a SharePoint document repository, you’ve filtered your view and you want all those documents that you see on your screen downloaded to your box, right now!
  3. You are looking at any screen with X number of downloads and you want all those X downloads on your box, right now! (Ok, I think you get the idea :))

The bottom line is that if you often stare at your browser window, look at a web-page and tell yourself – “wow, this page has so many useful links - I wish I could download all of them in a single click” this tool is the Holy Grail you’ve been looking for!

Besides letting you get all those links, FlashGot lets you filter based on file types. So if you are on a page which has hyperlinks to 100 documents, 100 MP3s, 50 GIFs, 20 other HTML pages and 12 text files, and you want to download just the 100 documents in a single click, the FlashGot “More Options…” menu comes to the rescue. The "General" tab in this same dialog also lets you pick your favorite download manager.

You just pick the type of files you want to download, click the [Tools / FlashGot / FlashGot All…] menu, and pick a folder. That’s it. FlashGot rips the URLs and adds them to your favorite download manager. Which means you can now let the download manager slog away at saving each download while you give yourself a well-deserved nap. :)

Initially, when I was told we needed a set of 500+ Word files from a SharePoint list, my first reaction was that I would have to use the SharePoint web-services and throw out some custom code to do this. But then, why write code when you can get the same results much faster using a tool? :)
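
For the curious, the "custom code" route might have looked something like this hedged PowerShell sketch; the file name, the download folder and the idea of exporting the filtered view's URLs into a text file are all my own assumptions, not something SharePoint gives you out of the box:

    # Sketch only: assumes doc-urls.txt holds one document URL per line, that C:\Downloads
    # exists, and that your Windows credentials can read the SharePoint document library.
    $client = New-Object System.Net.WebClient
    $client.UseDefaultCredentials = $true
    Get-Content .\doc-urls.txt | ForEach-Object {
        $name = [System.IO.Path]::GetFileName($_)
        $client.DownloadFile($_, (Join-Path 'C:\Downloads' $name))
    }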

I downloaded some 500+ Word attachments from a SharePoint 2003 filtered view using FlashGot and I’m a FlashGot fan already! Give it a shot, it’s free.

posted on Wednesday, June 20, 2007 1:56:12 PM UTC by Rajiv Popat  #    Comments [0]
Posted on: Tuesday, February 6, 2007 by Rajiv Popat

Fixing Package Load Failure Error with Windows Workflow Foundation and Visual Studio

I've often said that the really good part about Windows is that what can be done, or sometimes... what happens ("seemingly automatically"), can be undone (and explained). The same is also true for Visual Studio 2005. On rare occasions there are errors in the IDE which seem magical and mysterious. With the right debugging techniques, a few commands and some time, these mysterious errors can be fixed and explained.

I've been busily coding away at Windows Workflow Foundation for a few months now. Currently I'm using the final released version of Windows Workflow Foundation with the released version of the Workflow extensions for Visual Studio, and my experience with WF has been very good. So when this strange error popped up today for no particular reason, I was a little surprised.

The error wasn't very helpful in describing what was going wrong. All it stated was: Package Load Failure. It basically said - "Package 'Microsoft.Workflow.VSDesigner.DesignerPackage, Microsoft.Workflow.VSDesigner, version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' has failed to load properly (GUID = {some_guid})..." - and then it went on to ask me if I wanted to disable my workflow foundation extensions for now.

Even though my workflow project opened and worked perfectly this was just an irritating error which popped up every time I started Visual Studio 2005. Determined to put an end to this harmless yet irritating error, I started my investigation and Google searches.

I started off with basic commands like - "devenv.exe /setup" - which did not fix the issue at hand. Then there were posts out there which suggested that I delete my native images and regenerate them using the ngen.exe tool. That solution did not seem to work either.

After a few hours of Google searches I came across this post which basically described what the issue was. Turns out, a few days ago I had tried my hand at writing Visual Studio Add-ins and had forgotten to delete an Add-In I had written. Remains of my HelloWorld Addin existed in "C:\Users\admin\Documents\Visual Studio 2005\Addins" on my Vista Partition.

All I had to do was delete the ".Addin" file (which, by the way, had nothing to do with Windows Workflow Foundation, so it's a little difficult to suspect it would be the cause of the problem) and as soon as I had done that, Visual Studio started without any Windows Workflow extension add-in errors.
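
If you'd rather not hunt the file down by hand, a PowerShell one-liner along these lines does the same clean-up; the path below is the default add-ins folder on my machine and may well differ on yours:

    # List stale add-in registration files; drop -WhatIf once you're happy with the list.
    Get-ChildItem "$env:USERPROFILE\Documents\Visual Studio 2005\Addins" -Filter *.Addin |
        Remove-Item -WhatIf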

If this doesn't do the trick for you - here are a few other commands / resources which are usually helpful in catching and fixing Visual Studio.NET 2005 IDE issues:

  1. devenv.exe /setup - very useful for a host of other IDE setting related problems or if you just want to restore Visual Studio settings to their original state. It did not help in my case because the failing add-in was something I had written myself and my problem had nothing to do with Visual Studio settings. In my case, the settings were just fine, even though I had initially suspected that they had gone corrupt.
  2. ngen.exe - Aaron Stebner's post describes this pretty elaborately. The basic idea here is to delete your Visual Studio native image files and regenerate them using this tool.
  3. devenv.exe /log - this generates an XML log which can be quite helpful in troubleshooting various startup issues with the IDE. (The devenv switches are shown right after this list for quick reference.)
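
For quick reference, the two devenv switches from the list above look like this when run from a Visual Studio 2005 command prompt:

    devenv.exe /setup    # re-merges package metadata and restores IDE settings
    devenv.exe /log      # writes an XML activity log you can dig through for package load failures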

And if you're still struggling, Aaron Stebner also provides a definitive list of workarounds for Package Load Failure errors in Visual Studio 2005, here.

We spend so much of our time with the IDE every day writing code. Mysterious errors like this one give us a chance to look under the hood and see what's going on. It's just like the joy of fixing that little problem in your car yourself. After all, knowing how to fix minor problems with your car, or knowing how to replace a flat tire, is also important besides knowing how to drive. Isn't it? :)

posted on Tuesday, February 6, 2007 6:02:34 PM UTC by Rajiv Popat  #    Comments [0]
Posted on: Wednesday, November 29, 2006 by Rajiv Popat

Enabling ASP.NET 2.0 Debugging on Visual Studio 2005, IIS 7.0 and Vista

For the past couple of years my desktop ran on a Windows 2003 Server which was tweaked to make Windows look like a Mac. I loved my desktop, but I also loved everything that had been going on on the Windows Vista side and had been trying things out on VMWare instances and other non-work machines since the Beta 1 days. I had always said that I would move to Vista as soon as Microsoft came out with a released version; I just wasn't adventurous enough to run a beta version of an operating system on my primary work machine. Yesterday, I finally made the move to Vista.

As a developer / geek, there were a few minor hiccups in 'getting up and working' and I'll probably post about all of those and their workarounds in other posts, but this post is focused on getting Visual Studio 2005 to run with F5 / Play based debugging on ASP.NET 2.0 web-sites. I thought I would post about and document this, since this was the trickiest part and 'almost' kept me awake all night, trying to figure out what the heck was going on.

The Vista install itself was smooth. Minor hiccups here and there. Surprisingly, there were no driver issues (things have come a long way since the Beta 1 days, when I had a bad time with drivers). I didn't have to install a single driver manually. It detected everything on my Dell 700m and pretty much installed it, automatically. Sweet! Most hiccups were compatibility issues with 3rd party tools, which is expected anyway, because of the enhanced security. Workarounds and alternate tools that work in Vista were pretty easy to find. Yes, there were some very minor issues with the default display drivers - which work fine with extended monitors but somehow will not let me run on an external projector. I think I can live with that till Dell comes out with their own drivers. [This was resolved. If you googled to this page in search of Intel-Adaptor-on-Dell display driver issues for Vista, see the update at the end of the post.]

The real "Opps!" moment however, came when I was done with installing IIS 7.0 and started installing Visual Studio 2005. This is when I was warned that Visual Studio 2005 has known compatibility issues with Vista. I ignored the warning and the installation continued smoothly. Finally I decided to test my installation by making a simple ASP.NET 2.0 Hello-World website. Apparently, Studio would not let me open a website from the Local IIS.

The warning said: "You must be a member of the Administrator group on the local computer to access the IIS...". This, however, is a known issue and has to do with Vista's On-Demand Administrator (i.e. UAC) feature. Installing the additional IIS components, then right-clicking Studio's shortcut and choosing "Run As Administrator", fixes this issue. There's a detailed post on this both on Scott Guthrie's blog and on MSDN. Both of these are pretty helpful and elaborate, so I won't repeat that information here.

After I did everything mentioned in the MSDN / Scott's post, I could now open web sites from the local IIS. But wait, that was not the complete fix. Once those steps were applied there were issues with Visual Studio, which wanted me to have Integrated Windows Authentication on the virtual directory. Now this is something that is not installed by default with IIS 7.0. Which means that I had to explicitly go to "Turn Windows features on or off" and install Integrated Authentication for IIS. (I went ahead and installed all three since I am used to, and use, all three - Integrated, Digest and Basic - in different projects. With this done I configured the virtual directory to use Integrated Windows Authentication using IIS Manager.)
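
If you prefer the command line, something along these lines should turn on the same authentication features on Vista; this is only a sketch and the feature names are from memory, so double-check them against the "Turn Windows features on or off" dialog before relying on it:

    # Run from an elevated prompt; installs Windows (Integrated), Basic and Digest authentication for IIS 7.0.
    pkgmgr /iu:"IIS-WindowsAuthentication;IIS-BasicAuthentication;IIS-DigestAuthentication"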

So, was that it? Not really. I still wasn't able to debug my website. This time Visual Studio showed a security dialog box telling me that it wasn't able to start debugging on the web server:

The workaround for this is to configure the site / virtual directory (depending on what you're trying to debug from Visual Studio) to run under the Classic .NET pool. To do this, just right-click the site in the new IIS Management Console and click "Advanced Settings". In the property pane that opens up, change the Application Pool setting to "Classic .NET AppPool" instead of "DefaultAppPool".

With this done I opened my sample site in Visual Studio.NET 2005 and clicked the Play button (F5 key) and the Debugging worked, just like it should!

Yet another tricky gotcha here is that changing the application pool for the website to the Classic .NET AppPool doesn't seem to change the pool for all virtual directories under it. In other words, even if you moved your entire site off the DefaultAppPool, a specific virtual directory you are trying to debug may still be running under it, and you need to go ahead and manually change the Advanced Settings of that virtual directory to run the Classic .NET AppPool, to enable debugging on it.
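
If you'd rather script the pool change than click through Advanced Settings for every virtual directory, appcmd (which ships with IIS 7.0 in %windir%\system32\inetsrv) can do it; the site and application names below are hypothetical:

    # Move a specific application / virtual directory onto the Classic .NET pool so F5 debugging works.
    appcmd set app "Default Web Site/MyDebugSite" /applicationPool:"Classic .NET AppPool"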

There were some 'really convincing' posts that I came across in a couple of forums (I can't seem to find the links to those) where people suggested that F5 / Play-and-debug is something you cannot do in Vista. I guess that information is either old (from early beta builds) or just inaccurate. I would have liked to post a reply there and clear this up, but I can't seem to find the links again now that I've reached the right answer. So I'm going to post this article here and hopefully Google will index it and help others who're trying to get this to work.

Update [12/29/2006]: If you are looking for more details and other approaches, Mike has recently provided detailed step-by-step instructions on various approaches you can take and has described several tradeoffs associated with each approach. His detailed post is available here.

Update [02/07/2007]: Even though this post is not directly related to Dell 700m Intel display cards and Vista drivers for these cards, a lot of people seem to be googling their way to this page in search of similar answers. This update might help. If you are running the Intel(R) 82852/82855 graphics adaptor on your Dell 700m and are having problems switching to clone mode with dual monitors or running on a projector by pressing the Fn-F8 key, start the Add New Hardware wizard on Vista, select the manual driver installation process, choose Display Adaptor as the type of device and, from the list of drivers Vista offers, choose Intel as the manufacturer and "Intel(R) 82852 / 82855 GM/GME Graphics Controller (Microsoft Corporation - XDDM)" as your driver, even if Vista had detected your driver successfully during installation. Complete the wizard and reboot. Once the reboot is complete, you should be able to see two display adaptors with the same name installed in your Device Manager. Now disable the extended desktop and you should be able to switch to a projector or clone your desktop on your second monitor using the Fn-F8 key.

posted on Wednesday, November 29, 2006 10:52:23 AM UTC by Rajiv Popat  #    Comments [0]
Posted on: Tuesday, November 21, 2006 by Rajiv Popat

Skin PowerShell (Monad) to Customize its Look and Feel

If you feel that typing "Start / Run / Notepad.exe" is faster than clicking on the Notepad icon, or if you spend more than 15 minutes on the command prompt every day, you probably know a lot about Monad / PowerShell by now. I fell in love with this one the day I saw it in its pre-release versions. Of course I didn't see the light instantly, but it grew on me - slowly - over time.

I won't waste a lot of your time posting about the things that can make you fall in love with this tool - for example, the fact that it returns objects instead of strings, or the fact that it could change the world (no kidding!), or the fact that you can access .NET DLLs from within PowerShell. I won't even state the fact (ok, personal opinion :)) that it's way cooler than any Linux console I've ever worked with.

I could write tons of posts on PowerShell because I've been hooked on it, but then in all probability, if you're here (and are still reading this), you're hooked on it too and you probably know all that stuff already. And if you aren't hooked and you're just the curious type, go ahead, click some of those links I mentioned above and read a little. The learning curve will be a little steep at first but I guarantee that you'll "see the light" soon. Honest!

I can go on and on about PowerShell basics for a very long time. But then, I've been busy, and now I realize that I'm a little late in posting about that. People everywhere have been doing an awesome job of writing about PowerShell and most of the basic stuff anyone would want to find out is already out there.

People have been building Utility Scripts, Powershell Analyzers and some are even developing Sharepoint Providers for PowerShell (neat idea!). But being the stupid guy that I am, for the past couple of months that I've been playing around with PowerShell, there's just one thing that has been pinching me:

"Okay, All this is cool and I get-it, but on a slightly different note, How do I Skin this thing and make it look nice so that I can show-it-off to everyone else while I am working inside a Powerhsell window?"

And then there were others who were asking similar questions. On the PowerShell team blog there are remarks like:

"Absolutely no improvement over the ugly looking command window. With the name change if someone in your group maybe can push for tabbed Power shell?"

And the reply is pretty much a shout to the 3rd Parties to build-this-thing that lets you Skin PowerShell:

"We share your pain. We Really do. This just fell into the 'to ship is to choose' category. We designed it so that 3rd parties could do this. (3rd parties - do you see how many people would be interested in a great PowerShell Host?)!"

That's how most of us are - aren't we? We just want to see the Dancing Banana in our Development IDEs! What the IDE or the Product does is just so irrelevant! If it can't show the dancing banana we just aren't happy! :)

And I've been looking for this thing, because this-thing-that-lets-me-customize-PowerShell-look-and-feel is something that "has to be developed" by someone! I mean, come on! People have written IDEs for this thing! Somebody must have written something that lets me skin PowerShell!

Since Google started crawling this site I'm seeing a jump in visitor counts from around the world, which is kind of interesting and fun. So, if you've landed on this page from Google just because you were looking for a similar answer, you're in luck! Yes, it's possible to skin PowerShell and make it look the way you want it to.

Turns out, there is, in fact, an uber-cool free and open source application that lets you do just that. It's called Console, and even though I've used Console before, for quite some time, I didn't quite figure out that Console is NOT just a command prompt replacement. Console works with virtually anything - CMD.EXE, Cygwin and a host of other shells. So basically, there's no reason whatsoever why it shouldn't work with Monad / PowerShell.

Since I like the Mac look so much, let's make PowerShell have some background transparency so that we can see my Mac'ish wallpaper behind it. Long story short, let's make PowerShell look something like this:

The steps are pretty simple and straight-forward:

  1. Get Console.exe (Don't get the 2.0 "Demo" version because that's WIP and doesn't do much. Get the stable release instead.)
  2. Go to Control Panel / System / Advanced Tab / Environment Variables and create a new variable called "COMSPEC". Set its value to "Powershell.exe" (assuming you have PowerShell installed already; you can also do this from Console's configuration files, but this is the easy way). A command-line way of doing the same thing is sketched right after this list.
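
If you'd rather set the variable from a prompt than click through Control Panel, setx (built into Vista) does the same thing; the path below assumes a default PowerShell install location:

    # Point COMSPEC at PowerShell so Console launches it instead of cmd.exe (takes effect for newly started processes).
    setx COMSPEC "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"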

That's it. You're done. Start your Console and it starts up skinning PowerShell instead of the usual command prompt. You should now be able to skin it and theme it using all the rich options Console provides in its configuration files. And if this isn't enough, go ahead, see if you can have the Dancing Banana in PowerShell! :)

posted on Tuesday, November 21, 2006 11:23:29 AM UTC by Rajiv Popat  #    Comments [2]
Posted on: Friday, October 20, 2006 by Rajiv Popat

A huge Trace.Log file deleted and a couple of Gigs reclaimed!

There is one thing good about Windows: what can be done, or sometimes... what happens ("seemingly automatically"), can be undone (and explained). Last week one of the 3 power-horse machines that I work on mysteriously slowed down. I'm a multi-tasker who has three instances of Visual Studio.NET open with at least 10 other applications running simultaneously at any given point of time.

So, initially, this slow-down seemed "normal" and the tendency was to monitor the RAM usage. But Task Manager and Process Explorer did not reveal anything peculiar. After some more investigation and using FileMon it was evident that the bottleneck was actually the hard disk, which kept constantly thrashing. The instant reaction was to defrag the disk, which pointed out the real problem. The defragmenter completed "successfully" with an error :)

The error (included in the Defragmenter report) was that it wasn't able to move a particular file "C:\WINNT\system32\LogFiles\WMI\Trace.log" (No, I'm not using the primitive Windows NT - that's just how I like to name my Windows folder :))

But the real shock came when I tried to get to the file using explorer and see what's going on. Here's what I mean:

Now, I have three problems with this picture here:

  1. That is an 80 gig disk that cost money to buy. If there's going to be a 2.5 gig file somewhere on that disk, I'd better know about it. (A quick way to spot files like this is sketched right after this list.)
  2. Whatever wrote that darn thing to this disk wasted quite a bit of processor and RAM writing it.
  3. I couldn't delete it. Apparently, some process seemed to be using it and locking it!
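
Here is that quick PowerShell sketch for finding unusually large files; the drive letter and the 1 GB threshold are just examples, so adjust them to taste:

    # Walk the drive and list anything bigger than 1 GB, largest first (access errors are ignored).
    Get-ChildItem C:\ -Recurse -ErrorAction SilentlyContinue |
        Where-Object { $_.Length -gt 1GB } |
        Sort-Object Length -Descending |
        Format-Table FullName, Length -AutoSize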

A little more Googling on large trace files revealed that there's a utility out there that lets you disable trace logging and then allows you to delete these files. Back in my MCSE days and the days of NT 4.0, people like me, who were both into IT and development, used to talk a lot about Resource Kits and Option Packs and things like that. But not a lot of people seem to be talking about Resource Kits these days.

As it turns out, there's a Resource Kit for Windows 2003 available here. And in that long list of tools and utilities is a tool called TraceLog.exe. A quick "TraceLog.exe -l" told me what was occupying the file and writing away to it. A quick "TraceLog.exe -stop" allowed me to stop the NT Kernel Logger from Trace Logging.
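
In case it helps, the two commands (run with the Resource Kit tools directory on your PATH) look like this:

    tracelog -l       # list the active trace sessions and what they are writing to
    tracelog -stop    # stop the NT Kernel Logger session so the trace file can be deleted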

Once trace logging stopped I was able to delete the log file off my disk and re-run the defragmenter successfully without any errors.

What really bothered me was not knowing what had enabled trace logging and caused the creation of a 2.5 gig file on my disk. With the file now wiped off and the problem solved, I decided to read a little more and do some investigation into the possible causes of this problem.

Discussion threads like this one revealed that the real problem was BootVis.

BootVis is a tool which is supposed to provide faster boots with Windows XP. I had played with it a few days ago and realized that it does NOT work very well with Windows 2003.

BootVis had started trace logging. And in all probability, the NT Kernel Logger had diligently continued logging since then (even after BootVis was uninstalled) just because no-one told it that it was ok to stop logging.

Long story short, I have a couple of gigs won back from a zombie file that I wasn't going to need or read anyway, and the defragmented disk seems much faster than yesterday; and that makes me a happy man for today.

posted on Friday, October 20, 2006 10:53:08 PM UTC by Rajiv Popat  #    Comments [5]