Uber Workstation: Windows Vista vs. Windows Server 2008

by Jon Davis 25. February 2008 12:08

I have always been adamant that as a web developer it is far better to use Windows Server 2003 rather than Windows XP as your primary workstation. This view became necessary primarily because Windows XP had a stripped-down set of IIS services: it was IIS 5.1 rather than IIS 6.0, and it was constrained to disallow multiple virtual hosts on the same machine. This made XP worthless to me; as a web developer, being forced to build entire web applications as "subwebs" made everything infinitely more difficult to develop against. For example, you could never have a simple hyperlink that starts with a slash ("/"). You had to build everything around the ASP/ASP.NET coding model of the application root ("~/"), which required you to move all of your hyperlinks into server-side code (<asp:Hyperlink>, or <img src="<%= ResolveUrl("~/") %>images/bleah.gif">).

No more. Windows Vista supports multiple web sites in IIS. Microsoft perhaps got tired of basically every web developer on the planet expressing their animosity towards the Windows team for crippling IIS without even so much as an alternate "IIS add-on for MSDN Universal subscribers" or the like. Vista gets full-blown IIS 7, same as in Windows Server 2008.

Now that Windows Server 2008 is released, the inevitable questions should be asked (rather than the answers assumed based on prior experience with XP / 2003): does Windows Server 2008 have any new features, absent from Windows Vista, that a typical ASP.NET web developer would want on his workstation? And does Windows Vista have any undesirable features, not present in Windows Server 2008, that cannot be removed from Vista?

While the answer to both of these questions was "yes" for XP/2003, for Vista/2008 I believe the general answer to both is "no".

In Windows 2008 there are a gajillion new services that the next wave of Internet technologies will need on hand for regular development. For developers of one of these next-gen technologies, Server 2008 might be essential. But for basic ASP.NET and WCF development (in other words, for most web developers), Vista can suffice.

And Server 2008 doesn't really strip anything out of the Vista experience, except that the Vista experience is optional rather than mandatory. That's nice; but if the machine is going to be used as a workstation, it makes sense to just add it. The only problem is, it's not a complete Vista experience; you don't get the Sidebar, for instance, and Call of Duty 4 crashed for a co-worker / friend who agreed to be a Windows Server 2008-as-a-workstation guinea pig. And to be honest, I feel more uncomfortable having all the undesirable new bells and whistles of Server 2008 available on my workstation than having them missing from a Vista environment.

The only features I saw in Server 2008 that I didn't see in Vista that might be worth something to me were: Multipath I/O, TCP port sharing, and hypervisor (native virtualization) support (which is still in beta). Actually, Vista might have the first two of the three; I don't recall. And I already have VMware Workstation, which I continue to prefer over that awful Virtual PC platform. Meanwhile, pretty much all of the other stuff, valuable as some of it may be, is so server-oriented rather than development-oriented that it would make more sense to move it to a VM or an external environment anyway.

So my tentative conclusion is that Vista Ultimate is already the ideal environment for a web developer. With it, you have all the basics that you need to build multiple IIS solutions and to test basic WCF solutions. Meanwhile you get to keep the fluff you like (and I do like some fluff on my workstation, gimme Sidebar and stuff), while you can still kill off the fluff you don't like.


Operating Systems | Microsoft Windows

LION Searching, AJAX Coding With jQuery and ASMX, Jaxer Tinkering, QAM HDTV on VMC, COD4 Gaming, CacheFile Forums, and Jabber Services

by Jon Davis 24. February 2008 23:49

I haven't blogged much lately but there are reasons for that. Lots of little things going on. Lots of things needing closure.

Lots of things, though, that have recently enjoyed closure, and lots of new beginnings.

Blogging about blogging is a big no-no, but let me summarize some of the kewl things I've been either up to or else observing:

  • At work, our internal Lucene server that I built had been flaking out. I finally fixed the glitch, and I was able to continue improvements to it. In the process of trying to figure out the problem, I checked out Lucene Solr a bit. When basic unit tests failed, likely due to learning curve issues, we decided to stick with our own. 
    Ours also has a new name: LION (Lucene Index[ing] On/Over .NET). I'd like to open source it someday but being as it's becoming an integral part of our web site foundations, that isn't likely to happen soon.
    It uses WCF to be accessible over a variety of communications protocols including HTTP (SOAP), TCP, and Named Pipes.
    It uses a custom Query object that supports condition trees, field selections, ordering .. basically all the stuff supported in a Lucene query, but generalized beyond Lucene (so a SQL Server or a Solr query handler can also be built). A few things I am looking into for enhancing LION are:
    • Faceted searching - This is a term CNET seems to have coined with Solr, but it's a commonplace trend.
      The idea is basically to get a list of "narrow-down" hyperlinks in the margin of search results, along with a count for each; for instance, at eBay, when you search for "dell", the left sidebar of your search results shows a list of clickable filters along with hit counts, such as "Laptops (8270)". When you click on one, you get more hyperlinks that become more specific as they fall under the filter you clicked.
      In engineering terms, in the case of Solr, when you perform a search, the search results are cached, and each hit is evaluated against a list of several known categories. Each hit is flagged for each match with a BitSet (something .NET developers are not typically keen on; think List<bool> but more performant). From the BitSet, counting and filtering can be performed quickly.
      The hassle is that faceted searching has several components. You have to consider a) which fields are considered facet categories, b) whether they are defined by the document schema or by the query, c) filtering the search on the facet, and d) returning the facet counts with the search results.
      I can't move forward yet until I feel confident that I haven't missed a lot here. This is just where I'm at in my understanding of faceted searching thus far.
    • XML-defined schemas - Right now, Lucene document structures are defined in non-serializable C# classes; that is, they "serialize" by way of manual propagation of property values to a Lucene Fields collection. Ugh. This is an area where, when I saw Solr, I slapped myself on my forehead and moaned, oh brother, what have I done?!
    • A QueryBuilder object - This is what I typed up yesterday as a design reference (the first sample will be built, not the second, but the second is sort of a pseudo-TSQL explanation of the first):
      Query q = new QueryBuilder(   // most members return a created instance of this
          lionHandler)              // allow the handler to make proprietary query adjustments         
        .Select("IdxA.Field", "A")
          .And("IdxB.FieldB", "B")
          "IdxB.D < 299",           // conditioned facet counting
          "Dingbats $0 - $299"      // facet renaming
          "IdxB.D >= 300",
          "Dingbats at least $300"
        .From("IndexA", "IdxA")     // index selection, 2nd param is alias
          .And("IndexB", "IdxB")
        .Where(                     // returns a ConditionBuilder object with a parent QueryBuilder
          "A < B"                   // condition compares against another field
        .And("B LIKE \"ASDF\"~5")   // parsing
        .AndWhere                   // nests a BooleanCondition
          .Where("A > 4")
          .Or("A < 2")
          .Escape                   // escape nest; optional
        .AndWhere                   // nests another boolean condition, sibling to prior
          .Where("IdxA.C < 50")     // this one excludes a range
          .Or("IdxA.C > 500") 
        .AndThen                    // returns the parent QueryBuilder instance
          // todo; see FacetCount for syntax
        .Order("A", "B")            // sort the results
        .Page(2, 20)                // page 2, 20 hits per page
        .Query;                     // generate the query object
        // equivalent:
      Query query = QueryParser.Parse(
        lionHandler, @"
          IdxA.Field AS A,
          IdxB.FieldB AS B
          IdxB.D < 299
            AS " + "\"Dingbats $0 - $299\"" + @"
          IdxB.D >= 300
            AS " + "\"Dingbats at least $300\"" + @"
          IndexA AS IdxA,
          IndexB AS IdxB
          A < B
          AND B LIKE " + "\"ASDF\"~5" + @"
            A > 4 OR A < 2
            IdxA.C < 50 OR IdxA.C > 500
          --todo, see FACETCOUNT for syntax
        ORDER BY A, B
        PAGE 2
        HITSPERPAGE 20";
      As is evident, there are a LOT of new features to be implemented in the actual service in order to pull a query like this off, such as comparing one field against another (a feature SQL users are spoiled by).
    • Index joining - I would love to be able to filter out the results of two distinct queries and join on a field value of each, i.e. an INNER JOIN. That's much further down the road, though.
  • A co-worker has been having a lot of fun with jQuery lately .. actually, both of the co-workers on each side of me have been. But on Friday the fellow engineer to my left had fun getting jQuery to do AJAX/JSON calls to ASMX and SubSonic, where the ASP.NET web app had a reference to System.Web.Extensions.dll (from GAC-deployed ASP.NET AJAX; this DLL is the dependency for JSON-enabled ASMX). He had to add [ScriptMethod] attributes to the method signatures, and the MIME type in the AJAX request from Javascript *HAD* to be application/json or else the ASMX would refuse to serve JSON. This required a custom getAsmxJson() function for jQuery, but fortunately jQuery, thanks to Javascript itself, is ridiculously easy to extend.
    In the end, he was able to build a framework where he could draw up a simple HTML form, a quick and dirty SubSonic object definition, and a simple ASMX handler, to end up with a completely AJAX-controlled form management system. 
    I suppose in retrospect he could've just used a ScriptManager and WebForms for even more hands-off ease. I think the long-term goal for all of us, though, is to move away from WebForms and move towards a web services oriented web architecture.
  • The same fellow also took a look at Jaxer. I knew he'd get into this.
    I didn't realize it, but since Jaxer is an Apache mod, it will support working with PHP-generated output. Nor did I realize that the discussion of wanting an ISAPI module for IIS would not mean an alternative ISAPI handler to ASP.NET, but rather a complementary one, just as Jaxer already works alongside PHP on Apache.
    The discussions, then, are valid, that a) <script runat="server"> won't work in an ASP.NET environment combined with Jaxer, and b) .NET-enabled Jaxer is otherwise feasible with the creation of an ISAPI module. Just needs to happen already.
    These are exciting times. Jaxer is cool stuff. There are a lot of concerns, though, that Jaxer, having the Mozilla DOM on the server (unique to each request and/or user session), will not scale well. Fortunately, it can be argued that server hardware capacity is now continuing to grow at a faster rate than user demand. I don't know how true that argument is, though.
  • At home, I obtained a Silicon Dust HDHomeRun unit to enjoy QAM digital cable over my analog cable subscription, piped to Vista Media Center as though it were an ATSC broadcast. Vista Media Center [still] does not support QAM (unencrypted, open television over cable) broadcasts for whatever reason, but HDHomeRun is one of only a couple of known workarounds. It has two tuners, both supporting either QAM (cable) or ATSC (antenna). It not only works, it works surprisingly well. The signal quality and quantity of ATSC (antenna) broadcasts is remarkable; it's tragic that you can't combine QAM and ATSC on Vista Media Center (because VMC doesn't know that the two tuners are different). But the tie-in with VMC was otherwise nearly seamless. I am now recording shows like American Idol, Nature, The Tonight Show, etc., in amazing high definition detail.
  • I am also finding myself firing up my Xbox 360 a lot less lately. The majority of my Xbox 360 time before had been for the Vista Media Center Extender, to watch live and recorded TV over the network from my PC. Now that I have an HTPC right in my living room, there's rarely any need for the Extender; I only need it now if either I'm already in the Xbox "desktop" or else I'm finding the PC overwhelmed with other tasks or something, such as if audio and video are out-of-sync, which happens less when viewing from the Xbox.
    I also installed Call of Duty 4 on the HTPC today, and I downloaded (and purchased) Pinnacle Game Profiler, which makes COD4 on the PC work pretty much just like COD4 on the Xbox, and so far it seems to work great!! I also got it all working with Half-Life 2, which required me to reconfigure it slightly so that it didn't always sprint when pushed fully forward; I mapped sprint to the left joystick button and the flashlight to the right joystick button.
  • I finally restored forums for CacheFile.net at http://cachefile.net/forum/. I also upgraded to phpBB 3; hopefully its CAPTCHA is better; I may end up doing some home-grown CAPTCHA.
    I had shut it down because of the spam that was posted there every day, most of which was pornography, and some of which included actual porn images in the posts. It's just awful, as cachefile.net doesn't target an age group at all.
  • Over the last couple of weeks, I also got e-mail services working on the dedicated server that hosts cachefile.net. You can reach me at jon@ or at stimpy77@ at that domain name. Forum posts will now propagate announcement e-mails properly, too.
  • Speaking of cachefile.net, I have been keeping it up-to-date. Updates have been coming about every two weeks or so now. These updates were part of the motivation to get the forums back online; I thought about adding a blog instead, but I figure, nah, a discussion forum is better. We just need to get the spam under control.
  • This evening I set up Jabber services at headsense.com. (Feel free to sign yourself up. My account there is stimpy77 -AT- headsense.com, with proper treatment of "-AT-" of course.)
  • There's more I've been up to, but these are all I have time to bring up now. I wish I could blog in detail about all of these things, but at least I mentioned each of them.
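To illustrate the BitSet-based facet counting described in the LION notes above, here's a minimal sketch. It's in Java (Lucene's and Solr's native language) rather than LION's actual C# API, and all the names are illustrative, not real LION or Solr identifiers: one BitSet per facet value marks which documents match that facet, and AND-ing it with the cached search-result BitSet yields the per-facet hit count.

```java
import java.util.BitSet;

public class FacetCountSketch {
    // Count how many search hits fall under one facet value, e.g. the
    // "Laptops (8270)" number in eBay-style sidebar filters. Names here
    // are illustrative; this is not LION's or Solr's actual API.
    static int facetCount(BitSet searchResults, BitSet facetDocs) {
        BitSet intersection = (BitSet) searchResults.clone();
        intersection.and(facetDocs);        // docs in both sets
        return intersection.cardinality();  // number of bits set
    }

    public static void main(String[] args) {
        BitSet results = new BitSet();      // doc ids 0,1,2,5 matched "dell"
        results.set(0); results.set(1); results.set(2); results.set(5);

        BitSet laptops = new BitSet();      // doc ids 1,2,3 are in "Laptops"
        laptops.set(1); laptops.set(2); laptops.set(3);

        System.out.println(facetCount(results, laptops)); // prints 2
    }
}
```

Since the result set is computed once and each facet check is just a bitwise AND plus a popcount, this stays fast even with many facet categories.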



The Best Summarization of Web 2.0 Ever

by Jon Davis 24. February 2008 23:35

If you were ever confused about what "web 2.0" really means, forget about everything you've picked up on regarding:

  • technologies (mashups, syndication)
  • web design ("the web 2.0 look")
  • MySpace / facebook
  • RSS / Atom

.. and just watch this video, which in less than 5 minutes of speechless video summarizes where we were in web 1.0, where we are now, and what our objectives are going forward. Very well done, very beautiful. Also a bit dated. Web 2.0 as a topic of discussion is behind us, I suppose .. call it nostalgia. :)



Computers and Internet | Web Development


by Jon Davis 13. February 2008 10:16

A few days ago I posted a link / observation of LINQ-to-Javascript a.k.a. JSLINQ.

Ajaxian just pointed out a new project called LINQ-to-JSON that is more LINQ-like. One would use JSLINQ for working with any type of arrays in real Javascript. The LINQ-to-JSON solution looks like a truer LINQ coding experience; however, I can't tell if it's actually for Javascript or JScript.NET, the latter of which is not web browser Javascript and only runs in .NET. I'll check it out further but wanted to raise my eyebrows first. :P



Open Source | Software Development | Web Development

Stay At Home Server

by Jon Davis 12. February 2008 16:05

I dunno how long I'm going to keep up this silly home entertainment PC line of blogging but ...


Nothing you never needed nor wanted to know about computer servers... in your home. This microsite uses comedy and charm to lure people to Windows Home Server, which is kind of a mom-and-pop downgrade from the Windows Vista Ultimate-based 24/7 PC.

LOL @ to the housewife, "Jealous?"



Computers and Internet | Microsoft Windows

Xbox-style LOTRO With SwitchBlade

by Jon Davis 11. February 2008 02:11

Before I call it a night I wanted to blog a note that I thought I'd stumbled across mention somewhere that it was possible to play a PC MMORPG with a wired Xbox 360 controller. Sure enough, I found the answer to this at http://www.switchbladegaming.com/. It targets World of Warcraft alone for now, but the controls are customizable, and LOTRO is a WoW look-alike anyway, so the mappings pretty much just work. There was a little bit of a learning curve, but once I realized that I need to hold the right bumper down to rotate my character I was good to go.

I didn't have time to play it much, but testing it, it seemed promising. An MMORPG is meant to be played with a keyboard handy--those are real people you're playing alongside, after all, so you should be free to chat. But I like the idea of running around as a Hobbit without a mouse when I'm sitting at my sofa.

Tomorrow or sometime this week I hope to try the Xbox controller with my PC flavor of Call of Duty 4. I had the Xbox flavor of the game as well, but I immediately sold it when I realized that the PC copy would suffice.



PC Gaming | Xbox Gaming

Blog Went Down And Then It Went Up Again .. Boing Boing

by Jon Davis 10. February 2008 20:59

I am on the slow but inevitable journey to reclaim my home office. Out goes the ridiculously noisy home computer, with big, noisy, dust-ruined fans on top, back, side, and inside, which served as a server but sounded like a jetliner. (Actually, it's still in there ... turned off. I haven't arrived at the end of my journey yet.) I decided that if I want to use my home office as a home office and be productive away from the TV, I need to get the "jetliner PC" out of there. But I'll have to figure out what to do with it, because it serves my Windows Media Center / TV / DVR stuff out to my Xbox 360 over the network, hosts my blog web site, and holds my music library.

I've got three PCs: the "jetliner PC" for e-mail and home services like Media Center; a gamer PC that I originally bought for music production but, not having had time to follow through on that, turned into a gaming PC; and my laptop. I decided to replace the "jetliner PC" with a living room HTPC, or at least replace the case and CPU, and set it up to use the Blu-ray drive that I've been trying and failing to sell on Amazon.com Marketplace. I'd tried to use the Blu-ray player on my gaming PC from across the room, but after dragging DVI and optical cables all the way over, I couldn't get optical S/PDIF audio output to work with PowerDVD and my external audio adapter for my audio receiver. So after spending a couple of weeks trying to get that kludgy system working, I had given up and bought a standalone Blu-ray player. But with an HTPC, maybe I can get it working after all, and then sell off the standalone player and get some money back.

After researching HTPC computer cases, based on price and user ratings I opted for the Antec Veris Fusion Black 430. This in turn required me to look for a new micro-ATX motherboard since I couldn't just swap in my plain ATX motherboard from the jetliner PC. I ended up getting the Gigabyte GA-G33M-S2H. So far so good, although one of the reasons for choosing it was its HDMI output, which I ended up not using because its HDTV signal stinks (doesn't fit the desktop to screen like my GeForce 8800 GT does with the nVidia Control Panel). The on-board video was adequate for normal use but not for multimedia use; I didn't expect it to be, but I was curious and wanted to try. So anyway I ended up putting my GeForce 8800 GT in it, which leaves me without a decent video card in my gamer PC. Hmph.

Anyway, my blog was also hosted on the jetliner PC, so now I've finally migrated it to my HTPC. As you can see, the blog's back up, and the HTPC is up and running in my living room where it will be running 24/7, making less noise than the air conditioner ventilation and only a hair more noise than my Xbox 360 -- not truly silent, but acceptably quiet.

With the awesome but expensive GeForce 8800 GT video card, I'm finding Blu-ray performance quite acceptable; I was able to watch Spiderman 2 on Blu-ray without hiccups and in high resolution (although not without tweaking the tint and other color settings on the TV to get rid of the over-contrast, over-saturation, and deep purplish tone). On the other hand, as of this post I have yet to see how Blu-ray playback will perform with IIS and SQL Server installed.

The LCD front panel on the Antec case is unreadable, no matter how much I toy with the contrast setting with the front panel utility software. It's not broken, it's flawed; I can read it if I squint hard enough or look closely, but at ten feet away my vision gets a little blurry and the blue backlighting against the bright blue foreground doesn't help at all.

The IEEE 1394 (Firewire) interface didn't have a connector on the inside, so the jack on the front panel is worthless, but fortunately I don't do video camera work and there's a working IEEE 1394 interface on the motherboard on the back panel.

I only got the S/PDIF optical audio output working with PowerDVD after I installed the latest audio drivers and then configured it to use dts 5.1 from Windows sound control panel.

I bought a Western Digital 1TB external drive to back everything up, as I had intended to take the two drives used in the RAID 0 array of my jetliner PC and make them dual standalone drives in the HTPC, starting from scratch. But Windows Vista Complete PC Backup (for emergency rollback) failed me. I had backed everything up across two partitions of the RAID array; I backed up the ASP.NET files, too, but forgot to back up this blog's database. So when I took the two drives back to the jetliner PC and ran the Windows Vista restore process, it told me that my hard drives were too few or too small. This made no sense, since everything was exactly the same; the RAID array had to be re-provisioned, but first I tried re-balancing the partitions, and then I tried the original partition sizes. No matter what I tried, Vista refused to restore. Phooey.

A little Googling, and Microsoft Virtual Server 2005 R2 SP1 (yes, ALL of that in the name) came to the rescue. I installed only the VHDMount utility, with which I mounted the backed-up hard drive partitions in the Complete PC Backup archive. Ha! I grabbed the database files from the old Program Files directory, and .. well, here we are.


Computers and Internet | Electronics

Dear Internet: Stop Using Dead URLs As URIs

by Jon Davis 7. February 2008 10:52

Microsoft, this especially goes to you.

It's confusing as heck when protocol-declarative dead URLs are used as URIs. It's bad netiquette and very rude; please don't do that.

As many people know, a URI is an identifier, whereas a URL is a locator. The former is a unique ID for a resource, the latter tells the world how to access the resource over the Internet. They are often one and the same, but a URI is expected to never change without suggesting the change of the resource itself, whereas a URL can point anywhere at any time.

URLs are, in fact, URIs, although URIs are not necessarily always URLs. The url http://cachefile.net/scripts/jquery/1.2.2/jquery-1.2.2.js is also a URI that identifies the specific location of the given resource.

However, in the view of some people, the URI http://schemas.xmlsoap.org/soap/envelope/ is not a URL. Actually, in this case, it happens to be a URL, because if you navigate to it, a schema document is actually returned. But what would happen if no document were returned from that URL? Would it still be a valid URI? Sure.

Some people, such as Microsoft, seem to think that because a URI can look like a URL without actually being a URL and still be a valid URI, it should. No, Microsoft, and whoever else does this. You shouldn't.

Don't get me wrong, URIs that are not URLs are an acceptable practice in general. My issue is that the scheme of "http" should not be present in the URI if HTTP is not relevant to identifying it. A valid URI that is not a URL would be xmlschema://schemas.microsoft.com/2003/10/Serialization/Arrays, whereas an unacceptable URI that is not a URL would be http://schemas.microsoft.com/2003/10/Serialization/Arrays.
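To make the distinction concrete, here's a small sketch in Java; its java.net.URI class parses any well-formed URI regardless of scheme, which illustrates the point that nothing about URI syntax requires "http" to appear when HTTP isn't how the resource is meant to be reached:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriVsUrl {
    public static void main(String[] args) throws URISyntaxException {
        // Both strings parse as perfectly valid URIs; only the scheme
        // differs. The second one honestly declares it isn't fetchable
        // over HTTP.
        URI asUrl = new URI("http://schemas.microsoft.com/2003/10/Serialization/Arrays");
        URI asId  = new URI("xmlschema://schemas.microsoft.com/2003/10/Serialization/Arrays");

        System.out.println(asUrl.getScheme()); // http
        System.out.println(asId.getScheme());  // xmlschema
        System.out.println(asId.getHost());    // schemas.microsoft.com
    }
}
```

(The "xmlschema" scheme here is the same hypothetical one used in this post; it isn't a registered IANA scheme.)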

If you're going to put HTTP as the scheme of a URI, put a web site there and make it accessible. I would much rather an actual web site exist at http://schemas.microsoft.com/2003/10/Serialization/Arrays than be presented with the URI xmlschema://schemas.microsoft.com/2003/10/Serialization/Arrays; but to have the HTTP URI with no web site backing what the URI is communicating to the reader is just wrong.

I don't take exception to URIs containing tempuri.org, which is shown with a scheme of "http". That is a stub that Visual Studio / .NET throws in, and it's the maintainer's responsibility to revise it. Same with datacontract.org, which I find showing up in WCF communications. Still, using the TLD of ".org" is unacceptable, as it suggests that an organizational entity is behind the URI. A more appropriate TLD is no TLD at all, really; follow "tempuri" with a single dot ("xmlschema://tempuri./resource"), which globalizes the name.



Computers and Internet | Software Development | Web Development

Beyond Disabling UAC: Enable Networkable Admin Access

by Jon Davis 6. February 2008 12:34

Windows Vista and Windows Server 2008 both disable remote administrative access for local accounts when you connect over the network. So all those administrative things you're used to doing, like accessing an administrative share (\\machinename\D$), have to be thrown out when you use Vista or Server 2008.

However, you can bring it back, Windows XP / 2003 style. The key is in the registry, at HKLM\Software\Microsoft\Windows\CurrentVersion\Policies\System. Add a DWORD value named LocalAccountTokenFilterPolicy with a value of 1. Reboot.
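Equivalently, the same change can be made in one line from an elevated command prompt (same key and value as described above):

```shell
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" ^
    /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f
```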


Microsoft Windows

Beyond Disabling UAC: Disable Virtual Store

by Jon Davis 5. February 2008 07:22

Something I like about Windows Server 2008 x64 is that it (finally) gives the user the benefit of the doubt when disabling the advanced security options in Internet Explorer. Now it automatically prompts me to install ActiveX controls, for instance, and when I download files from the Internet I no longer have to right-click the file, choose Properties, and "Unblock" it before I can use it without security warnings (something I've been habitually doing on all file downloads since IE7 was released).

But all is not trusting. I was tinkering with the new OS when I noticed, while saving stuff into a new subdirectory of my Program Files directory, that the new subdirectory didn't actually exist. Namely, I downloaded Notepad2 and attempted to create a new directory at C:\Program Files (x86)\ called "Notepad2" where I would save the file, then open the directory up in Windows [File] Explorer to unblock and extract the .zip file. Lo and behold, my Internet Explorer "Save As..." dialogue box told me I was looking right at C:\Program Files (x86)\Notepad2, but Windows Explorer insisted that no Notepad2 directory existed in C:\Program Files (x86). Could it be a bug?

Directory virtualization, perhaps? Indeed, I've seen Microsoft do this more and more lately. I knew where to look: C:\Users\jdavis\ ... hmm, that's right, Local Settings got moved to AppData\Local. VirtualStore? Yes! There it is! "Program Files (x86)", and in there, a "Notepad2" directory, all by itself.

I don't want this. I REALLY don't like this. Microsoft implemented this virtualization feature to work around insecure design bugs in software. Whose software, though? Theirs? Ours? Third parties?

I mean, come on, Microsoft, if you're going to virtualize the Program Files directory like this, go all the way with it and do it in Windows Explorer and the command prompt as well. Heck, do it at the kernel level so that any app running in user space sees this thing where it really is.

Or not. I don't like virtualized paths. It's an administrative nightmare. Let's disable this thing.

So, after turning off UAC from the User Accounts control panel, which I hadn't done yet to this point, I rebooted and still had this problem. Then I tried disabling Local Security Policy -> Security Settings -> Local Policies -> Security Options -> User Account Control: Virtualize file and registry write failures to per-user locations. I think this fixed it. I'll update this blog entry if I find otherwise.
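For what it's worth, that Local Security Policy setting is, as I understand it, backed by a registry value in the same Policies\System key that holds the other UAC policies, so it should also be scriptable; verify the value name on your own build before relying on it:

```shell
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" ^
    /v EnableVirtualization /t REG_DWORD /d 0 /f
```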

I realize why Microsoft implemented this file path virtualization thing, but IMO it's a crutch and does NOT demonstrate good computing practices, despite what some IT folks would proclaim. This is the kind of stuff that just makes computing all the more confusing and difficult to work with. While the intentions were valid, we don't need any more unexpected twists and turns in our computing experiences.

UPDATE (1/17/2009): This HORRIBLE "feature" ended up in Windows 7 as well!! To fix it now you need to open "Security Configuration Management" where you'll find Local Policies -> Security Options -> "Virtualize file and registry write failures to per-user locations" and disable the thing.


Microsoft Windows


Powered by BlogEngine.NET
Theme by Mads Kristensen

About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of jondavis.net have no affiliation with, and are not representative of, his former employer in any way.
