Monthly Archives: September 2003

Mysticism and the PDC

I’ve followed with interest some of the discussions touched off by Ole’s entry on “Oh, not Oooh” and his follow-up piece. On the whole, I have to agree with his overall thesis regarding the run-up to the PDC: even though I work for Microsoft and think the PDC is going to be awesome, I’ve found the quasi-mystical aura that many people (whom I won’t mention by name) are trying to impart to the conference somewhat confounding at times.

My biggest issue with the mysticism is that it engenders this whole “complexity vs. simplicity” debate that’s swirled around Ole’s entries, and which I think is a red herring. The question is not how complex or simple a technology is; the question is how organic it is. Case in point: in many of these discussions, the transition from VB6 to VB.NET is held up as a canonical example of moving from less complexity to more. But the truth is that, from a complexity standpoint, VB6 was probably a more complex product than VB.NET is. What’s different, though, is how the two products surfaced their complexity to the user. VB6 did an excellent job of tucking away the things that most people didn’t need in their day-to-day work – maybe a little too well at times. And it did an excellent job of sticking features right where you expected them to be. (A big feature for Whidbey is going to be returning Edit and Continue to the product, but the reality is that before VS 2002 most people probably weren’t even aware that Edit and Continue was a feature at all. They just did what they wanted to do and it worked. No thought required.)

In many ways, though, VB.NET is much simpler in terms of design and implementation than VB6 was. The problem (such as it is) is that complexity in VB.NET has been surfaced in ways that still don’t feel entirely natural. Indeed, a lot of the work our team has before us in Whidbey and beyond is not removing complexity but creating a more natural way of working with the product. This is what I mean when I say that technology should be “organic.” Well-designed products don’t surface every bell and whistle to the user; they follow a principle of Everything in its Place. Using technology should be all about a natural unfolding of the capabilities of the technology as you need them, not shoving all this cool stuff in your face at once. And despite all the hoo-hah about complexity, I think the .NET platform has actually gotten a pretty good start on this, even if there are some rough edges that need to be worked on.

Ultimately, I think that this is really what the PDC is going to be about: letting people know how we’re going to make your life easier and more natural, not how we’re going to “change Windows as you know it.” If we do our job right, developers moving to Avalon and Longhorn and Yukon and Whidbey and so on should feel like they’re meeting an old friend whom they haven’t seen for a long time. Sure, there’s going to be a lot of catching up to do, but you should also immediately feel that familiar rapport, that feeling that you’ve known them all your life.

Whether we achieve that remains to be seen, and I think that’s worth showing up to the PDC for…

Grading on the curve

Chris has posted some continuing thoughts on the question of whether it’s a good idea to grade people on a curve when doing performance reviews (or any other place where grading occurs, I guess). Like Chris, I believe pretty strongly in grading on the curve, but I still have lots of reservations about it. That’s because a curve can so easily be misused in a way that is damaging to employees, the company, or both. But the alternative, inevitable grade inflation, seems to be worse.

(Grade inflation is something I got enough exposure to in college, thank you very much. Which is not to say that I wasn’t hypocritically disappointed when my university started awarding graduation honors on a curve, beginning with my class. It turned out that something like 50% of the class had been graduating “cum laude” or better, and it is numerically impossible for every student at a university to be above average.)

Ultimately, it reminds me of Churchill’s famous line that democracy is the worst form of government, except for all the others that have been tried. Same for the curve, same for the curve…

Feature scheduling and the ‘Using’ statement

After introducing it with much fanfare, I’ve been very negligent in actually answering anything submitted to Ask a Language Designer. So let me start to make amends. The first questions I want to address were narrowly asked but touch on a broader question. Both Muhammad Zahalqa and Kevin Westhead ask why VB doesn’t have a ‘using’ statement like C# has and when/if we will have one in the future.

The general question of “Why doesn’t VB have feature x?” is, as you can imagine, one that keeps coming up. It’s gotten even more common since C# came onto the scene, since VB and C# are much closer to each other than, say, VB and C++ are. So I thought I’d spend a little time talking in general about why some features that exist in other languages don’t make it into a particular release of VB. And, in the process, I’ll address the question of the ‘using’ statement.

When a feature doesn’t make it into VB, it’s typically because of one of the following reasons:

  1. It’s a bad feature. I’m throwing this in here because it does happen every once in a while that we get a request for a feature that some other language has that we think is just a bad idea. There isn’t anything that I can think of off the top of my head that’s fallen into this category, but it does happen.
  2. It’s not a good match for VB. There are some features that we think just don’t make sense for our customers and the kinds of things that they do. It’s not that we think nobody would ever use the feature, just that it’s very unlikely that most of our users would need or use it. The prime example of this is the unsafe code feature in C#. Although there are situations in which people use unsafe code, they tend to be pretty limited and very advanced. In general, we believe that even very advanced programmers can happily exist in the .NET environment without ever needing unsafe code, and that it tends to mainly be programmers coming to C# from C and C++ who find the feature useful. (Our experience since shipping VS 2002 and VS 2003 has so far validated these beliefs.)
  3. It’s not a good cost/benefit tradeoff. Some features are good ideas, but the benefit gained by having the feature doesn’t seem to justify the time it would take to design, specify, implement, test, document and ship it when compared against other features we want to do. To pick a prosaic example, the C# command-line compiler csc.exe has a feature where it will read in a “default” response file that contains references to all of the .NET Framework DLLs. This means you don’t have to explicitly “/r” DLLs like System.Windows.Forms.DLL when compiling an application on the command line. (There’s a sketch of what this looks like after the list.) It’s a nice idea and handy on the command line, but in the past the overhead of implementing the feature in vbc.exe was judged to be higher than its benefits, so it didn’t make it into the product for VS 2002 or VS 2003. The danger, of course, is that little features like this can end up being constantly cut from release after release because they’re not big enough to be “must have” features, but not small enough to escape the ax. (You’ll just have to give it a try in Whidbey to see whether it made it in this time or not…)
  4. We ran out of time. This has got to be the most painful situation for the product team, because it’s something we desperately wanted to get done but for one reason or another, we just didn’t have time. So instead we have to sit back and suffer the slings and arrows of outrageous fortune for not having the feature, contenting ourselves only with the knowledge that next time will be different. A prime example of this is operator overloading. We really wanted it for VS 2002/2003 but it was too big to fit into the schedule. We’ll definitely be doing it in Whidbey.
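
To make item 3 concrete, here’s an abbreviated sketch of the kind of default response file in question (csc.rsp, which sits next to the compiler; the real file references more assemblies than this, and /noconfig suppresses it):

    # Abbreviated sketch of a default response file like csc.rsp.
    # Each line is a command-line option the compiler picks up automatically.
    /r:System.dll
    /r:System.Data.dll
    /r:System.Drawing.dll
    /r:System.Windows.Forms.dll
    /r:System.Xml.dll

With that file in place, “csc app.cs” just works; with vbc, each /r: reference has to be passed explicitly on the command line.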

So what about the ‘using’ statement? One of the major struggles of the VS 2002 product cycle was trying to figure out how to deal with the transition from reference counting (i.e. the COM model) to garbage collection (i.e. the .NET model). We spent a lot of time trying to see if we could make reference counting coexist with garbage collection in a way that: a) was reasonable, and b) was comprehensible by someone less than a rocket scientist. While we were doing this, we deferred making any decisions on garbage-collection-related features like the ‘using’ statement. After banging our heads against the proverbial wall for what seemed like forever, we came to the very difficult conclusion that we couldn’t reconcile the two and that full-on garbage collection was the way to go. The problem was, we didn’t reach this dead end until late in the product cycle, which meant that there wasn’t enough time left to properly deal with a feature like ‘using’. Which, let me just say, really, really, really sucked. So the painful decision was made to ship without the feature and, yes, we said to ourselves next time will be different.
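
For those who haven’t run into it, C#’s ‘using’ statement is essentially shorthand for a Try/Finally block that cleans up an object when you’re done with it. A minimal sketch of the equivalent pattern in VB today (the file name is just for illustration):

    ' The pattern that C#'s 'using' statement automates: cleanup is
    ' guaranteed by the Finally block whether or not an exception occurs.
    Dim Reader As New System.IO.StreamReader("data.txt")
    Try
        System.Console.WriteLine(Reader.ReadToEnd())
    Finally
        Reader.Close()   ' Deterministically releases the file handle
    End Try

A ‘using’ statement collapses that boilerplate into a single block, which is exactly why people keep asking for it.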

Is this time different? This is another one of those things that you’ll have to try out in Whidbey to see… (wink, wink)

Visual Basic Decodes Human Genome

I was reading the entry by Philip Greenspun that’s been floating around the blogosphere today comparing Java to SUVs. Interesting, but can you figure out what in the following quote caught my eye?

A project done in Java will cost 5 times as much, take twice as long, and be harder to maintain than a project done in a scripting language such as PHP or Perl. People who are serious about getting the job done on time and under budget will use tools such as Visual Basic (controlled all the machines that decoded the human genome). But the programmers and managers using Java will feel good about themselves because they are using a tool that, in theory, has a lot of power for handling problems of tremendous complexity.

VB controlled all the machines that decoded the human genome? I’d never heard that, but there’s at least some evidence on the web that it’s true. (Really, it’s one of those things that’s just “too good to check.”) Does that mean we can take downstream credit for all the things we learn from the human genome? “Visual Basic Finds Cure For Cancer?” I guess we’ll just have to wait and see…

Fantasy vs reality

After letting it languish on my hard drive for several months, I finally uninstalled Arx Fatalis today. I’d made it (I think) about halfway through the game before losing interest and never getting back to it, which was disappointing. One of my all-time favorite computer games is Ultima Underworld, which the Arx designers said in interviews they were trying to emulate. Unfortunately, while they borrowed much of the interface logic and much of the basic plot of Underworld, somewhere along the line they lost whatever it was that made Underworld such an awesome game.

I think one of the biggest problems with these kinds of games today is that they end up focusing on trying to give the player a detailed world rather than an enjoyable story. Comparing Arx and Underworld, the former has a much more detailed world, both in terms of backstory and in terms of physical reality. Whereas the Stygian Abyss in Underworld was nothing more than a crude approximation of a real space – I think the dwarf “city” was just one big room with a couple of dwarves in it – the city of Arx has architecture, a castle, denizens who have jobs, etc. But the designers get so caught up in creating someplace “real” that they start to forget about what makes a game actually fun. (There are also some really practical problems with creating spaces that approximate the real world, namely that you end up spending a lot of really uninteresting time backtracking through areas that you’ve already visited to get to somewhere else. If I want to do that, I can go take a hike in the real world.)

This same problem also really ruined another game I should have loved, Deus Ex. After all, it was created by the same guy who oversaw my hands-down favorite game ever, System Shock. As a side plug, if you’ve never played System Shock, I highly recommend tracking down a CD of it in a remainder bin somewhere and doing whatever you have to do to your computer to get it to run. It was the most amazing game I’ve ever played, not least because I really felt like I was there on Citadel Station. Even now, I can recall a tactile sense of the layout of the station and some of the more important locations. And all this with a graphics engine that would be laughable today.

When I first heard about Deus Ex, I was very excited, although I was worried by the interviews in which the team went on and on about how they had adapted real-world locations in loving detail for the game. Sure enough, the game does render futuristic versions of real locations in great detail. The game starts in New York City on Liberty Island, and when I was in New York recently I found myself on a boat sailing past Liberty Island. I was idly staring at a dock on the far side of the island when I suddenly had a shock of recognition – I’d been there! Well, not really; I’d just been there in the game, but the game had been accurate enough that I could actually recognize the layout of the island. But even though the game was so “real” in this sense, from a game perspective the world was… well… empty. As accurate as they were, all the real-world locations in Deus Ex felt like disconnected set pieces. This was compounded by the fact that I ended up not giving a damn about the character I was playing. I mean, the guy’s brother is being offed by an evil group bent on world domination, and all I’m thinking is “Damn, couldn’t they get a better voice actor for this gig?” Contrast that with, say, Max Payne, which had exactly the same kind of New York set pieces and the same over-the-top mythical pretensions (while Deus Ex steals from Christianity, Max Payne steals heavily from Norse mythology). Even though the places in Max Payne were less realistic than those in Deus Ex, it kept my attention throughout the entire game. And whereas I uninstalled Deus Ex halfway through, I had a great time with Max Payne through the very last gun battle.

Ultimately, I think how “real” a game is has nothing to do with whether the game is any good. The good games out there make me identify with the protagonist or the story, just like a good movie, book or TV show. They feel more real in an emotional sense… more human. Clichéd as he was, Max Payne was much closer to a real person than the robotic J.C. Denton. And the “how the hell do I get out of here?” plot of Underworld was much more accessible than the “I’m a messenger from the gods from another plane sent to stop another god from blah, blah, blah, blah” plot of Arx.

Now if someone would just rebuild System Shock and Underworld on top of the latest Unreal engine, I’d be as happy as a clam…

Fair warning…

As you may remember, two months ago I moved from the old BlogX codebase to some custom software that I wrote based on BlogX. When I did that, the RSS feed for my site moved to http://www.panopticoncentral.net/rss.aspx, and I put a permanent redirect at the old URL. Well, the time has come to drop the old URL, so if for some reason you still haven’t moved over, please do so ASAP. It’s going to stop working in the next few days. You’ve been warned!
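
For the curious, a “permanent redirect” is just an HTTP 301 response. Here’s a minimal sketch of how a page at the old URL might issue one, assuming ASP.NET; the file name is hypothetical:

    <%@ Page Language="VB" %>
    <script runat="server">
        ' oldrss.aspx: point aggregators at the feed's new home.
        Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs)
            Response.StatusCode = 301   ' Moved Permanently
            Response.AddHeader("Location", "http://www.panopticoncentral.net/rss.aspx")
            Response.End()
        End Sub
    </script>

Well-behaved aggregators see the 301 and update their stored feed URL, which is why you should switch now: once the redirect is gone, requests to the old address will simply fail.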

Useless language constructs

Frans points out some C# and VB language constructs that he thinks are superfluous. In the interests of a deeper understanding of the language, here’s the thinking behind the three VB ones:

  • ReadOnly and WriteOnly property specifiers. When we were coming up with the new property syntax in VS 2002, we discussed this issue in great detail. Some people felt that the extra bit of explicitness was nice, but what really got it into the language was interfaces and abstract properties. In both of those cases, you don’t actually specify an implementation for the Get and the Set, but you need to specify whether you expect them to be there or not. One way of attacking the problem is the way C# does it, where you just specify an empty Get and/or Set. But with VB’s line orientation, this ended up wasting a lot of screen real estate (and it just looked weird in VB to have a Get with no End Get). ReadOnly and WriteOnly were the main alternative, and we liked them much better. (There’s a sketch of this after the list.) We did talk about whether they should be optional for properties that do have a Get and/or a Set, but we felt it was better to be consistent about when we did and did not require them.
  • WithEvents with form controls. The truth is that WithEvents fields hide a whole lot of magic. (If you’re interested in exactly what, you should check out what the VB language specification has to say about WithEvents.) The most important thing that happens, though, is that a WithEvents field is always wrapped in a property. This is not a minor thing, for two reasons. First, it changes the behavior of the field when it’s passed by reference – real fields can be passed true byref, while properties have to be passed copy-in/copy-out. (I just recently wrote the part of my language book on this; maybe I’ll post some of that here at some point.) But more importantly, accessing a property has slightly more overhead than accessing a field, so there’s a slight penalty to declaring a field WithEvents. (See the second sketch after the list.) Given those two things, we decided to retain the VB6 convention of specifying WithEvents so that it wasn’t something done lightly or accidentally. The problem now, though, is that designers such as the Winforms and Webforms designers just put WithEvents on things willy-nilly so that they can add handlers later, resulting in the opposite of the intended effect. This unintended consequence is something we’re trying to work through in Whidbey.
  • Overloads when Overrides. This is really just a bug in the language design. (Yes, we do have those.) Frans’s logic is completely correct on this point, and it’s something we were already looking at fixing. But for now, you’re stuck with it…
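
To make the first two points concrete, here are a couple of minimal sketches (all the type and member names are made up for illustration). First, ReadOnly on an interface property, where there is no Get body to carry the information:

    Interface IShape
        ' ReadOnly states the contract (a getter, no setter) without
        ' requiring an empty Get...End Get block.
        ReadOnly Property Area() As Double
    End Interface

    Class Circle
        Implements IShape

        Private m_Radius As Double

        Public ReadOnly Property Area() As Double Implements IShape.Area
            Get
                Return System.Math.PI * m_Radius * m_Radius
            End Get
        End Property
    End Class

Second, a WithEvents declaration, which looks like a field but is silently wrapped in a compiler-generated property so that handlers can be hooked up and torn down whenever the field is assigned:

    Class Form1
        Inherits System.Windows.Forms.Form

        ' Looks like a field, but the compiler wraps it in a hidden
        ' property, hence the byref and overhead differences noted above.
        Private WithEvents Button1 As System.Windows.Forms.Button

        Private Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
            MsgBox("Clicked!")
        End Sub
    End Class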

Feel free to throw other things you think are superfluous my way through comments or through Ask a Language Designer. (Yes, I am going to be answering some questions from there soon. I promise!)