Posted: Fri Mar 14, 2008 6:24 pm    Post subject: Why Win32 is a bad platform for games - part 3
The previous installment ended with a cliffhanger: what's wrong with mouse input on Win32?
If you're like me, you've been using mouse input on Win32 for years without much trouble. There are many APIs that give you reasonable results, from GetCursorPos() to WM_MOUSEMOVE to DirectInput to Raw Input.
For a game that uses absolute positioning, such as mine, the standard Windows messages for mouse events were actually perfectly acceptable. Yes, it's a bit of a clunky API due to the whole SetCapture and mouse fencing ridiculousness. But it's the kind of thing that takes a few hours to implement properly, and then it pretty much works.
Or so it was until 2002.
That was the fateful year when Microsoft introduced the mother of all mouse input disasters: Tablet PC.
I am 31 years old, which puts me right at the cusp of programmers who still had to write joystick calibration code. If you're 25 or younger, I'm guessing you never had to write anything like that, and if you're under 20 then maybe you never even used a joystick calibration screen. Hopefully I speak for all those who did write or use joystick calibration when I say we are lucky to live in the new calibration-free world.
Or perhaps I should say have lived, because with Tablet PC, Microsoft has brought it roaring back into style, not for joystick input, but for mouse input.
The problem begins with the interaction of two operating system services: the display driver and the tablet driver. These two systems are coupled because the user's perception of pen input relies on the pen driver knowing where the pen is relative to the display. If the pen driver doesn't know where the display is being drawn, then it certainly won't be able to report pen positions that correspond to where the user physically places the pen on an image.
Since this is the most important aspect of pen computing, you might think that Microsoft would have tested it extensively, or perhaps even thought about testing it extensively. Or at least heard from someone's cousin's uncle that maybe at some point, if they got around to it, it might not be a bad idea to do some testing in this general area.
Wrong, wrong, and wrong.
It turns out Microsoft not only doesn't test this, but as far as I can tell, they didn't even implement it! On the stock Fujitsu TabletPC that I tested, you could open up the display settings, change the display mode, and poof - goodbye calibration. The display driver was set to use centering (not stretching), so the new display mode shows up on a subset of the whole display, but the pen still tracks the whole display. So your physical pen position is now in no way related to where Windows puts the mouse, unless of course you happen to be clicking very close to the center of the screen.
This means that games which switch to full-screen mode (which is basically any AAA title, and even most casual games these days) immediately lose pen positioning, and the user now has to play a little interpolation game just to close the app (or switch it to windowed mode). Did anyone at Microsoft even test a single game on this platform? What were they thinking?
Obviously, an application should never have to be aware of this situation, because the OS should ensure that pen input always lines up with the display. But since Windows completely fails to do this, the app needs to step in... but unfortunately, Windows also doesn't provide the necessary information for you to do said stepping. As far as I'm aware, you can't even determine whether the graphics driver is in centered or stretched mode, because apparently no one at Microsoft ever bothered to standardize this setting so that it could be reported.
As a result, there's only one thing you can do: pop up a calibration screen. 1985, here we come!
But before we can step back into 1985, there's a problem: how do we even know that the user is using a pen in the first place? Obviously we don't want to pop up a calibration screen if they're using a relative positioning device, such as a mouse or trackpad. Well after some digging in the TabletPC documentation, which is some of the worst documentation I've seen from Microsoft, I found the magic incantation:
bool IsFromPen = ((GetMessageExtraInfo() & 0xFF515700) == 0xFF515700);
This code snippet can be used when processing mouse messages to determine whether or not the mouse message came from a Tablet PC pen. Don't ask me to what 0xFF515700 corresponds, because I don't know. And don't ask me why you're supposed to and-then-compare, because I also don't know.
With this code, you can identify when you are getting pen input, and pop up a calibration screen to allow the user to properly calibrate the tablet, even if their display driver is outputting an image which doesn't align with the tablet digitizer calibration. In my game, when I detect pen messages, I ask the user to click on the corners of the screen and I use the resulting positions to do my own positional interpolation from then on.
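The interpolation itself is nothing fancy. Here's a sketch of the idea (all names and sample values are mine, not from any SDK): sample where Windows reports the cursor when the user clicks two opposite corners, then linearly remap every subsequent reported position. This assumes the miscalibration is an axis-aligned offset and scale, which is what a centered-but-unstretched display mode produces; a rotated or sheared digitizer would need a full four-corner fit.

```c
#include <assert.h>

typedef struct { float x, y; } Point;

typedef struct {
    Point rawTopLeft;      /* reported position when the user clicked the
                              top-left corner of the screen */
    Point rawBottomRight;  /* reported position for the bottom-right corner */
    float screenWidth, screenHeight;
} Calibration;

/* Remap a reported pen position to an actual screen position by linear
   interpolation between the two sampled corners. */
static Point Remap(const Calibration *c, Point raw)
{
    Point out;
    out.x = (raw.x - c->rawTopLeft.x) /
            (c->rawBottomRight.x - c->rawTopLeft.x) * c->screenWidth;
    out.y = (raw.y - c->rawTopLeft.y) /
            (c->rawBottomRight.y - c->rawTopLeft.y) * c->screenHeight;
    return out;
}

int main(void)
{
    /* Hypothetical calibration samples for an 800x600 mode: */
    Calibration c = { {100, 50}, {900, 650}, 800, 600 };
    Point p = Remap(&c, (Point){500, 350});
    assert(p.x == 400.0f && p.y == 300.0f);  /* midpoint maps to midpoint */
    return 0;
}
```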
Still, even if you accept the need for calibration (which should absolutely not be necessary), this isn't quite perfect. The reason is that although IsFromPen is true for TabletPC pens, and false for mice, trackpads, and Wacom tablets connected to an ordinary PC, it is also true for Wacom tablets connected to a TabletPC. Since a separate tablet connected to a TabletPC doesn't actually require calibration (it doesn't "correspond" to the display in the same way as a pen that physically touches the display), the calibration screen isn't necessary, but as far as I know, Windows does not provide a way to distinguish between these two cases.
But if you aren't someone who happens to read this post (how sad for you!), there's another bit of fun in store for you when you try to make your game compatible with TabletPC.
In the TabletPC settings, there is an option which Microsoft calls "press and hold". When this setting is true, instead of sending mouse input to your game immediately, it instead buffers the pen button events until it can determine what the user is doing. If the user presses and releases the pen quickly, it sends the appropriate left mouse button events to your window.
But if the user holds the pen longer, it doesn't send any events at all. Instead, it waits to see how long the user is going to hold the pen. If the user holds the pen long enough, Windows draws a ring of red spheres (I'm not making this up) winding around the mouse cursor, and then flashes a little mouse-with-a-red-right-button icon above it and to the left. If the user releases the mouse while the icon is flashing, then it sends right mouse button events to your window.
Now, the best part is that this was actually the default setting on the TabletPC I tested. Even though the pen had two buttons. So it's pretty safe to assume that this mode could be on for any TabletPC user that happens to be running your game, whether or not they know about it or have ever touched the setting themselves.
But what to do? How do you turn this off? Clearly it makes any game unplayable, even ones that don't care about right clicking, because of the extreme delays it causes in mouse message reporting.
Well, to keep a long story long, I spent the better part of four days trying to turn it off. I tried everything. I read through the entire TabletPC SDK, I tried all kinds of sample apps, I made my own app and tried responding to various messages... nothing worked.
Every step of the way, it was excruciatingly painful, because the TabletPC SDK is absolutely horrible. It's the kind of thing where you have to "inherit" from a Microsoft "class" defined in COM and "register" yourself to get callbacks. But of course, it's all so poorly documented, that you don't even know what messages you should receive or when, or which classes get which kinds of messages.
To make matters worse, all the classes from which you can derive actually draw the pen strokes on the screen, and do other things like stroke recognition. So you then have to find the right series of calls to turn all that off.
I even tried Raw Input, and although that does seem to give you some pen data, responding to those messages didn't seem to have any effect on the operation of "press and hold".
Ostensibly, there is a class called "RealTimeStylus" that actually lets you get raw pen data. No other class lets you do this; they only let you get the processed stylus results. The catch? At least on the systems I tested, RealTimeStylus support was not available (perhaps it is Vista-only?). So I still have no idea how you actually get pen pressure and tilt data reliably if you actually want it for your game... but I digress.
At the end of four days of futility, I gave up and did two things. First, I posted a message on the MSDN forums asking "what the fuck", although I did not actually use the word "fuck" in the post because I thought it might trigger an automatic rejection.
Second, I implemented the "nuke-u-lar option". This involved watching the registry to determine what setting actually controls "press and hold". Once I knew the key values, I wrote code to overwrite the value with "off", hunt down the TabletPC mouse message translation process, kill it, and force it to restart. Then when my app exited, I put the old value back, re-killed, and re-started the process.
This "worked like a charm", but it didn't seem like the most friendly thing to ship.
Thankfully, a few days later, to my surprise, someone actually answered my forum post. In it, they referred me to a technical article written in 2003 that showed how to do exactly what I wanted: instead of messing with the TabletPC API at all, you just set a secret global text atom on your window, and poof! TabletPC disables press and hold for your window.
At this point, I thought to myself: how did I miss that? So I pasted the atom name, "MicrosoftTabletPenServiceProperty", into the MSDN search box. Only two hits. One was the article to which I had been referred. The other? The forum thread I had started when I asked the question!
So somehow, in all their COMness, with multiple libraries and hundreds of GUIDs and pages and pages of class documentation, the TabletPC SDK had failed to include a define for, or even a mention of the existence of, this special atom. Nor does it ever explain what "press and hold" is (it would have been really helpful to know that term for searching before I started - I might have been able to find the secret technical article that way).
All told, this was probably the worst Win32 experience I've ever had. TabletPC as a platform was clearly engineered solely with C# programming in mind. In fact, up until the recent Vista platform SDKs, the TabletPC samples were almost entirely in C#. And even in the latest Vista platform SDK, there are a fair number of TabletPC examples that are C#-only.
I sincerely hope that the TabletPC SDK is an anomaly, and it's not the way Win32 SDKs will be in the future. But somehow, I'm not so hopeful.