Correlation between gnegg.ch and WoW

If you take a look at the archive (a feature I’ve actually only discovered just now), you’ll notice quite an interesting distribution of posts here on gnegg.ch

2002 was where it all started. November was still a bit slow, but in December I really got into blogging, only to let it slip a bit during 2003.

In 2004, I began subscribing to tons of RSS feeds, which provided me with a lot of input for my own articles. You’ll notice a significant increase in posts throughout the whole year.

Then, in 2005, my WoW time began. My first WoW-related posting is from February 21st, 2005 and refers back to when I bought WoW, which would be – provided I’m calculating correctly – February 15th, 2005.

Going back to the archive, you’ll immediately notice something happening to the post count: it’s steadily going down. From a good 9 entries in January (pre-WoW) down to one entry in October, which is more or less when I got my first character to level 60. In November I hit my first bout of being fed up with WoW, which lasted until January 2006 (post count coming up again – despite Christmas and all, which was keeping me away from computers).

Then, in January, I was playing again, getting closer to 60 with my second character in February (just one posting).

March was WoW-less again because I felt I had nothing left to do in the game.

In mid-April, I began playing again and started my third character… (posts going down) – which I got to 60 by the end of May.

June was spent playing at 60, and before the end of the month, I began feeling fed up with WoW. And burned out. I clearly felt I had wasted way too much of my life. And I felt like I was truly addicted to WoW. So I pulled the emergency brake and stopped playing.

As you can see, I was back to 16 posts in July, which was also due to my “Computers under my command” series – easy to do because the topics were clear in advance.

August is interesting. Have a look at the month calendar and guess when I took my lv60 character out again!

More or less regular postings here until August 10th. Then nothing.

September is better again because I put WoW into the deep-freeze once more – especially after having seen what WoW does to my other hobbies. gnegg.ch is a very nice indicator in that regard.

So I’m coming to the same conclusion as Adam Betts, who also stopped playing WoW after noticing how severely it was affecting his real life.

World of Warcraft is highly addictive and I know of no one who could claim not to be affected by it. Once you start to play, you play. Even worse: even if you think you’ve put it behind you and can control it, it just takes over again.

So for me it’s clear what I have to do: I will stop playing. For real this time. No taking out my character again. No-more-playing. I won’t delete my characters as they are the result of a lot of work, but I will cancel my subscription.

I’m really grateful for the archive function of gnegg.ch: it was a perfectly clear indicator of my addiction, and it remains a great way to keep me from going back – if I do, everyone will know because the post count will drop again.

SQLite, Windows Mobile 2005, Performance

As you know from previous posts, I’m working with SQLite on mobile devices, which lately means Windows Mobile 2005 (there was a Linux device before that, though, but it was hit by the RoHS regulation of the European Union).

In previous experiments with the older generation of devices (Windows CE 4.x / PocketPC 2003), I was surprised by the high performance SQLite is able to achieve, even with complex queries. But this time, something felt strange: searching for a string in a table was very, very slow.

The problem is that CE5 (and with it Windows Mobile 2005) uses non-volatile flash for storage. This has the tremendous advantage that the devices don’t lose their data when the battery runs out.

But compared to DRAM, Flash is slow. Very slow. Totally slow.

SQLite doesn’t load the complete database into RAM; it only loads small chunks of the data. This in turn means that when you have to do a sequential table scan (which you have to do when you have a LIKE '%term%' condition), you are more or less dependent on the speed of the storage device.

This is what caused SQLite to be slow when searching. It also caused synchronizing data to be slow, because SQLite writes data out into its journal file during transactions.

The fix was to trade off launch speed (the application is nearly never started fresh) for operating speed by loading the data into an in-memory table and using that for all operations.

attach ':memory:' as mem;

create table mem.prod as select * from prod;

Later on, the trick was to just refer to mem.prod instead of just prod.

Of course you’ll have to take extra precautions when you store the data back to the file, but as SQLite supports transactions, most of the time you get away with

begin transaction;

delete from prod;

insert into prod select * from mem.prod;

commit;

So even if something goes wrong, you still have the state of the data from the time it was loaded (which is perfectly fine for my usage scenario).

So in conclusion some hints about SQLite on a Windows Mobile 2005 device:

  • It works like a charm
  • It’s very fast if it can use indexes
  • It’s terribly slow if it has to scan a table
  • You can fix that limitation by loading the data into memory (you can even do it on a per-table basis – see the sketch below)
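
For completeness, here’s a minimal sketch of how that startup step might look from managed code. It assumes an ADO.NET-style SQLite wrapper providing IDbConnection/IDbCommand – the method name is mine and the actual provider classes you use to open the connection depend on your wrapper:

using System.Data;

// 'connection' is an open IDbConnection pointing at the on-flash database file.
// Which SQLite wrapper provides it is up to you – that part is an assumption here.
void LoadProductsIntoMemory(IDbConnection connection)
{
    using (IDbCommand cmd = connection.CreateCommand())
    {
        // attach an empty in-memory database...
        cmd.CommandText = "attach ':memory:' as mem;";
        cmd.ExecuteNonQuery();

        // ...and copy the product table into it once at startup
        cmd.CommandText = "create table mem.prod as select * from prod;";
        cmd.ExecuteNonQuery();
    }
}

From then on, all queries go against mem.prod; only the write-back described above touches the file again.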

Word 2007 – So much wasted energy

Today, I came across a screencast showing how to quickly format a document using the all-new Word 2007 – part of Office 2007 (don’t forget to also read the associated blog post).

If you have any idea how Word works and how to actually use it, you will be as impressed as the presenter (and, admittedly, I) was: apply some styles, choose a theme and be done with it.

Operations that took ages to get right are now done in a minute, and it’ll be very easy to create good-looking documents.

Too bad it looks entirely different in practice.

When I watch my parents or even my coworkers use Word, all I see is styles being avoided. Heading 1? Just use the formatting toolbar to make the font bigger and bold.

Increase spacing between paragraphs? Hit return twice.

Add empty spacing after a heading (which isn’t even one from Word’s point of view)? Hit return twice.

Indent text? Hit tab (or even space as seen in my mother’s documents).

This is also the reason why those people never seem to have problems with Word: the formatting toolbar works perfectly fine – the bugs lie in the “advanced” features like assigning styles.

Now the problem is that all the features shown in that screencast are totally dependent on the styles being set correctly.

If you take the document shown as it is before any styling is applied and then use the theme function to theme it, nothing will happen, as Word doesn’t have any semantic data about your document. What’s a heading? What’s a subtitle? It’s all plain text.

Conversely, if you style your document the “traditional” way (using the formatting toolbar) and then try to apply the theme, nothing will happen either as the semantic information is still missing.

This is exactly why WYSIWYG looks like a nice gimmick at first glance, but more or less makes any further automated work on the document impossible.

You can try to hack around this, of course – look for patterns in the user’s formatting and guess the right styles. But this can lead to even bigger confusion later on, as wrong guesses will in the end make the theming work inconsistently.

Without actual semantic analysis of the text (which currently is impossible to do), you will never be able to accurately use features like theming – unless the user provides the semantic information by using styles, which in turn defeats the purpose of WYSIWYG.

So, while I really like that new theming feature of Office 2007, I fear that for the majority of people it will be completely useless, as it plain won’t work.

Besides, themes are clearly made for the end user at home – in a corporate environment you will have to create documents according to the corporate design, which probably won’t be based on a pre-built style in Office.

And end users are the people least able to understand how assigning styles to content works.

And once people “get” how to work with styles and the themes begin to work, we’ll be back at square one, where everyone and their friends use the same theme because it’s the only one that looks more or less acceptable – defeating whatever originality the themes had in the first place.

Upgrading the home entertainment system

The day when I will finally move into my new flat is coming closer and closer (expect some pictures as soon as the people currently living there have moved out).

Besides thinking about outdated and yet necessary stuff like furniture, I’m also thinking about my home entertainment solution which currently mostly consists of a Windows MCE computer (terra) and my GameCube (to be replaced with a Wii for sure).

The first task was to create distance.

Distance between the video sources and the projector. Currently, that’s handled simply by having the MCE connected to the projector via VGA (I’d prefer DVI, but the DVI output is taken by my 23″ Cinema Display), and the GC, the PS2 and the Xbox 360 connected via composite to my receiver, with the receiver connected via composite to the projector.

The distance between the projector and the receiver/MCE is currently about three meters tops, so no challenge there.

With a larger flat and a ceiling mounted projector, interesting problems arise distance-wise though: I’m going to need at least 20 meters of signal cable between receiver and projector – more than what VGA, DVI or even HDMI are specified for.

My solution in that department was the HDMI CAT-5 Extreme by Gefen. It’s a device which allows sending HDMI signals over two normal Ethernet cables (shielded preferred), reaching distances of up to 60 meters.

Additionally, CAT-5 cables are lighter, easier to bend and much easier to hide than HDMI or even DVI cables.

Now, terra only has a DVI and a VGA out. This is a minor problem though, as HDMI is basically DVI plus audio, so it’s very easy to convert a DVI signal into an HDMI one – it’s just a matter of connecting pins on one side with pins on the other side – no electronics needed there.

So with the HDMI CAT-5 Extreme and a DVI2HDMI adaptor, I can connect terra to the projector. All well, with one little problem: I can’t easily connect the GameCube or the other consoles any more. Connecting them directly to the projector is not an option, as it’s ceiling mounted.

Connecting them to my existing receiver isn’t a solution either, as it doesn’t support HDMI, which puts me back into the distance problem yet again.

While I could probably use a very good component cable to transport the signal (it is, after all, an analog signal), it would mean three cables going from the receiver/MCE combo to the projector: two for the HDMI extender and one big fat component cable.

Three cables to hide and a solution at the end of its life span anyways? Not with me! Not considering I’m moving into the flat of my dreams.

It looks like I’m going to need a new receiver.

After looking around a bit, it looks like the DENON AVR-4306 is the solution for me.

It can upconvert any analog signal to HDMI (and is said to do so in excellent quality) at resolutions of up to 1080i, which is more than enough for my projector.

It’s also said to provide excellent sound quality and – to my geek heart’s delight – it’s completely remote-controllable over a telnet interface via its built-in Ethernet port. It’s even bidirectional: the documented protocol sends events down the line whenever operating conditions change, for example when the user changes the volume on the device.
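
Just to illustrate how simple that kind of control becomes, here’s a rough C# sketch of talking to the receiver over its Ethernet port. The host name and the command strings ("PWON", "MV40") are placeholders guessed from how such protocols typically look – check the actual Denon documentation before relying on them:

using System;
using System.IO;
using System.Net.Sockets;

TcpClient client = new TcpClient("denon-receiver", 23); // host name is a placeholder
using (NetworkStream stream = client.GetStream())
using (StreamWriter writer = new StreamWriter(stream))
using (StreamReader reader = new StreamReader(stream))
{
    writer.NewLine = "\r";     // assuming CR-terminated commands
    writer.AutoFlush = true;

    writer.WriteLine("PWON");  // power the receiver on
    writer.WriteLine("MV40");  // set the master volume

    // the receiver pushes events (volume changes, input switches, ...)
    // back over the same connection, one per line
    string statusEvent = reader.ReadLine();
    Console.WriteLine(statusEvent);
}
client.Close();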

This way, I can have all sources connected to the receiver and the receiver itself connected to the projector over the CAT-5 Extreme. Problems solved – and considering how many input sources and formats the Denon supports, it’s even quite future-proof.

I’ve already ordered the HDMI extender and I’m certainly going to have a long, deep look into that Denon thing. I’m not ready to order just yet though: it’s not exactly cheap, and while I’m quite certain to eventually buy it, the price may well drop a little before November 15th, when I’m (hopefully) moving into my new home.

Windows Vista, Networking, Timeouts

Today I went ahead and installed the RC2 of Windows Vista on my media center computer.

The main reason was that the existing installation was thoroughly messed up (as most of my Windows installations get over time, thanks to my experimenting around with stuff) and the recovery CD provided by Hush was unable to actually recover the system.

The hard drive is connected to an on-board SATA RAID controller which the XP setup does not recognize. Usually, you just put the driver on a floppy and use the setup’s ability to load drivers during installation, but that’s a bit hard without a floppy drive anywhere.

Vista, I hoped, would recognize the RAID controller and I read a lot of good things about RC2, so I thought I should give it a go.

The installation went flawlessly, though it took quite some time.

Unfortunately, surfing the web didn’t actually work.

I could connect to some sites, but on many others, I just got a timeout. telnet site.com 80 wasn’t able to establish a connection.

This particular problem was in my Marvell Yukon chipset-based network adapter: it seems to miscalculate TCP packet checksums here and there, and Vista actually uses the hardware’s capability to calculate the checksums.

To fix it, I had to open the advanced properties of the network card, select “TCP Checksum Offload (IPv4)” and set it to “Disabled”.

Insta-Fix!

And now I’m going to go ahead and actually start reviewing the thing.

lighttpd, .NET, HttpWebRequest

Yesterday, when I deployed the server for my PocketPC application to an environment running lighttpd and PHP with the FastCGI SAPI, I found out that the communication between the device and the server didn’t work.

All I got on the client was an exception because the server sent back error 417: Expectation Failed.

Of course there was nothing in lighttpd’s error log, which made this a job for Wireshark (formerly Ethereal).

The response from the server had no body explaining the problem, but in the request header, something interesting was going on:

Expect: 100-continue

Additionally, the request body was empty.

It looks like HttpWebRequest, with the help of the compact framework’s ServicePointManager, is doing something really intelligent which lighttpd doesn’t support:

By first sending the POST request with an empty body and that Expect: 100-continue header, HttpWebRequest basically gives the server a chance to do some checks based on the request headers (is the client authorized to access the URL? is there a resource available at that URL?) without the client having to transmit the whole request body first (which can be quite big).

The idea is that the server does the checks based on the headers and then either sends an error response (like 401, 403 or 404) or advises the client to go ahead and send the request body (status 100).

Lighttpd doesn’t support this, so it sends that 417 error back.

The fix is to set the Expect100Continue property of System.Net.ServicePointManager to false before getting an HttpWebRequest instance.
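
In code, the whole fix boils down to one line executed before the request is created (the URL is obviously just a placeholder):

using System.Net;

// disable the Expect: 100-continue handshake for all subsequent requests
ServicePointManager.Expect100Continue = false;

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/sync");
request.Method = "POST";
// ... write the request body and send it as usual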

That way, the .NET Framework goes back to plain old POST and sends the complete request body.

In my case that’s no big disadvantage, because if the server is actually reachable, the requested URL is guaranteed to be there and ready to accept the data on the HTTP level (of course there may be some errors on the application level, but there has to be a request body for them to be detected).

.NET CF, Windows CE and Fullscreen

Assume you are creating an application for the .NET Compact Framework, and further assume that the application is designed to be the only one running on the target device, because the whole device is defined by your application.

Also, you don’t want the end-users to tamper with the device.

This is why you sometimes want to put your application in a full-screen mode, hiding all other UI elements on the screen. Of course, to prevent tampering, you’d have to take additional measures, but that’s another topic.

The application I’m currently working on is written for the .NET Compact Framework, so the explanations below apply to that environment.

Putting your application into full screen on the PocketPC is easy: set your form’s FormBorderStyle to None and its WindowState to Maximized. That will do the trick.
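
In code (inside the form’s constructor, for example), that’s all it takes:

// PocketPC: borderless plus maximized already covers the whole screen
this.FormBorderStyle = FormBorderStyle.None;
this.WindowState = FormWindowState.Maximized;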

On Windows CE (PocketPC is basically a special UI library and application collection running on top of Windows CE), there’s a bit more work to do.

First of all, you have to remove the taskbar, which is accomplished by a few P/Invoke calls (FindWindow is needed below as well, so it’s declared here too):

[DllImport("coredll.dll", CharSet=CharSet.Auto)]
public static extern bool ShowWindow(int hwnd, int nCmdShow);

[DllImport("coredll.dll", CharSet = CharSet.Auto)]
public static extern bool EnableWindow(int hwnd, bool enabled);

Then, in your main form’s constructor, do the magic:

int h = FindWindow("HHTaskBar", "");  // find the CE taskbar window
ShowWindow(h, 0);                     // 0 = SW_HIDE
EnableWindow(h, false);               // keep it from popping back up on taps

And don’t forget to turn the task bar on again when your application exits.

int h = FindWindow("HHTaskBar", "");
ShowWindow(h, 5);                     // 5 = SW_SHOW
EnableWindow(h, true);

There’s one important additional thing to do though:

WindowState = Maximized won’t work!

Well, it will work, but it will resize your form in a way that leaves empty space at the bottom of the screen where the taskbar was. You will have to manually resize the form using something like this:

this.Height = Screen.PrimaryScreen.Bounds.Height;
this.Width = Screen.PrimaryScreen.Bounds.Width;

That last bit hit me hard today :-)

On a side note: there’s also the SHFullScreen API call, which allows your application to position itself on top of the taskbar. This is basically the official way to go, but aygshell.dll, the DLL that implements the function, is not available on all CE configurations.

XmlReader – I love thee

Lately, I have been working with the .NET framework. Well. It was the compact framework actually. I’m currently writing software for one of these advanced barcode scanners which run Windows Mobile.

The one thing I want to talk about is XmlReader. You see, one of these devices actually has a built-in GPRS unit, so it lends itself to being a really nice mobile client.

With mobility comes synchronization and synchronization is something PopScan can do quite well. The protocol is XML based, so I need to parse XML on the device.

It gets even more interesting, though: the server usually bzip2-compresses the XML data while sending it out. The XML stream compresses very well, so that’s a good thing to do – even more so as the device communicates over a volume-taxed GPRS connection.

The naïve approach to this situation is to do this:

  1. Read data from server to the memory
  2. Decompress the data in-memory
  3. Use a DOM-Parser to build a DOM-Tree
  4. Iterate over the tree and handle the article data

This approach, of course, is completely unworkable. For one, you waste memory by storing the data multiple times in different forms. Then you build a DOM tree, which is pointless as it’s more or less flat data anyway. And finally, you wait for the download and then for the decompression before you can even begin parsing. So it’s slow.

The way to go is to read data from the network, decompress it as it arrives, feed it into a stream-based XML parser and work with its output.

That way, you only need some memory for the buffers in the decompression engine and the XML parser. And you don’t wait: as you receive data from the server, you can start decompressing and parsing it.

I’ve done this before, in Delphi. Receiving data from WinInet, feeding it through a bzip2 decompressor and finally parsing it with expat was truly hard work: pointers here, malloc there, and that awful event-based interface of expat making it very difficult to track state.

And now I had to do it again, this time with C#.

Wow! This was easy.

First, there’s the nice Stream interface using a decorator pattern: you can wrap streams into each other and then just read from the “outermost” stream.

This means that I can wrap a bzip2-decompression stream around the HTTP-Response stream and make the XML parser read from the decompression stream which in turn reads from the HTTP-response stream.

And then you have the XmlReader interface.

Parsing XML is done in a while loop by calling the object’s Read() method, which returns whenever it encounters a start or end element in the stream. This makes tracking state much easier and helps keep your code clean.
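
Put together, the whole pipeline looks roughly like this. The bzip2 stream class is whatever your decompression library provides (SharpZipLib’s BZip2InputStream in this sketch – an assumption, since the actual library isn’t named here), and the URL and element names are made up:

using System.IO;
using System.Net;
using System.Xml;
using ICSharpCode.SharpZipLib.BZip2;

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/export.xml.bz2");

using (WebResponse response = request.GetResponse())
using (Stream httpStream = response.GetResponseStream())
using (Stream bzip2Stream = new BZip2InputStream(httpStream))   // decompress as the data arrives
using (XmlReader reader = XmlReader.Create(bzip2Stream))        // parse while decompressing
{
    while (reader.Read())
    {
        if (reader.NodeType == XmlNodeType.Element && reader.Name == "article")
        {
            // handle one record as soon as it has been parsed
            string id = reader.GetAttribute("id");
        }
    }
}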

All in all, I can’t believe how easy it was to write that parser.

This shows that some real thought went into the design of the .NET framework, and I’m really looking forward to finding even more nice surprises like this one.

Where have I been?

Long time no see. Where did yours truly go? Back to World of Warcraft (which was the reason for the lack of postings during 05)? Or something even worse?

I’m pleased to say that the WoW times are more or less over. Granted, I still log on to the game now and then, but the pleasure I was getting out of playing is more or less gone.

There are more fun things to do than playing WoW and I’m currently enjoying them. WoW has finally returned to being standard evening leisure – just one of many ways to waste my time.

But back to the reason for my absence:

Since April this year I have known that I will be moving into my very own flat. Back in April, that date was far off, with lots of things still needing to be done – things I didn’t yet bother about back then.

But now, November 1st is getting closer and closer by the day. And stuff still needs to be done.

And this is precisely why I somewhat lack the time to blog.

Writing an entry here on gnegg.ch consists of many tasks. First there’s inspiration: I browse the web, go through my day at work or just talk to colleagues. Sooner or later something happens that I want to write about.

Then, I think about the subject and try to serialize my thoughts to create an entry that’s (hopefully) interesting to read.

And then I sit down and write the thing. This is the task that actually takes the least amount of my time (inspiration is the hardest part for me – often, I think the subjects are too obvious or too uninteresting to blog about).

The final thing is the proofreading – a task I’m not really good at.

So an average entry here takes about two to four hours – time I’d currently rather use for planning where to put existing furniture, where to buy new furniture (and where to put it, of course), whom to hire to install a new bathtub and so on.

This is a big thing for me. When I moved into my current flat back in 2001, it was more or less about getting away from my parents (don’t get me wrong: I love my parents). I moved into more or less the first available flat – also because it was hard as hell to get one in Zürich back then. So I took the opportunity.

Now it’s different. For one, this is my flat. Yes. I bought it. It’s mine. Then it’s more than three times as big as my current one. And it’s beautiful. Just filling it with my current furniture doesn’t give it the credit it deserves.

So, this is what’s keeping me absorbed.

Still, work is currently very, very interesting and I have lots of interesting stuff in the pipeline to write about (so inspiration is there), and I’m looking forward to posting these entries. Today and in the near future.