shion died

After many years of continuous use, shion (not the character from Xenosaga, my Mac Mini) died.

The few times it actually manages to detect its hard drive at boot time, it loses contact with it shortly after loading the kernel. And the drive makes an awful kind of noise, which is a pretty good pointer at what's wrong.

Now, I could probably just replace the hard drive, but the old G4 processor, the 512 MB of RAM and the two lone USB ports forcing me to cascade hub after hub are all good reasons to upgrade the hardware itself.

And thus, Shion 2.0 was born.

I grabbed an unused Mac Mini from the office and tried installing Ubuntu Gutsy on it, which worked well, but Leopard's "Startup Disk" preference pane didn't list the partition I installed Ubuntu on as bootable. Booting Linux by holding Alt during pre-boot worked, but hey, it's a server and I don't have a keyboard ready where shion is going to stand.

So I did it the brute-force way and installed Ubuntu onto the whole drive. It takes the EFI firmware a hell of a long time to give up on the missing original GUID partition scheme and EFI partition, but when it does, it starts GRUB from the MBR, so I'm fine.

This does mean that I will be unable to install later firmware upgrades (due to the lack of a working OS X), but at least it means that I can reboot shion when needed without having to grab a keyboard.

That is, provided Domi will be able to solder me a display adaptor that makes the EFI's BIOS emulation think a display is connected.

All in all, I'm not totally happy with the next generation of shion: no booting without a display attached, long boot times, non-working BIOS updates and, especially, no eSATA. But it's free, so I'll take it. I guess the old shion just chose a terribly inconvenient time to die.

The new iPods

So we have new iPods.

Richard sent me an email asking which model he should buy, which got me thinking about whether to upgrade myself. Especially the new touch-screen model seemed compelling to me – at first.

Still, I was unable to answer that email with a real recommendation (though honestly, I don't think it was as much about getting a recommendation as about letting me know that the models were released and hearing my comments on them), and I still don't really know what to think.

First off: this is a matter of taste, but I hate the new nano design. The screen is still too small to be useful for real video consumption, but it made the device very wide – too wide, I think, to comfortably keep in my trouser pocket while biking (I may be wrong though).

Also, I don't like the rounded corners very much, and the new interface… really… why shrink the menu to half the screen and clutter the rest with some meaningless cover art which only the smallest minority of my files are tagged with?

Cover Flow feels tacked onto the great old interface and loses a lot of its coolness without the touch screen.

The new nanos don't provide any advantage in flash size compared to the older models, and I think the scroll wheel is way too small compared to the large middle button.

All in all, I would never upgrade my second-generation nano to one of the third generation: they provide no advantage, look (much) worse (IMHO) and seem to have a usability problem (too small a scroll wheel).

The iPod classic isn't interesting for me: old-style hard drives are heavy and fragile, and ever since I bought that 4GB nano a long while ago, I have noticed that there is no real reason to have all my music on the device.

I'm using my nano way more often than I ever used my old iPod: the nano is lighter and I began listening to podcasts. And while I lost HD-based iPods about every year and a half to faulty hard drives or hard drive connectors, my nano still works as well as it did on the first day.

Additionally, the iPod classic shares the strange half-screen menu and it's only available in black or white. Nope. Not interesting. At least for me.

The iPod touch is interesting because it has a really interesting user interface. But even there I have my doubts. For one, it's basically an iPhone without the phone. Will I buy an iPhone when (if) it becomes available in Switzerland? If yes, there's no need to buy the iPod touch. If no, there still remains that awful usability problem of touch-screen-only devices:

You can't use them without taking them out of your pocket.

On my nano, I can play and pause the music (or, more often, a podcast), I can adjust the volume and I can always see what's on the screen.

On the touch interface, I have to wake the screen from standby, I can't do anything without looking at the device, and I think it may be a bit bulky all in all.

The touch is the perfect bathtub surfing device. It's the perfect device to surf the web right before or after going to sleep. But it's not portable.

Sure, I can take it with me, but it fails in all aspects of portability: it's bulky, it can't be used without taking it out of your pocket and stopping whatever you are doing, it requires two hands to use (so no changing tracks on the bike any more) and it's totally useless until you manually turn the display back on and unlock it (which also requires two hands).

So: which device should Richard buy? I still don't know. What I do know is that I will not be replacing my second-generation nano as long as it keeps working.

The nano looks awesome, works like a charm and is totally portable. Sure, it can't play video, but next to none of my videos actually meets the requirements of the video functionality anyway, and I don't see myself recoding already-compressed content. That takes an awful lot of time, greatly degrades the quality and generally isn't worth the effort.

Careful when clean-installing TabletPCs

At work, I got my hands on an LS-800 TabletPC by Motion Computing, and after spending a lot of time with it – I'm very interested in TabletPCs anyway – I finally got myself its bigger brother, the LE-1700.

The device is a joy to work with: Relatively small and light, one big display and generally nice to handle.

The tablet came with Windows XP preinstalled and naturally, I wanted to have a look at the new Tablet-centric features in Vista, so I went ahead and upgraded.

Or better: Clean-installed.

The initial XP installation was German and I was installing an English copy of Vista, which makes a clean installation mandatory.

The LE-1700 is one of the few devices without official Vista support, but I guess that's because of the missing software for the integrated UMTS modem – for everything else, drivers either come bundled with Vista, are available on Windows Update, or you can use the XP drivers provided on the Motion Computing support site.

After the clean installation, I noticed that the calibration of the pen was a bit off – depending on the position on the screen, the tablet registered the pen up to 5mm to the left of or above its actual position. Unfortunately, using the calibration utility in the control panel didn't help much.

After some googling, I found out what’s going on:

The end-user-accessible calibration tool only calibrates the screen for the tilt of the pen relative to the current position. The calibration of the pen's position is done by the device manufacturer, and there is no tool available for end users to redo it.

Which, by the way, is understandable considering how the miscalibration showed itself: in the middle of the screen it was perfect, and towards the sides it got worse and worse. This means a tool would have to present quite a lot of points for you to hit to arrive at an accurately working calibration.

Of course, this was a problem for me – especially when I tried out Journal and noticed the error was bad enough to take all the fun out of handwriting (imagine writing on paper and the text appearing 5mm left of where you put the pen).

I needed to get the calibration data and I needed to put it back after the clean installation.

It turns out that the linear calibration data is stored in the registry under HKLM\SYSTEM\CurrentControlSet\Control\TabletPC\LinearityData in the form of a (large) binary blob.

Unfortunately, Motion does not provide a tool or even a .reg file to quickly re-add the data should you clean-install your device, so I had to do the unthinkable (I probably could have called support, but my method had the side effect of not making me wait forever for a fix):

I restored the device to the factory state (using the preinstalled Acronis True Image residing on a hidden partition), exported the registry key, reinstalled Vista (at which point the calibration error resurfaced), imported the .reg file and rebooted.

This solved the problem – the calibration was as smooth as ever.

Now, I’m not sure if the calibration data is valid for the whole series or even defined per device, but here is my calibration data in case you have the same problem as I had.

If the settings are per device, or you have something other than an LE-1700, I strongly advise you to export that registry key before clean-installing.
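For what it's worth, the export and re-import boil down to two `reg.exe` calls from an (elevated) command prompt. This is just a sketch of the procedure described above, assuming the key lives where it does on my LE-1700:

```shell
# Before clean-installing: save the factory pen calibration blob
reg export "HKLM\SYSTEM\CurrentControlSet\Control\TabletPC\LinearityData" linearity.reg

# After the clean install: put it back, then reboot
reg import linearity.reg
```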

Obviously I would have loved to know this beforehand, but… oh well.

Wii in a home cinema

The day before yesterday I was lucky enough to get myself a Wii.

It was, and basically still is, impossible to get one here in Switzerland since the launch on December 8th. So I was very happy to snag the last device of a delivery of about 15 pieces to a game shop near where I work.

Unfortunately, my out-of-the-box experience with the Wii was quite poor, which is why I didn't write the review yesterday – I wanted to spend a bit more time with the console before writing something bad about it.

Here’s my story:

I’m using a projector, a receiver and a big screen – a real home cinema.

This means that the Wii is usually placed quite far away from either the screen or the receiver (and especially from the projector – about 25 meters in my case). It also means I run into big issues with the relatively short cable that is supposed to connect the sensor bar to the Wii.

And the short A/V-cable didn’t help either, so I also couldn’t just place the Wii near the screen because then I wouldn’t be able to connect it to the receiver.

I ended up placing the Wii more or less in the middle of the room and while I like the looks of the console, it still doesn’t fit the clean look of the rest of my home cinema.

It gets worse though: I placed the sensor bar on top of my center speaker, right below the screen. It turned out that this placement was too far below my usual line of sight for the Wiimote to pick up the signal.

So currently, I have the sensor bar sitting on top of an awful-looking brown box right in the middle of my table – a setup I have to rebuild whenever I want to play and put away when I'm done.

I SO want that wireless sensor bar to place it on the top of my screen.

But the not-quite-working goes on: at first, I wasn't able to connect to my WLAN – the Wii just didn't find the network. Flashing the ZyXEL AP with newer firmware helped: the Wii then recognized the network, but was unable to get an IP address.

Due to the awkward placement it was unable to get a strong signal.

I moved the device more to the middle of the room (making it even more visible to the casual eye) and it was finally able to connect.

My first visit to the shopping channel ended with the whole console crashing hard. Not even the power button worked – I had to unplug and replug it, at which point I'd had enough and just played Zelda (a review of that jewel will probably follow).

Yesterday I was luckier with the shopping channel (I didn't buy anything though), and as I had my terrible "sensor bar on a box" configuration already up and running, I got a glimpse of what the Wii out-of-the-box experience could be: smoothly working, good-looking and a very nice control interface – using the Wiimote to point at the screen feels so… natural.

In my opinion, Nintendo made an awful mistake in forcing that cable onto the sensor bar. As we know by now, the bar contains nothing more than two IR LEDs; the cable only powers them. Imagine the sensor bar being another Bluetooth device – mains-powered, or maybe battery-powered (though those IR LEDs suck power like mad). Imagine the console being able to turn it on and off wirelessly.

The whole thing wouldn't have been that much more expensive (alternatively, they could sell it as an add-on), but it would allow the same awesome out-of-the-box experience for all users – even those with a real home entertainment system.

If it weren't Nintendo (I admit that I am a «fanboi» in matters of Nintendo – the conditioning I got with the NES in my childhood still hasn't worn off), I would have been so incredibly pissed off after that first evening that I would have returned the whole console and written one bad review here – even the Xbox 360 worked better than the Wii… *sigh*

And all that to save a couple of hours in the engineering department.

Debugging PocketPCs

Currently I'm working with Windows Mobile-based barcode scanning devices. With .NET 2.0, developing real-world applications for mobile devices in .NET has become a viable alternative.

.NET 2.0 combines sufficient speed at runtime (though you often have to test for possible performance regressions) with a very powerful development library (really usable – as compared to .NET 1.0 on smart devices) and unbeatable development time.

All in all, I’m quite happy with this.

There’s one problem though: The debugger.

When debugging, I have two alternatives and both suck:

  1. Use the debugger to connect to the real hardware. This is actually quite fast and works flawlessly, but whenever I need to forcibly terminate the application (for example when an exception happens, or when I press the Stop button in the debugger), the hardware crashes somewhere in the driver for the barcode scanner.

    Parts of the application stay in memory and are completely unkillable. The screen freezes.

    To get out of this, I have to soft-reset the machine and wait half a century for it to boot up again.

  2. Use the emulator. This has the advantage of not crashing, but it’s so slow.

    From the moment I start the application in VS until its screen is loaded in the emulator, nearly three minutes pass. That slow.

So programming for mobile devices mainly consists of waiting: waiting for reboots or waiting for the emulator. This is wearing me down.

Usually, I change some 10 lines or so and then run the application to test what I've just written. That's how I work, and it works very well because I get immediate feedback, which helps me write code that works in the first place.

Unfortunately, with these prohibitively long startup times, I'm forced to write more and more code in one batch, which means even more time wasted debugging.

*sigh*

ServeRAID – Fun with GUI-Tools

We've recently bought three more drives for our in-house file server. Up until now, we had a RAID 5 array (using an IBM ServeRAID controller) spanning three 33GB drives. That array recently got very, very close to being full.

So today, I wanted to create a second array using the three new 140GB drives.

When you download the ServeRAID support CD image, you get access to a nice GUI tool written in Java that can be used to create arrays on these ServeRAID controllers.

Unfortunately, I wasn't able to run the GUI at first because somehow the Apple X11 server wasn't willing/able to display it correctly. I always got empty windows when I tried (the server is headless, so I had to use X11 forwarding via ssh).

Using a Windows machine with Xming (which is very fast, works perfectly and is totally free as in speech) worked though, and I got the GUI running.
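For reference, the forwarding setup itself is just plain ssh. A sketch – the hostname is made up, and the path to the ServeRAID Manager launcher is from memory, so check your support CD:

```shell
# From the X-capable client (Xming on Windows, or any Linux desktop):
# -X enables X11 forwarding; try -Y if the Java GUI comes up empty.
ssh -X root@fileserver

# Then, on the server, start the ServeRAID Manager GUI
# (launcher path is an assumption – it may differ on your CD image):
/opt/RaidMan/RaidMan.sh
```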

All three drives were recognized, but one was listed as "Standby" and could not be used for anything. Additionally, I wasn't able to find any way in the GUI to move the device from Standby to Ready.

Even removing and shuffling the drives around didn't help. That last drive was always recognized as "Standby", independent of the bay I plugged it into.

Checking the feature list of the controller showed nothing special – at first I feared the controller simply didn't support more than 5 drives. That fear was unfounded though: the controller supports up to 32 devices – more than enough for the server's 6 drive bays.

Then, looking around on the internet, I didn't find a solution for my specific problem, but I found out about a tool called "ipssend", and there was documentation on how to use it in an old IBM manual.

Unfortunately, newer CD images don't contain ipssend any more, forcing you to use the GUI, which in this case didn't work for me. It may be that there's a knob to turn somewhere, but I just failed to see it.

In the end, I found a very, very old archive on the IBM website called dumplog, which contained that ipssend command in a handy little .tgz archive. Very useful.

Using that utility solved the problem for me:

# ./ipssend setstate 1 1 5 RDY

No further questions asked.
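For reference, my reading of the arguments (from that old IBM manual, so double-check against your version) is controller number, channel, SCSI ID and target state:

```shell
# List the physical devices on controller 1 to find the "Standby" drive
# (subcommand name per the old manual; your ipssend build may differ)
./ipssend getconfig 1 pd

# Flip the drive on controller 1, channel 1, SCSI ID 5 to Ready
./ipssend setstate 1 1 5 RDY
```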

Then I used the Java GUI to actually create the second array.

Now I’m asking myself a few questions:

  • Why is the state “Standby” not documented anywhere (this is different from a drive in Ready state configured as Standby drive)?
  • Why is there no obvious way to de-standby a drive with the GUI?
  • Why is that cool little ipssend utility no longer officially available?
  • Why is everyone complaining that the command line is more complicated to use and that GUIs are so much better when, obviously, the opposite is true?

Upgrading the home entertainment system

The day when I will finally move into my new flat is coming closer and closer (expect some pictures as soon as the people currently living there have moved out).

Besides thinking about outdated and yet necessary stuff like furniture, I’m also thinking about my home entertainment solution which currently mostly consists of a Windows MCE computer (terra) and my GameCube (to be replaced with a Wii for sure).

The first task was to create distance.

Distance between the video source and the projector. Currently, that's handled simply by connecting the MCE to the projector via VGA (I'd prefer DVI, but the DVI output is taken by my 23″ Cinema Display) and the GC, the PS2 and the Xbox 360 via composite to my receiver, and the receiver via composite to the projector.

The distance between the projector and the receiver/MCE is currently about three meters tops, so no challenge there.

With a larger flat and a ceiling mounted projector, interesting problems arise distance-wise though: I’m going to need at least 20 meters of signal cable between receiver and projector – more than what VGA, DVI or even HDMI are specified for.

My solution in that department is the HDMI CAT-5 Extreme by Gefen: a device that sends HDMI signals over two ordinary ethernet cables (shielded preferred), reaching up to 60 meters.

Additionally, CAT-5 cables are lighter, easier to bend and much easier to hide than HDMI or even DVI cables.

Now, terra only has DVI and VGA out. This is a minor problem though, as HDMI is basically DVI plus audio, so it's very easy to convert a DVI signal into an HDMI one – it's just a matter of connecting pins on one side to pins on the other; no electronics needed.

So with the HDMI CAT-5 Extreme and a DVI-to-HDMI adaptor, I can connect terra to the projector. All well, with one little problem: I can't easily connect the GameCube or the other consoles any more. Connecting them directly to the projector is no option, as it's ceiling-mounted.

Connecting them to my existing receiver isn't a solution either, as it doesn't support HDMI, putting me back into the same distance problem yet again.

While I could probably use a very good component cable to transport the signal over (it’s after all an analog signal), it would mean I have three cables going from the receiver/MCE combo to the projector: Two for the HDMI extender and one big fat component cable.

Three cables to hide and a solution at the end of its life span anyways? Not with me! Not considering I’m moving into the flat of my dreams.

It looks like I’m going to need a new receiver.

After looking around a bit, it looks like the DENON AVR-4306 is the solution for me.

It can upconvert any analog signal to HDMI (and is said to do so in excellent quality) at resolutions of up to 1080i, which is more than enough for my projector.

It's also said to provide excellent sound quality, and – to my geek heart's delight – it's completely remote-controllable over a telnet interface via its built-in ethernet port, even bidirectionally: the documented protocol emits events on the line when operating conditions change, for example when the user changes the volume on the device.

This way, I can have all sources connected to the receiver and the receiver itself connected to the projector over the CAT-5 Extreme. Problem solved – and considering how many input sources and formats the Denon supports, it's even quite future-proof.
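To give an idea of what that telnet control looks like: commands are short ASCII strings terminated with a carriage return, sent to port 23, and the receiver pushes events back over the same connection. A sketch – the hostname is made up, and while `PWON` and `MV` appear in Denon's protocol document, I haven't tried this against a 4306 myself:

```shell
# Power the receiver on, then set the master volume to 50
printf 'PWON\r' | nc avr-4306 23
printf 'MV50\r' | nc avr-4306 23

# Keep a connection open to watch events: turning the volume knob on the
# device itself should make lines like "MV51" appear here
nc avr-4306 23
```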

I've already ordered the HDMI extender, and I'm certainly going to have a long, deep look into that Denon. I'm not ready to order just yet though: it's not exactly cheap, and while I'm quite certain I'll eventually buy it, the price may drop a little before November 15th, when I'm (hopefully) moving into my new home.

Computers under my command (4): yuna

Yuna was the lead girl in Final Fantasy X, the first episode of the series released for the PlayStation 2.

Now, I know I'm alone with this opinion, but FFX was a big disappointment for me: obvious character backgrounds, unimpressive story, stupid mini-games, no world map, much too short. No, I didn't like FFX.

But this doesn't change the fact that I played through the game and was seriously impressed by how good the thing looked. Yes, the graphics were great – unfortunately that's about the only positive thing I can say about the game.

And this is why I’m getting straight to the computer behind the name:

I called my MacBook Pro “yuna”.

My MacBook Pro is the machine at work that has impressed me the most so far: fast, good-looking, long battery life… and… running Mac OS X.

Yuna did what was completely unthinkable for me not much more than 5 years ago: it converted me to using Mac OS X as my main OS. It's not a secondary OS. It's no dual boot (especially since I stopped playing WoW). It's no "Mac OS is nice, but I'm still more productive in Windows". It's no "sometimes I miss Windows" and no "hmm… this would work better in Windows".

No. It’s a full-blown remorseless conversion.

Granted: some things DO work better in Windows (patched emulators for use in time-attack videos come to mind), but my point is: I don't miss them.

The slickness and polish of the OS X interface, especially the font rendering (I admit I put way too much emphasis on fonts when choosing my platform, but fonts are, after all, the most important interface between you and the machine), and the Unix backend make me wonder: how could I ever work without OS X?

It's funny: for some time now, I had been thinking about converting.

But what really made me do it was knowing that there's a safety net: I still have that Windows partition on this Intel Mac. And I do have Parallels (which is much faster than Virtual PC), which I use for Delphi and lately Visual Studio.

Everyone who keeps saying that Apple's switch to Intel will decrease their market share even further had better shut up. Now. Once you have such a machine, once you see the slickness of the interface, once you notice how quickly you can be productive in the new environment – once that happens, you'll see that there's no need, no need at all, to keep using Windows.

So: a wonderful machine bearing the name of an (admittedly) good-looking girl (with a crappy background story) from a crappy game. Too bad Marle and Terra weren't free any more.

Computers under my command (3): terra

Final Fantasy VI (known as Final Fantasy III in the US) begins with two guys and Terra using mech-like devices to raid a town, with the objective of finding a creature referred to as an Esper.

You soon learn that Terra is in fact wearing a device inhibiting her free will, and that she would never do something like that of her own free will – quite the contrary.

When the three people find that Esper, the two soldiers die at its hands, but Terra survives.

The rest of the game revolves around her, the balance between magic and technology, love and humanity.

Terra is the main character in what I think is the best Final Fantasy ever made, probably because it's in some ways similar to Chrono Trigger (the second-best RPG ever made): well-thought-out characters, very free progression in the game (once the world ends, about halfway into the game), nice graphics and one hell of a story.

What really burns this game, and especially Terra, into your mind though is her theme song. Even on the SNES it sounds really nice, and in the end it's what got me really interested in game soundtracks.

Also, I've blogged about a remix of that theme song. You should probably go ahead and listen to it to understand why FF6 is special to me and to so many others.

Even after not having played FF6 for more than two years now, I still can’t get that theme song out of my head.

The computer terra is a fanless media center PC by Hush Technologies. It runs Windows XP Media Center Edition and is connected to my video projector.

I'm also using it to manage my iPod (with a script that rsyncs the music over to shion, both for backup and for SlimServer access) and sometimes to play World of Warcraft.
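The sync script is nothing fancy – in essence it's a one-liner along these lines. This is only a sketch: the paths are placeholders, and I'm assuming rsync is available on terra (e.g. via Cygwin):

```shell
#!/bin/sh
# Mirror the iTunes library from terra over to shion.
# --delete keeps shion's copy an exact mirror; drop it to just accumulate.
rsync -av --delete \
    "/cygdrive/c/music/iTunes/" \
    shion:/data/music/itunes/
```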

Even though the machine is quite silent, I can't have it running overnight, so its uptime isn't that impressive: it's right next to my bed, and the sound of the spinning hard disk and the blue power LED both keep me from sleeping.

Ever since I started using the machine, I've had small glitches with it: crashes after updating the graphics driver (fixed via System Restore), segmentation faults in the driver for the IR receiver – basically the stuff you get used to when running Windows.

I'm not complaining though: even though the installation is fragile, my home entertainment concept depends on terra, and usually it works just fine.

And after all, the original Terra was kind of fragile too (mentally and physically), so it’s just fitting that the same applies to the computer named after her.

PS: Sorry for the bad picture quality, but I only found Terra on a black background, so I had to manually expose her. Problem is: I know as much about graphics software as a graphics designer knows about programming in C. Anyway, it turned out acceptable IMHO.

Computers under my command (2): marle

While everyone keeps calling her Marle, she is actually Princess Nadia of the Kingdom of Guardia in what many people call the best console RPG ever made: Chrono Trigger.

Chrono Trigger was one of the last RPGs Squaresoft made for the SNES, and it's special in many ways: excellent music (by Yasunori Mitsuda), excellent graphics, smooth gameplay, a really nice story and excellently done characters.

Robo, Frog, Lucca, Marle, Crono, Magus and Ayla – every one of them has their very own style and story. Aside from Crono, who is quite the ordinary guy, each of them is special in their own way.

The server marle is special in its own way too.

It's not as outstanding as shion, but it was the first 64-bit machine running a 64-bit OS I ever deployed.

The OS was Gentoo Linux (as usual) and the machine itself is an IBM xSeries box equipped with a 3GHz Xeon processor and 2GB of RAM – basically nothing you need 64 bits for.

It still was an interesting experiment to get the machine working with a 64-bit OS, though it all went completely uneventfully.

Ever since it was deployed, marle has been running at a customer's site without crashes or other problems.

marle ~ # uptime
     11:56:13 up 265 days, 44 min,  2 users,  load average: 0.00, 0.01, 0.00

Not much happening there currently, I guess. Also, it's amazing how quickly time passes – installing that machine feels like it was only yesterday.