the new (2013) MacPro

Like many others, I couldn’t wait for Apple to finally upgrade their MacPro and like many others, when they could finally be ordered, I queued up to get mine.

Last Monday, after two months of waiting, the package finally arrived and I could start playing with it. I have to say: the thing is very impressive.

The hardware itself is very lightweight and compact. Compared to the old aluminium MacPro it was replacing, it felt even smaller than it is. Also, the box is nearly silent – so silent in fact, that now the hum of the dimmed background light in my old 30” Cinema Display is louder than the machine itself.

Speaking of that 30” display: It’s using a dual-link DVI port. That means a special adapter is required to connect it to the new Thunderbolt ports – at least if you want to use a higher resolution than 1280×800 (which you definitely do).

The adapter is kinda difficult to get, especially as I had totally forgotten about it and really wanted to migrate to the new machine right away, so I had to resort to local retail (where only the one from Apple was even remotely available) as opposed to Amazon (three other models available, some of them cheaper).

The device is huge by the way. I’m sure there’s some electronics in there (especially when you consider that you have to plug it into a USB port for power), probably to split the full 2560×1600 pixels sent over Thunderbolt into two images of 1280×800, only to be reassembled in the display I guess.

The fact that there obviously is processing going on leaves a bit of a bad taste as it’s one more component that could now break and, of course, there might be display lag or quality degradation.

At some point there apparently was, if the adapter’s reviews are to be believed, but so far I haven’t been able to notice any quality degradation or lag. Still, the fact that there’s now one more active component involved in bringing me a picture makes me just a tad nervous.

Anyways – let’s talk about some more pleasant things.

One is the WiFi: with the old MacPro I had a peak transfer rate of about 3 MBytes/s, which was just barely good enough for me not to want to go through the trouble of laying cable, even though it really pissed me off at times.

On the new Pro, I reached 18 MBytes/s over the exact same WiFi yesterday, which removes any need to ever consider installing a physical cable. Very handy. Remember: it’s not a file server, it doesn’t run a torrent client, it doesn’t serve movies to my home network. The really large bulk transfers it does are mainly caused by Steam, which is clearly the bottleneck here (it never manages to saturate my 150 MBit/s downstream).

Another thing that really surprises me is the sleeping behavior of the box. Well, actually, the waking-up behavior: when asleep, the thing wakes up instantly (less than a second) – never in my life have I seen a computer wake from sleep this quickly.

Yes, I’m waiting for the fan to spin down and all audible noise to go away, but still. Hit any key on the keyboard and the machine’s back. We’re talking “waking an iPhone from sleep” speeds here.

It might be that the machine has multiple levels of sleep states, but the instant wake-up also happens after sleeping for over 12 hours at which point a deeper sleep would totally make sense if there was any.

What is strange though: I seem to be able to wake the machine by pinging it. Yes, I know about the Bonjour sleep proxy, but in this case I’m pinging it directly by IP and it wakes up (the first ping has a round-trip time of 500ish ms – yes, it wakes THAT quickly).

This leads me to believe that the machine might not actually be sleeping for real though because waking from a direct ping requires quite a bit more technology than waking from a WOL packet.

Someday, I’ll play with tcpdump to learn what’s going on here.
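
Something along these lines, run from another machine on the same network segment while the MacPro sleeps, should show whether the Mac itself or some sleep proxy answers on its behalf (the interface name and IP address are assumptions for my network):

% sudo tcpdump -e -n -i en0 'host 10.0.1.20 and (arp or icmp)'

The -e flag prints the ethernet addresses, so you can see which MAC address actually replies to the ARP requests for the supposedly sleeping machine.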

Performance-wise, I haven’t done that much testing, but replaying a test Postgres database dump that takes 5ish minutes on a 2012 retina MacBook Pro completes in 1:12 minutes on the pro – pretty impressive.
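
The comparison was nothing scientific, by the way – roughly this, assuming a plain SQL dump (database and file names made up):

% createdb benchtest
% time psql -q -d benchtest -f testdump.sql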

And one last thing: when you get a machine as powerful as this, there’s of course also the wish to play a game or two on it. As I had one SSD dedicated to Bootcamp in the old Pro, I was curious whether I could keep this setup: the built-in flash storage dedicated to OS X, and Windows on its own dedicated SSD (the old one).

Now that we don’t have internal drive bays any more, this might seem tricky, but yesterday I managed to install Windows 8 nicely on that SSD after connecting it via Thunderbolt using this adapter (no affiliate code – I got the link straight from Google).

I guess the fact that it’s using Thunderbolt makes Windows think it’s a built-in hard drive which is what makes this work: You’re not allowed to install Windows on a portable drive due to licensing issues.

The adapter is not actually intended for use with arbitrary drives (it’s an accessory for some Seagate portable drives), but it works perfectly well and is (physically) stable enough. I’ll have to do a bit of benchmarking to see how much performance I lose compared to the old built-in solution, but it certainly doesn’t feel any slower.
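
For a quick and dirty number (not a real benchmark), a raw sequential read from the device should do; the disk identifier below is an assumption – check diskutil list first:

% diskutil list
% sudo dd if=/dev/rdisk2 of=/dev/null bs=1m count=2048

dd prints the throughput when it’s done, and reading from the raw device conveniently sidesteps the fact that the volume itself is NTFS.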

Overall, I’m really happy with my new toy. Yes, it’s probably overpowered for my needs, but it’s also cool as hell, it is the first MacPro I own where sleep works reliably (though I’m inclined to say that it works suspiciously well – it might be cheating) and the fact that Bootcamp still works with a dedicated external drive makes me really happy too.

Things I can’t do with an iPhone/iPad

  • have a VoIP call going on when a mobile call/SMS arrives
  • read Kindle ebooks (I can now, but knowing Apple’s stance on “competing functionality”, with the advent of iBooks, how long do you think this will last?)
  • give it to our customers as another device to use with PopScan (it can’t be locked down and there’s no way to do centralized app deployment that doesn’t go through Apple)
  • plug in any peripheral that isn’t Apple-sanctioned
  • plug in a peripheral and use it system-wide
  • play a SNES ROM (or any other console rom)
  • install Adblock (which especially hurts on the iPad)
  • consistently use IM (background notifications don’t work consistently)

The iPhone provides me with many advantages and thus I can live with its inherent restrictions (which are completely arbitrary – there’s no technical reason for them), but I see no point in buying yet another locked-down device that does half of the stuff I’d want it to do and does it half-assed at that.

Also, it’s a shame that Apple obviously doesn’t need any corporate customers (at least for a small company like us, I see no way in).

I just hope the open and usable Mac computer remains. I wouldn’t know what to go back to. Windows? Never. Linux? Sure. But on what hardware?

Snow Leopard and PHP

Earlier versions of Mac OS X always had pretty outdated versions of PHP in their default installation, so what you usually did was to go to entropy.ch and fetch the packages provided there.

Now, after updating to Snow Leopard you’ll notice that the entropy configuration has been removed and once you add it back in, you’ll see Apache segfaulting and some missing symbol errors.

Entropy has not updated its packages for Snow Leopard yet, so you might as well have a look at the PHP that comes with stock Snow Leopard: this time it’s even bleeding edge – Snow Leopard ships with PHP 5.3.0.

Unfortunately though, some vital extensions are missing – most notably for me, the PostgreSQL extension.

This time around though, Snow Leopard comes with a functioning PHP development toolset, so there’s nothing stopping you from building it yourself. Here’s how to get the official PostgreSQL extension working with Snow Leopard’s stock PHP:

  1. Make sure that you have installed the current Xcode Tools. You’ll need a working compiler for this.
  2. Make sure that you have installed PostgreSQL and know where it is on your machine. In my case, I’ve used the one-click installer from EnterpriseDB (which survived the update to 10.6).
  3. Now that Snow Leopard uses a full 64-bit userspace, we’ll have to make sure that the PostgreSQL client library is available as a 64-bit binary – or even better, as a universal binary. Unfortunately, that’s not the case with the one-click installer, so we’ll have to fix that first:
    1. Download the sources of the PostgreSQL version you have installed from postgresql.org
    2. Open a terminal and use the following commands:
      % tar xjf postgresql-[version].tar.bz2
      % cd postgresql-[version]
      % CFLAGS="-arch i386 -arch x86_64" ./configure --prefix=/usr/local/mypostgres
      % make

      make will fail sooner or later because the Postgres build scripts can’t handle building a universal binary server, but the compile will progress far enough for us to then build libpq. Let’s do this:

      % make -C src/interfaces
      % sudo make -C src/interfaces install
      % make -C src/include
      % sudo make -C src/include install
      % make -C src/bin
      % sudo make -C src/bin install
  4. Download the php 5.3.0 source code from their website. I used the bzipped version.
  5. Open your Terminal and cd to the location of the download. Then use the following commands:
    % tar -xjf php-5.3.0.tar.bz2
    % cd php-5.3.0/ext/pgsql
    % phpize
    % ./configure --with-pgsql=/usr/local/mypostgres
    % make -j8 # in case of one of these nice 8 core macs :p
    % sudo make install
    % cd /etc
    % cp php.ini-default php.ini
  6. Now edit your new php.ini and add the line extension=pgsql.so

And that’s it. Restart Apache (using apachectl or the System Preferences) and you’ll have PostgreSQL support.
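
A quick sanity check that the extension is actually being picked up (assuming the stock php binary is the first one in your PATH):

% php -m | grep pgsql
pgsql
% php -r 'var_dump(function_exists("pg_connect"));'
bool(true)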

All in all this is a tedious process and it’s the price us early adopters have to pay constantly.

If you want an honest recommendation on how to run PHP with PostgreSQL support on Snow Leopard, I’d say: Don’t. Wait for the various 3rd party packages to get updated.

Alt-Space

Today, I was looking into the new jnlp_href way of launching a Java Applet. Just like applet-launcher, this allows one to create applets that depend on native libraries without the usual hassle of manually downloading the files and installing them.

Contrary to applet-launcher, it’s built into the later versions of Java 1.6 and it’s officially supported, so I have higher hopes concerning its robustness.

It’s even possible to keep the applet-launcher calls in there if the user has an older Java Plugin that doesn’t support jnlp_href yet.

So in the end, you just write a .jnlp file describing your applet and add

<param name="jnlp_href" value="http://www.example.com/path/to/your/file.jnlp">

and be done with it.

Unless of course, your JNLP file has a syntax error. Then you’ll get this in your error console (at least in case of this specific syntax error):

java.lang.NullPointerException
    at sun.plugin2.applet.Plugin2Manager.findAppletJDKLevel(Unknown Source)
    at sun.plugin2.applet.Plugin2Manager.createApplet(Unknown Source)
    at sun.plugin2.applet.Plugin2Manager$AppletExecutionRunnable.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
Ausnahme: java.lang.NullPointerException

How helpful is that?

Thanks, by the way, for insisting on displaying a half-assed German translation on my otherwise English OS: never use locale info for determining the UI language, please.

Of course, this error does not give any indication of what the problem could be.

And even worse: The error in question is the topic of this blog post: It’s the dreaded Alt-Space character, 0xa0, or NBSP in ISO 8859-1.

0xa0 looks like a space, feels like a space, is incredibly easy to type instead of a space, but it’s not a space – not in the least. Depending on your compiler/parser, this will blow up in various ways:

pilif@celes ~ % ls | grep gnegg
zsh: command not found:  grep
pilif@celes ~ %
pilif@celes ~ % cat test.php
<?
echo "gnegg";
?>
pilif@celes ~ % php test.php
PHP Parse error:  syntax error, unexpected T_CONSTANT_ENCAPSED_STRING in /Users/pilif/test.php on line 2

Parse error: syntax error, unexpected T_CONSTANT_ENCAPSED_STRING in /Users/pilif/test.php on line 2
pilif@celes ~ %

and so on.
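
If you suspect one of these has crept into a file, making the invisible visible helps. In a UTF-8 terminal the character is stored as the two bytes 0xc2 0xa0 (file name assumed):

% LC_ALL=C grep -n $'\xc2\xa0' test.php
% hexdump -C test.php | grep 'c2 a0'

The first command gives you the offending line numbers, the second the raw bytes.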

Now you people in the US with US keyboard layouts might think that I’m just one of those whiners – after all, how stupid must one be to press Alt-Space all the time? Probably stupid enough to deserve stuff like this.

Before you think these nasty thoughts, I ask you to consider the Swiss German keyboard layout though: nearly all the characters programmers use are accessed by pressing Alt-[some letter]. At least on the Mac. Windows uses AltGr, i.e. the right Alt key, but on the Mac, either Alt will do.

So when you look at the shell line above:

ls | grep gnegg

you’ll see how easy it is to hit alt-space: First I type ls, then space. Then I press and hold alt-7 for the pipe and then, I am supposed to let go of alt and hit space. But because my left hand is on alt and the right one is pressing space, it’s very easy to hit space before letting go of alt.

Now instead of getting immediate feedback, nothing happens. It looks as if the space had been added, when in fact something else has been added – and that something is not recognized as a whitespace character and is thus something completely different from a space, despite looking exactly the same.

As much fun as reading hexdump -C output is – I need this to stop.

Dear internet! How can I make my Mac (or Linux when using the Mac keyboard layout) stop recognizing Alt-Space?

To take the wind out of the sails of the inevitable trolls:

  • I won’t use Windows again. Thank you. Neither do I want to use Linux on my desktop.
  • I cannot use the US keyboard layout because my brain just can’t handle the layout changing all the time, and as a native German speaker I do have to type umlauts every now and then – often enough, actually, that the ¨+vowel combo isn’t acceptable.
  • While running Mac OS X, I’m stuck with the mac keyboard layout – I can’t use the Windows one.

The above JNLP error (printed here just in case somebody else has the same issue) caused me to lose nearly 5 hours of my life and will force me to work this weekend – who’d expect an XML parser error caused by a space that isn’t one when looking at the above call stack?

Update: a commenter on reddit.com recommended Ukelele, which I used to create a custom keyboard layout that makes Alt-Space produce a plain space. That’s the best solution for my specific taste, so thanks a lot!

No more hard drives for me!

Last week I noticed that the hardware store of my choice had these fancy new (and fast) Intel SSDs in stock – reason enough for me to go ahead and buy two to try them out in my two MacPro desktop machines. Kos-Mos, my home Mac, was the first to be converted.

But before that, there was this hardware problem to overcome. See: The SSDs are 2.5 inch drives whereas the MacPro has 3.5 inch slots. While the connectors (SATA) are compatible, the smaller form factor of the Intel drives prevents the usual drive sliders of the MacPro from working.

The solution was to buy one of these adapters for the SSDs. Before doing that, I read about other solutions, some of them involving duct tape, but this felt like the cleanest way – and it was: the kit fit perfectly, so installing the drive was a real piece of cake.

The next problem was about logistics:

pilif@kosmos /Volumes/Macintosh HD
 % df -h | grep Macintosh
/dev/disk2s2   365Gi  319Gi   46Gi    88%    /Volumes/Macintosh HD

Whereas the largest Intel SSD available to date has just 160GB of capacity (149 “really usable”), so at least some kind of reorganization had to be done.

Seeing that the installation on the traditional drive was ages old anyway (dating back to the last quarter of 2006), I decided that the sanest way to proceed was to just install a fresh copy of Leopard on the new drive and use that as the boot device, copying over the applications and the parts of the user profile I really needed.

Been there, done that.

I didn’t do any real benchmark, but boot-time is now sub 10 seconds. Eclipse starts up in sub 5 seconds. The installation of all the updates since the pristine 10.5.1 that was on the DVDs that came with the machine took less than three minutes – including the reboots (I’ve installed the 10.5.7 update this morning and it took around 10 minutes on the same machine).

And to make things even better: The machine is significantly quieter than before – at least once the old hard drive powers down.

I will never, ever again use non-SSD drives in any machine I work on.

The perceived speedup was as significant as going from 8MB of RAM to 32MB back in the day. The machine basically feels like a new computer.

Of course I ran into one really bad issue:

The idea was to symlink ~/Music to my old drive because my iTunes library (mostly due to podcasts and audio books) was too large to conveniently copy to the SSD. I renamed ~/Music to ~/Music.old, created the symlink and started iTunes for the first time, only to be greeted with an empty library.
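
For reference, the move itself was just this (volume name and path being those of my setup, obviously):

% mv ~/Music ~/Music.old
% ln -s "/Volumes/Macintosh HD/Users/pilif/Music" ~/Music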

According to the preferences though, iTunes did correctly follow the symlink and was pointing to the right path (WTF?). I tried to manually re-add the library folder which did kind of work, but screwed over all my podcasts – completely.

This is where I noticed that somehow iTunes still found ~/Music.old and used that one. A quick ps revealed that my best friend, the iTunes Helper, was still running, so I shut that one down and moved ~/Music.old away to /, just to be sure.

Restarted iTunes just to run into the very same problems again (now, this is a serious WTF).

The only way to get this to work was to quit iTunes (that includes killing the helper) and to completely remove all traces of that Music folder.

Now iTunes is finally using the Music folder on my traditional hard drive. This kind of work should not be needed and I seriously wonder what kind of magic was going on behind the scenes there – after killing the helper and renaming the folder, it should not have used it any more.

Still: SSDs are fun. And I would never again want to miss the kind of speed I’m now enjoying.

celes in the office is next :-)

Playing Worms Armageddon on a Mac

Last weekend, I had a real blast with the Xbox 360 Arcade version of Worms. Even after so many years, this game still rules them all, especially (if not only) in multiplayer mode.

The only drawback of the 360 version is the lack of weapons.

While the provided set is all well and good, the game is just not the same without the Super Banana Bomb or the Super Sheep.

[Worms Armageddon screenshot]

So this is why I dug out my old Worms Armageddon CD and tried to get it to work on today’s hardware.

Making it work under plain Vista was easy enough (get the latest beta patch for Armageddon, by the way):

Right-click the icon, select the Compatibility tab, choose Windows XP, disable themes and desktop composition and run the game with administrative privileges.

You may get away with not using one option or the other, but this combination worked consistently for me.

To be really useful though, I wanted to make the game run under OS X as this is my main environment and I really dislike going through the lengthy booting process that is bootcamp.

I tried the various virtualization solutions around – something that should work seeing that the game doesn’t really need much in terms of hardware support.

But unfortunately, this was way harder than anticipated:

  • The initial try was done using VMWare Fusion, which looked very good at first but failed miserably later on: while I was able to launch (and actually use) the game’s frontend, the actual game was a flickery mess with no known workaround.
  • Parallels failed by displaying a black menu. It was still clickable, but there was nothing on the screen but blackness and a white square border. Googling around a bit led to the idea of setting SlowFrontendWorkaround in the registry to 0, which actually made the launcher work, but the game itself crashed consistently without an error message.

In the end, I achieved success using VirtualBox. The SlowFrontendWorkaround is still needed to make the launcher work and the mouse helper of the VirtualBox guest tools needs to be disabled (in the Machine menu; the game still runs with the helper enabled, but you won’t be able to actually control the mouse pointer consistently), but after that, the game runs flawlessly.

Flickerless and with a decent frame rate. And with sound, of course.

To enable the workaround I talked about, use this .reg file.

Now the slaughter of worms can begin :-)

New MacMini (early 09) and Linux

The new MacMinis that were announced this week come with a FireWire 800 port, which was reason enough for me to upgrade shion yet again (keeping the host name, of course).

All my media she’s serving to my various systems is stored on a second generation Drobo which is currently connected via USB2, but has a lingering FW800 port.

Of course the upgrade to FW800 will not double the transfer rate to and from the drobo, but it should increase it significantly, so I went ahead and got one of the new Minis.

As usual, I inserted the Ubuntu (Intrepid) CD, held C while turning the machine on and completed the installation.

This left the Mini in an unbootable state.

It seems that this newest generation of Mac hardware isn’t capable of booting from an MBR-partitioned hard drive. Earlier Macs complained a bit if the hard drive wasn’t correctly partitioned, but then went ahead and booted the other OS anyway.

Not so much with the new boxes it seems.

To finally achieve what I wanted I had to do the following complicated procedure:

  1. Install rEFIt (just download the package and install the .mpkg file)
  2. Use the Bootcamp assistant to repartition the drive.
  3. Reboot with the Ubuntu Desktop CD and run parted for the partitioning steps below (a rough sketch follows after this list; the partitioning could probably be accomplished using the console installer, but I didn’t manage to do it correctly).
  4. Resize the FAT32-partition which was created by the Bootcamp partitioner to make room at the end for the swap partition.
  5. Create the swap partition.
  6. Format the FAT32-partition with something useful (ext3)
  7. Restart and enter the rEFIt partitioner tool (it’s in the boot menu)
  8. Allow it to resync the MBR
  9. Insert the Ubuntu Server CD, reboot holding the C key
  10. Install Ubuntu normally, but don’t change the partition layout – just use the existing partitions.
  11. Reboot and repeat steps 7 and 8
  12. Start Linux.
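
For reference, the parted part (steps 3 to 6) looked roughly like this. Device node, partition numbers and sizes are assumptions for my disk, and note that parted’s resize command (which the Intrepid-era version still has) keeps the partition’s start and only moves its end:

% sudo parted /dev/sda
(parted) print                                    # note where the Bootcamp FAT32 partition starts and ends
(parted) resize 3 90GB 110GB                      # shrink it, freeing space at the end of the disk
(parted) mkpart primary linux-swap 110GB 120GB    # the swap partition goes into the freed space
(parted) quit
% sudo mkfs.ext3 /dev/sda3                        # reuse the former FAT32 partition for the root filesystem
% sudo mkswap /dev/sda4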

Additionally, you will have to keep using rEFIt, as the boot device selector in System Preferences does not recognize the Linux partitions and thus can’t boot from them.

Now to find out whether that stupid resistor is still needed to make the new mini boot headless.

Dropbox

Dropbox is cloud storage on the next level: you install their little application – available for Linux, Mac OS X and Windows – and it creates a folder that is automatically kept synchronized between all the computers you have installed that little application on.

Because it synchronizes in the background and always keeps the local copy around, the access speed isn’t any different from a normal local folder – mainly because it is, after all, a local folder you are accessing. Dropbox is not one of these slow “online hard drives”; it’s more like rsync running in the background (and a clever one at that – the application is intelligent enough to only transmit deltas, even for binary files).

They do provide you with a web interface of course, but the synchronizing aspect is the most interesting.

The synchronized data ends up somewhere in Amazon’s S3 service, which is fine with me.

Unfortunately, while the data is stored in an encrypted fashion on S3, the key is generated by the Dropbox server and thus known to them, which makes Dropbox completely unusable for sensitive unencrypted data. They do state in the FAQ that this may change sometime in the future, but for now it is what it is.

Still, I found some use for Dropbox: ~/Library/Preferences, ~/.zshrc and ~/.ssh are all now stored in ~/Dropbox/System and symlinked back to their original places. This means that a large chunk of my user profile is available on all the computers I’m working on. I would even try the same trick with ~/Library/Application Support, but that seems risky due to the missing encryption and due to the fact that Application Support sometimes contains database files which get corrupted for sure when moved around while they are open – like the Firefox profile.
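
The setup is the usual symlink dance – roughly this, with ~/.zshrc as the example:

% mkdir -p ~/Dropbox/System
% mv ~/.zshrc ~/Dropbox/System/zshrc
% ln -s ~/Dropbox/System/zshrc ~/.zshrc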

This naturally even works when the internet connection is down – Dropbox synchronizes changes locally, so when the internet (or Dropbox) is down, I just have the most recent copy from when the service was still working – which is more than good enough.

Another use that comes to mind for Dropbox storage are game save files or addons you’d want to have access to on every computer you are using – just move your stuff to ~/Dropbox and symlink it back to the original place.

Very convenient.

Now if only they’d let me provide my own encryption key. That way I would instantly buy the pro account with 25GB of storage and move lots and lots of data in there.

Dropbox is the answer to the ever increasing amount of computers in my life because now I don’t care about setting up the same stuff over and over again. It’s just there and ready. Very helpful.

Listen to your home music from the office

My MP3 collection is safely stored on shion, on a Drobo mounted as /nas. Naturally, I want to listen to said music from the office – especially considering my fully routed VPN between the office and my home infrastructure and an upstream that suffices for at least 10 concurrent 128 kbit/s streams (boy, technology has changed in the last few years – I remember times when you couldn’t reliably stream a single 128 kbit/s stream, let alone my 160/320 kbit/s MP3s).

I’ve tried many things so far to make this happen:

  • Serve the files with a tool like Jinzora. This works, but I don’t really like Jinzora’s web interface and I was never able to get it to work correctly on my Ubuntu box. I traced the problem down to null bytes read by their tag parser, but the code is very convoluted and practically unreadable without putting quite some effort into it. Considering that I didn’t much like the interface in the first place, I didn’t want to invest that time.
  • Use SlimServer (now SqueezeCenter) with a SoftSqueeze player. I don’t use my Squeezebox (an original Slim Devices model, not the newer Logitech one) any more because the integrated amplifier in the Sonos players works much better for my current setup, but this solution worked quite OK – though the audio tends to stutter a bit at the beginning of tracks, indicating some buffering issues.
  • Use iTunes’ integrated library sharing feature. This seemed both undoable and impractical. Impractical because it would force me to keep my main Mac running all the time, and undoable because iTunes sharing can’t cross subnet boundaries. Aside from that, it’s a wonderful solution: audio doesn’t stutter, I already know the interface and access is very quick and convenient.

But then I found out how to make the iTunes thing both very much doable and practical.

The network boundary problem can be solved using Network Beacon, a ZeroConf proxy. Start the application and create a new beacon. Choose any service name, use «_daap._tcp.» as the service type, set the port number to 3689, enable the host proxy, leave the host name empty and enter the IP address of the system running iTunes (or Firefly – see below).

Oh, and the target iTunes refuses to serve out data to machines in different subnets, so to be able to directly access a remote iTunes, you’d also have to set up an SSH tunnel.
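
The tunnel itself is a one-liner – forward the local DAAP port to the machine at home (host name made up) and point the beacon at 127.0.0.1:

% ssh -N -L 3689:localhost:3689 home.example.com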

Using Network Beacon, ZeroConf quickly begins working across any subnet boundaries.

The next problem was about the fact that I was forced to keep my main workstation running at home. I fixed that with Firefly Media Server for which even a pretty recent prebuilt package exists for Ubuntu (apt-get install mt-daapd).

I installed that, configured iptables to drop packets for port 3689 on the external interface and configured Firefly to use the music share (which basically is a current backup of the iTunes library from my main workstation – rsync for the win).
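
The firewall part is a single rule – eth0 being the external interface is an assumption for this box:

% sudo iptables -A INPUT -i eth0 -p tcp --dport 3689 -j DROP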

In this setup Firefly even detects the existing iTunes playlists (as the music share is just a backup copy of my iTunes library, including the iTunes Library.xml). Smart playlists don’t work, but they can easily be recreated in the Firefly web interface.

This means that I can access my complete home mp3 library from the office, stutter free, using an interface I’m well used to, without being forced to keep my main machine running all the time.

And it isn’t even that much of a hack and thus easy to rebuild should the need arise.

I’d love to not be forced to do the Network Beacon thing, but avahi doesn’t relay ZeroConf information across VPN interfaces.

VMWare Fusion Speed

This may be totally placebo, but I noticed that using Vista inside a VMWare Fusion VM has just turned from nearly unbearably slow to actually quite fast by updating from 2.0 Beta 2 to 2.0 Final.

It may very well be that the beta versions contained additional logging and/or debug code which was keeping the VM from reaching its fullest potential.

So if you are too lazy to upgrade and are still running one of the beta versions, you should consider updating. For me at least, it really brought a nice speed-up.