I just started a short-term contract doing some RoR work and was given a PC to develop on. While developing RoR on Windows is possible, it’s not what I’m used to or best at. As there were no Macs I could use, I thought I’d give Linux a go. Seeing as I use Vim as my editor of choice, switching over should have been relatively painless.
I’ll get straight to the point: for many years now I’ve said that Linux as a desktop OS doesn’t really cut it compared to Windows or OS X, and having used it for a day and a half I still think that’s true. Right off the bat, the install was plagued with problems I just wouldn’t expect from a “modern” OS.
The first major blunder came during the installation process. I booted off the CD and was told that a hard drive with Windows had been detected: would I like to install Ubuntu alongside it? I chose yes and was shown a screen with a dropdown of hard drives and a slider to resize the partition on the chosen drive. I split it in half and set it going. Only when it was halfway through did I actually look at which drive had been selected by default: it had chosen an external drive as the one to work on. I fully accept that I should have checked first, but I don’t understand why it asked whether I wanted to install alongside Windows and then defaulted to a drive that Windows wasn’t even on.
The pain didn’t end there. Once I was finally up and running, my dual-monitor setup was being mirrored. That was easily remedied in the display settings control panel, but then it became apparent that hardware acceleration wasn’t working: you couldn’t move windows without them struggling to keep up with the pointer. Trying to enable the proprietary ATI drivers didn’t help either. Do I choose the normal drivers or the post-release ones on offer? The normal drivers installed, but the displays would only mirror each other, and the post-release ones wouldn’t install at all. So I took a chance on some commands I found on a wiki, which involved stripping away all the default ATI packages and compiling my own drivers from scratch. That eventually got hardware acceleration working with both screens running independently, but I had no idea what the collection of commands I had run had done to the system, which left me with the unnerving feeling that it was all hanging together by a thread. I was too scared to restart in case it came back with no display at all.
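For the curious, the rough shape of what I ran was something like this. I’m reconstructing it from memory rather than quoting the wiki, so treat the installer filename and the Ubuntu codename as approximations:

    # Purge the packaged fglrx drivers that weren't working
    sudo apt-get remove --purge fglrx*
    # Build Ubuntu packages from AMD's Catalyst installer and install them
    sudo sh ./amd-driver-installer-*.run --buildpkg Ubuntu/oneiric
    sudo dpkg -i fglrx*.deb
    # Write a fresh X config for the new driver, then (eventually) reboot
    sudo aticonfig --initial -f

Running half a dozen commands like that with root privileges, without really knowing what they do, is exactly how you end up afraid of your own reboot button.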
A few people advised me to ditch Unity in favour of Gnome Shell, but that just made life even worse. I couldn’t move windows without them crawling across the screen. The whole thing felt like a total disaster, and I had absolutely no faith in the install at all.
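To be fair, trying Gnome Shell is easy enough. Assuming a stock Ubuntu setup of this era, it’s a single package, and you pick the session from the cog on the login screen afterwards:

    # Install Gnome Shell alongside Unity; select it at the login screen
    sudo apt-get install gnome-shell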
I remember hearing about Linux Mandrake back in 1998 and travelling all the way to some dodgy warehouse in North London to buy a copy. My experience back then was actually a good one. Everything worked as it should have, and using it day to day was fine. I don’t remember any display or driver problems; it was only gaming that sent me back to Windows. So what has actually been achieved in fourteen years? Some transparent UI elements? To me Linux still seems plagued by a constant lack of driver support. Perhaps someone can explain why even just browsing the web on Linux looks so bad? Why are no decent fonts distributed with it? If you know exactly what you’re doing you can make do, and plenty of people get by. Perhaps I just don’t have the patience for it.
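I’m told the usual answer on the fonts front is to install Microsoft’s core web fonts yourself, which on Ubuntu is something like the below, though needing to do it at all rather proves my point:

    # Pull in Arial, Verdana, Georgia and friends
    sudo apt-get install ttf-mscorefonts-installer
    # Rebuild the font cache so applications pick them up
    fc-cache -f -v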
I just want something that works, and after some begging I convinced the place I’m at to let me use a Mac. Setting up took a fraction of the time. Admittedly I’m used to OS X, so of course I was quicker getting up and running, but I didn’t have to worry about drivers, display issues, or whether my machine would survive a reboot.