Wednesday, May 30, 2012

Why We Started ZoomBA Cloud

Those who follow me on social media might have seen me go on about this "ZoomBA" thing and may have some idea about it. For others, in short, it's our own Cloud SaaS meant for Small Businesses (or SMEs as they are called elsewhere). But I thought of sharing the story of ZoomBA (which at this point sounds more like an African folk tale). Though we started working on ZoomBA earlier this year, the whole concept actually began quite a while back.

Once upon a time, in December 2004, Sanjiva rang me up with an idea of building an "out of the box" appliance for Small Businesses out in the US. By his estimates, the majority of them did not have a drop-in IT solution that worked. Being a FOSS advocate myself and trying to drive Linux adoption, I immediately connected with the idea of making an IT system that worked for a Small Business. But I wasn't ready to start a company just then and had already committed to joining another company, so it had to wait!

Finally, in late 2005, Lakmal and I teamed up for the challenge. We didn't have much startup capital, so we incubated our project alongside Sanjiva's other company. After a bit of brainstorming we came up with the name THINKCube for the company and the tagline "Out of the box solutions in a box" (Shahani was the inspiration for it). Company setup aside, we really started focusing on the key question of "What does it take for a Small Business to run its IT system, and what's preventing them from doing it?"

Without going into the details, I will simply state that we realized a lot of Small Businesses didn't want the hassle of hiring an IT department only to lose control to the IT guy. Instead they were very focused on growing their business, and IT was seen as something that would be nice to have but not worth the trouble.

thinkCube's go-to-market exercise was an interesting learning experience. Like most startups, we had to go through a lot of experimentation in figuring out the product's technology and look & feel. The other challenge was how best to put it in front of our end user. First we tried to build a hardware appliance, but later realized that we weren't really equipped to handle the logistics of moving devices around the globe. So we turned to virtualization, first creating VMware images of the product and then moving to multi-tenant virtualization running on a Telco Cloud Infrastructure. In the same manner we did several UI iterations before settling on our current dynamic shell, which offers SAML-based SSO, fun dashboard widgets and context-sensitive end-to-end help. (You can read more about it on the ZoomBA blog.)

So that's how ZoomBA came about. It's our attempt to really address the core issues that prevent Cloud SaaS adoption among Small Businesses. While I know we haven't addressed all the issues, I am confident we have addressed some of the most fundamental ones.

The journey so far, from the inception of thinkCube through brainstorming, arguing, building, abandoning and rebuilding, has been so rewarding. I hope our hard work will show when users finally get their hands on the system once we start sending out invites soon! In the meantime, if you know anyone who could benefit from a service like ZoomBA, please spread the news.

Monday, May 14, 2012

Peer-to-Peer Collaborative Development Using GIT

Wow, how time flies when you’re having fun! I first thought of writing this post back in January, when I was on a roll with writing blog posts. But it never materialized beyond notes I collected in preparation. Come several months later, with a lot more experience on what I am about to tell you, and you have this post. The notes I’m referring to are about a development style that came about as a result of optimizing how we at thinkCube organize and work with source code in a revision control system. Considering the wide use of systems such as SVN and GIT, I thought I’d share our development experience in the hope it will help you to take another look at your own development style. But before I get into it, I’d like to very briefly touch on the background of the evolution of development styles around version control systems.

Thou shalt not commit, Yet!

Back in the days of CVS, source code lived centrally on a server called a repository. One had to earn the right to read/write to this repository in order to ensure “world order” (others had to submit a patch via Bugzilla). While this led to a centrally managed system of collaborative software development, it also created a new software development style of “Earn your commitership”, “Commit often” and “Communicate often”. Nothing much changed when everyone moved over to SVN, which started out as an improved CVS. This development style didn’t go down well with Linus Torvalds for his Linux work, and so he created GIT instead (he famously hates CVS/SVN).

Pray-Pull-Push style of development

We started with CVS back in 2005 and then SVN for managing our development at thinkCube. A little while after git came along, once it was stable enough to use and usable by mere mortals, we made the switch. At first we had some trouble wrapping our heads around git and so just used it like SVN, where we always did commit/pull/push operations as if they were atomic. But after about a week or two, we realized the power of git was in its ability to let you commit locally and push when you were ready to share. And so yet another development style arose, where git acted as a Collaborative Whiteboard for sharing code changes. This also meant developers needed push access to the central repository, in the same manner commit access was required with SVN/CVS.
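
To make that concrete, here’s a tiny sketch of the commit-locally, push-when-ready rhythm (the file name and commit messages are just placeholders):

$ git commit -am "WIP: first stab at the login form"
$ git commit -am "WIP: form validation"
$ git push origin master        # share only when you're ready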

The unfortunate development style this results in is that developers may occasionally push unfinished work upstream just to share it with the “developer next door”. The consequence of that may be felt by the poor developer who spent all night working on a feature and pulls in order to push, only to find that his code now conflicts for no apparent reason!

Now I know a lot of you may be on this style of development and thinking, “If our developers do that, we punish them!” and so they don’t! Good for you! But my take on this is: if the system is fundamentally broken, then it’s better to fix it than to enforce tough rules. For example, if I may digress for a bit, “Why are TukTuk drivers and motorcyclists so reckless?”. Is it likely that only careless drivers pick these vehicles, or is it more likely the vehicle made them reckless? My advice is, if the tool is broken, then fix it! (and ban TukTuks :)

Therefore last year we decided to adopt our current style of development which is as follows.

Fetch-Merge-Push style of development

If Linus can do it, then so can you! We didn’t invent this stuff, but we did adapt it in a manner that scales for us. The idea is simple: stop devs from committing upstream as a means of sharing changes and instead get them to share peer to peer by fetching from each other. Our git repos are set up so that everyone has read access to clone a repo, but only a couple of devs (usually just one dev) can push to a given repo. That means the dev who has push access usually will not need to pull prior to pushing, because no one else can change it. This is what git was designed to do, and yet it’s probably one of the least used features among git converts.

OK, so let’s get technical, shall we?

First off, I’m assuming you know git basics and are already using it. If not, then check out my git article on Digit.

Sharing your repo

Let’s look at how you can share your git repo with another dev, P2P style. Suppose you have a repo as follows:

/home/bud/repos/awesome.repo

You can easily share your awesome.repo with anyone on the local network using the git-daemon command by first cd-ing to its parent directory.

$ cd /home/bud/repos
$ git daemon --export-all --base-path=.

The above will share all your git repos under the current directory as read-only for others on the network to fetch. Git daemon runs in the foreground by default, so once you’re done sharing you can just Ctrl+C it.
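
As a quick sanity check (the hostname below is just a placeholder, assuming mDNS resolves it), a colleague should now be able to clone straight off your machine:

$ git clone git://buds-computer.local/awesome.repo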

Fetching a shared repo

In order to fetch from a fellow dev, you will first need to add him/her as a remote. Chances are you cloned the project from upstream, in which case you have just one default remote called origin, which points to your upstream repo.

To add another remote for your friend joe, for example:

$ cd /home/bud/repos/awesome.repo
$ git remote add joe git://joes-computer.local/awesome.repo

From within your repo, you add a remote using the git remote command. The URL above uses git’s special git:// protocol that is understood by git-daemon, while I’m relying on mDNS to resolve joes-computer.local automatically. If your network (or OS) doesn’t support this, then you can just use the IP address.

Finally to fetch joe’s changes over the network to your machine issue:

$ git fetch joe

The above command should give you some feedback as to the success of the fetch operation. Remember that fetch is safe since it only “fetches”, as opposed to pull, which fetches and then tries to merge. So while you could’ve used pull instead of fetch, I wouldn’t recommend it!
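
In other words, a `git pull joe master` is roughly equivalent to the two-step sequence below, except that the merge happens whether you’re ready for it or not:

$ git fetch joe
$ git merge joe/master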

Merging and deleting

Now that you have a copy of the remote changes, what you’d want to do next is to see which branches they were working on. Usually joe will tell you: hey, my latest changes are on the new-cool-feature branch.

$ git branch -a
* master
  experimental
  remotes/joe/master
  remotes/joe/new-cool-feature

Git branch will first show your local branches (master, experimental) followed by the remote ones. At this point you should check out the remote branch you’re planning to merge, just to make sure everything is working.

$ git checkout -b joes-new-cool-feature remotes/joe/new-cool-feature

This creates a local branch called “joes-new-cool-feature” which tracks the remote branch remotes/joe/new-cool-feature and switches the current HEAD to it. Once you’re happy, you can switch back to master and merge.

$ git checkout master
$ git merge joes-new-cool-feature

But… if you have any merge conflicts then you will have to resolve them! If you don’t, your master will remain in a state of CONFLICT. If that sounds like additional work, then do what I do instead of the above.

$ git checkout master
$ git checkout -b master-merge-joes-new-cool-feature
$ git merge joes-new-cool-feature
$ git checkout master
$ git merge master-merge-joes-new-cool-feature
$ git branch -d master-merge-joes-new-cool-feature

Wow, that’s a handful of commands to type, you say. Trust me, it beats wasting time trying to resolve someone else’s conflict! In the above, we fork master as master-merge-joes-new-cool-feature in anticipation of a bumpy merge. If things go right, we then merge the merged branch into master :) The last line just deletes the temporary branch, which we no longer need.

Of course, as you go back and forth merging these micro commits with a dev, you will get into a comfort zone of knowing things won’t go wrong, in which case you can merge the remote branch directly.

$ git merge remotes/joe/new-cool-feature

It all depends on how much you trust the other dev :) and how much merge conflict resolution you’re prepared to take on. The branching approach is safer, and if you’re the BOSS and you hit a merge conflict you can simply abandon the branch and ask your peer dev to fetch from you, fix the conflict and let you know! (Which is what Linus would generally do.)

One particularly useful technique to extract a good commit from a potentially conflicting set is cherry picking. If you know the commit’s SHA1, then you can use that to do a cherry-pick style merge.

$ git cherry-pick 623a3dfb5e86f4da4e043f26b6f075f6e3be77ad
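
If you don’t have the SHA1 handy, you can dig it out of the remote branch’s log first (branch name as in the earlier example):

$ git log --oneline remotes/joe/new-cool-feature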

Working remotely

The issue with git-daemon is that it is more difficult to use when you’re not on the same network and are masked behind a NAT’d IP. One technique is to get the router to DNAT port 9418. Another option is to set up a VPN. A third option is to use a bit of SSH tunneling magic to get everything to work. I’ll cover that in a different post, perhaps.
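
That said, here’s a rough sketch of the SSH tunnel idea, assuming joe’s machine is reachable over SSH at joes-server.example.com (a placeholder) and is running git-daemon as shown earlier; the local end of the tunnel then stands in for his git:// URL:

$ ssh -N -L 9418:localhost:9418 joe@joes-server.example.com &
$ git remote set-url joe git://localhost/awesome.repo
$ git fetch joe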

Sunday, January 01, 2012

A New Desktop for a New Year

Simple, minimalistic Mint desktop

Compared to other new years, where I would spend some time cleaning up my room or upgrading my gear, this year I did none of that! Instead I invested some time cleaning up my online space, starting with a brand new distro. I’m a big fan of MacOSX Lion’s desktop and wanted to bring some of that minimalistic simplicity to the Linux desktop. So if the above screenshot looks appealing, then read on to see how you too can get a modern desktop that is simple and elegant. Unlike some of my previous desktop customization articles, I’ll keep this one to the bare minimum so that you can implement it fairly quickly.


Installing the base OS


IMHO, Gnome 3 is the next best thing when it comes to being a modern desktop. I realize this is a controversial statement, given news of some ditching and some forking the project. But IMHO when you have a project that can polarize a community that was once united, it means you’ve got true innovation - not just incremental tweaks.


Having said all that, I’ve started with Linux Mint 12 as the base distro to build my minimalistic desktop, even though Mint has many of the traditional Windows-like UI elements. Now, I’m sure you could do this with Ubuntu 11.10, the base for Mint 12, but I like Mint due to their focus on usability, where most of what I need works out of the box!


Let the tweaking begin!


Don’t worry, I’ll keep it to a minimum! The first thing I did was to get rid of the bottom taskbar completely, because it’s soo Windows 95-like! Fortunately, Gnome 3 comes with an “Advanced Settings App” (gnome-tweak-tool on the command line) which uses the same iPhone-like on-off toggles to do the job.


Gnome 3 Advanced Settings App aka Gnome tweak tool


Go to the Desktop section within the Advanced Settings App.


Have file manager handle the desktop -> ON
Computer icon visible on desktop -> OFF
Home icon visible on desktop -> OFF
Network Servers icon visible on desktop -> OFF
Trash icon visible on desktop -> ON
Show mounted volumes on the desktop -> ON

From the above list, if you decide you want to see the Computer and Home icons, or perhaps not have any icons at all, then go right ahead. I have my reasons for the above :)


Go to the Shell section and make sure every toggle is OFF


Go to Shell Extensions


This is where we do most of the customizations. We will come back to this but for now:


Menu Extension -> OFF
Media Player Extension -> ON
Smart Overview Extension -> ON
Monitor Status Extension -> OFF
Bottom Panel Extension -> OFF
User Themes Extension -> ON
noa11y Extension -> ON
Alt Tab Extension -> ON
Notification Extension -> ON
Shutdown Menu Extension -> ON
Window List Extension -> OFF

Go to the Theme section and select Mint-Z-Dark as the Shell theme


Installing tweaks


As I’m running this distro on a 10” netbook, I wanted all the screen real estate I could get. Besides, running apps in full screen mode is all the rage these days! The other tweak, which comes from the same repository, is to overlay an icon when viewing all open windows in the Exposé view. This makes it a lot easier to figure out which window preview belongs to which App.


Install PPA repository and plugins


sudo add-apt-repository ppa:webupd8team/gnome3
sudo apt-get update
sudo apt-get install gnome-shell-extensions-autohidetopbar
sudo apt-get install gnome-shell-extensions-windowoverlay-icons

After logging out and logging back in (or Alt+F2 followed by r), head over to the “Advanced Settings App”


Back in the “Shell Extensions” section you should now find new toggles


Windowoverlay Icons Extension -> ON
Auto Hide Top Panel Extension -> ON

In order to get more space and simplify the window, I wanted to get rid of the menubar by integrating it as a global menubar. Unlike MacOSX’s global menubar which is horizontal, this one is vertical and works perfectly with my limited screen width.


sudo apt-get install gnome3-globalmenu

Next I wanted to get rid of the scrollbar, similar to how it is in Ubuntu. I used an updated version of it from another PPA.


sudo add-apt-repository ppa:ayatana-scrollbar-team/release
sudo apt-get update
sudo apt-get install overlay-scrollbar  

OK, almost done. The final step was to install a nifty applet that is great for laptops, to manage power settings and screen resolutions, especially when you want to project. It adds itself right to the top menubar for easy access.


Install Jupiter


sudo add-apt-repository ppa:webupd8team/jupiter
sudo apt-get update
sudo apt-get install jupiter

And we’re done! Enjoy your new desktop this holiday season and Happy 2012!


Update (02-Jan-2012)


After writing the article, I wondered if it was possible to get the window titlebar to disappear so that apps can run in full screen similar to Lion. Though it’s not perfect, the following hacks will give you just that!


Install the window-buttons extension to mirror the minimize, maximize and close buttons in the top menubar


sudo apt-get install gnome-shell-extension-window-buttons

Now enable the extension using “Gnome Advanced Settings” by visiting the “Shell Extensions” section.


Window Buttons Extension -> ON

Install the maximus daemon to automatically remove the title bar when a window is maximized.


sudo apt-get install maximus

Restart Gnome 3 and you’re set. You can restore a maximized window using the buttons in the top right corner of the top menu bar.






Saturday, December 31, 2011

Track Your New Year Resolutions With ii.do

I know, I know, it’s a bit of a cheesy title to promote my Open Source project, but hear me out - it really does work. Though I didn’t really write it to track my new year resolutions (I’ve never found them effective beyond a week), I did write it out of constant frustration with not finding a TODO productivity tool that stuck with me.

Let me explain… I’ve tried EVERYTHING!
  • Google Calendar & tasks
  • Sending myself Emails
  • Wiki (DokuWiki, MediaWiki, TWiki etc.)
  • Tomboy
  • Tomboy with UbuntuOne & Android App
  • EverNote
  • Remember the Milk
  • Gedit notes on my desktop
  • Post-it notes widget
  • Actual Post-it notes
  • Old-skool diary
  • Pieces of paper including backs of payment receipts
From the above list, if anything came remotely close to sticking as a habit, it’d be the sticky notes & pieces of paper. I’m not suggesting the other methods suck, but I wouldn’t use them beyond a couple of weeks.

Why was it that despite me spending hours on a computer, the best thing that had a chance of remotely working was old-skool pen & paper? And then it hit me. A good todo App should be:
  1. In your face!
  2. Really simple to use (like pen & paper)
  3. Did I mention in your face?
Now my solution isn’t for everyone. At this moment, it is intended ONLY for fellow geeks who spend a lot of time in the command line (GNU & UNIX only), and I call it ii.do.

What is ii.do

ii.do, pronounced “I do” but really a Roman-numeral play on 2.do, is a command line todo list manager that uses a simple text file and simple MarkDown syntax to track your todo tasks. The beauty of using MarkDown syntax is its resemblance to the natural way we jot down text on a piece of paper.
ii.do is optimized around querying tasks as opposed to updating tasks. For entering and updating tasks, it uses the plain old vim text editor, which has syntax highlighting for Markdown built in. If vi is not your thing, then it’s relatively easy to configure another editor by exporting the shell $EDITOR variable.
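
For instance, to switch to nano (or any editor on your PATH), a one-liner like this in your ~/.bashrc should do:

 echo "export EDITOR=nano" >> ~/.bashrc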

The other main design goal I had was to make it into a standalone shell script which could stand (mostly) on its own. Except for standard shell commands like sed, grep and bash itself, it doesn’t demand much.

But the main power of ii.do comes ONLY (and I repeat ONLY) if you modify your shell to:
  1. Define an easy alias (such as t) which can be used to summon ii.do from anywhere within the shell
  2. Modify your $PS1 shell prompt to show the number of pending tasks (this is the in-your-face bit)
The second point above is important if you plan on actually using it productively, for there is nothing like an App stalking you with a reminder of how many things you have left to do.
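
Just to illustrate what that prompt hack boils down to (the -S option covered below automates this properly), a hand-rolled version using the documented -X -n flags might look something like this; treat it as a sketch, since the exact count output may differ:

 export PS1='[$(~/ii.do/ii.do -X -n) todo] '"$PS1"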

Installing ii.do

You can download a tarball of ii.do from github. Then just extract it to your home directory, make the shell script executable (just in case) and finally copy the sample todo.markdown to your $HOME
 
 $ tar zxf geekaholic-ii.do-iido-xxx.tgz
 $ mv geekaholic-ii.do-* ~/ii.do
 $ chmod +x ~/ii.do/ii.do
 $ cp ~/ii.do/todo.markdown ~

Though ii.do is now usable, you should create an alias in order to make it more accessible and add it to ~/.bash_profile or ~/.bashrc
 
 echo "alias t='$HOME/ii.do'" >> ~/.bashrc

Finally the most important step of adding a counter to your SHELL prompt is semi-automated via ii.do!
 
 ~/ii.do/ii.do -S "$PS1" >> ~/.bashrc

You’re all set! You might want to logout and login again, or do a source ~/.bashrc
Note: if you would rather relocate the todo.markdown, for instance to your Dropbox folder, then use the -f option.
 
 echo "alias t='$HOME/ii.do -f $HOME/Dropbox/todo.markdown'" >> ~/.bashrc

Using ii.do

Now the fun begins! Let’s start with the most basic.
 
 t -h

    Version: 0.6.1

    Usage: ii.do [-f todo_file.markdown] [-T topic_number] [options]

    Options :
     -e          Open TODO file using $EDITOR
     -n          Count number of pending tasks. Can be filtered using -x, -X etc.
     -X          Filter to show only pending tasks
     -x          Filter to show only completed tasks
     -i          Filter to show only important tasks
     -t          Filter to show only topics with topic_number
     -C          Don't colorize output (useful for piping)
     -H          HTMLize the output
     -S "$PS1"   Will return modified PS1 prompt to contain pending task count
     -h          Show this help screen

    By default, we expect a ~/todo.markdown to be in your $HOME if not overridden 
    by the -f option. Refer to http://github.com/geekaholic/ii.do for examples of 
    creating this file.

To edit a file using vi or $EDITOR
 
 t -e

Using markdown syntax to maintain todo.markdown is simple. You start out with a main heading called a topic.
 
  # Weekly Activities

or using the alternate style
 
Weekly Activities
=================

Next you start your list of tasks as a bullet * list
 
* Come up with a BIG idea
* Implement BIG idea and be awesome

You could further break up your topic into subtopics as follows:
 
# Weekly Activities

## Entertainment

* Watch a Movie
* Go bowling

Exercise
---------

* Go to gym at least 3 days a week
* Play some wii sports

As you might have guessed, the --- is the alternate form for a sub-level topic. This way you can have multiple top level topics followed by sub-level topics, with tasks at each level.

Now that we’ve got the data entry part sorted, let’s see how we can query the task list.
 t
ii.do output

Will show all your tasks using terminal colors.

To filter tasks to show only pending or only completed ones:
 
 t -X
 t -x

To filter by topic, so that it only shows tasks belonging to one topic including its sub topics:
 
t -t

1: # Weekly Activities
2: ## Entertainment
3: ## Exercise
4: # Home Work

t -T 1

The above will show topic 1 (Weekly Activities) along with its sub topics - that is, everything up to topic 4 (Home Work)

To count the number of pending and completed tasks
 
 t -X -n
 t -x -n

To update the task to mark it as complete, place an x in front of the task
 
* x Take out the trash

To mark a task as important, place an ! mark in front of it
 
* ! Go to gym at least 3 days a week

To mark a task with a high priority, place the priority number in front of the task
 
* (1) Finish history essay

Finally, ii.do has two options that customize the output. The first is to turn off color, which is handy when you want to pipe the output of ii.do into other unix commands.
 
t -C | grep '^*'

The other option is to export the todo list as html
 
t -H > ~/todo.html
ii.do html output

Other Uses

Besides tracking my todo list on a daily basis, I’ve recently found another use for ii.do - tracking my bookmarks. I know, you’re probably thinking of delicious or firefox/chrome bookmark syncing, but for me those solutions just don’t cut it. For one, I use about 3 browsers and finding old bookmarks can be a real pain. So now I just use an alias with a custom bookmarks.mdown
alias bm="$HOME/ii.do/ii.do -f $HOME/Dropbox/bookmarks.mdown"
Another use was to keep track of lecture topics by marking them off as I taught them over a period of two months. I also use it to keep track of some interesting quotes I come across, just for inspiration.


See also


Watch a talk I gave to introduce ii.do at RefreshColombo

Saturday, October 22, 2011

Remembering Steve Jobs

The passing of Steve Jobs came as an instant shock that morning as I was reading the news on my phone. While it wasn't as much of a surprise as when MJ passed away, mainly because I had seen that one photo of Steve in a black gown coming back from the hospital, something inside me felt empty. While many flooded social media and TV with messages of condolence and looked back at his achievements, I just watched, not sure how I should express the loss.

So about a week later, when I was asked if I could do a talk on Steve Jobs for Refresh Colombo, I immediately and almost impulsively said YES! But I still wasn't sure what I wanted to say. One thing I did know was that I didn't want to recap his life or accomplishments as if I knew the guy.

Soon after accepting the talk, the next thing I almost instantly realized was that I needed to get Chanux in as a co-presenter, not because we've recorded so many episodes of a podcast together but due to his reaction to Steve's death, which surprised me even more than Steve's death itself.

Why the majority of people reacted the way they did would probably take a book to investigate rather than a simple blog post, but I suspect it's complicated. So instead I asked myself: what is it about Steve Jobs that I'm mostly going to miss? The keyword here being mostly, I realized I was going to miss his persona, his insight, his principles and his approach to doing things. And so that is what I decided the talk should be about - what made Steve great, or rather, insanely great!

Chanux and I started doing brain dumps of Steve quotes which captured his philosophy, just off the top of our heads. The way I saw it, if we couldn't really remember a particular quote and had to research it on the Internet, then it hadn't really had much of an impact on us personally. And so, except for looking for some great images to suit the slides, we didn't go looking for his quotes. Consequently this probably means we might not have got the quotes verbatim.

What follows are the presentation slides we delivered last week. By highlighting these tidbits, we hope to inspire you to think differently about what you are currently doing, before it is too late. Because our time on this planet is quite limited, and even if we believe we're coming back, it doesn't matter when it takes a new form factor with a completely new UI and Operating System!

Saturday, September 03, 2011

Getting Down With Markdown

Recently I've been looking for an alternative to docbook, which I've used for most of my tutorial handouts and internal developer documentation at thinkCube. But the more I used docbook, the more I wanted a simpler solution which didn't require me to make sure my XML was in order.


Naturally, at first I thought I'd try LaTeX, since it has a pretty good rap with geeks and has even surpassed usability expectations set forth by some of the mainstream word processors :). What I loved about LaTeX was that you could concentrate on the content first and formatting later. Its legendary ability to output desktop publishing quality documents and convert to a variety of formats such as html, pdf or odt was a killer.


Just as I was about to dive into LaTeX, Chanux suggested Markdown as an alternative. Hmm, Markdown, I pondered... I even liked the sound of it. It turns out Markdown is even better! You could think of it as a simplified wiki syntax, but a better description would be to call it a WYSIWYG wiki syntax.


I've always endorsed the KISS philosophy. There is nothing simpler or more satisfying than writing a text file using vim and tracking its progress via git. After briefly going through the syntax, I realized this was exactly what I needed. I also realized that I had already used Markdown without actually thinking about it, as part of using github for a pet project. Everything about Markdown was all good, and the whole controversy around the names of Markdown's html compilers was exactly the kind of celebrity gossip it needed to grab attention!


It was around this time that I was due to create a note for a tutorial that Dr. Ajantha from UCSC and I were to deliver at the ICTer workshop. By now, I had decided on Markdown with upskirt (yes, this is one of the controversial names) to create the notes, but what about the slides? Could I use Markdown for that as well? After a little looking around, I found a wonderful system called Landslide which enabled me to compile Markdown syntax into a beautiful html5 slide show presentation. After a little playing around, I managed to build the slides as well as the note from a single Markdown source! How cool is that? I will write a separate post soon on the HOWTO details, but for now enjoy the slides, if that's your cup of tea. My Markdown adventures don't end there. This post too was written using Markdown and converted to html using Octopress.
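
In case you want to try the same workflow before the HOWTO post, the slide-building step is roughly this; hedging a little, since the file name is a placeholder and you should check landslide --help for the exact options in your version:

$ pip install landslide
$ landslide tutorial.md    # should write presentation.html from the markdown source by default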


Thursday, August 11, 2011

There's something about Lion!


So it's been a little over a month since I switched to MacOSX Lion as my primary desktop, after running Ubuntu and before that Gentoo on this Macbook. So why did I finally switch? Well, I'm sure it comes as no surprise to existing Mac owners, who mostly run OSX anyway.

I'm different! I've owned two Macs to date: first a MacMini G4 (PPC) since 2006, and now a Macbook Aluminum since 2009, both of which primarily ran some flavor of Linux. Sure, I had OSX lying around in another partition, but I'd only boot into it once in a while, just to update it or check out an interesting app or two.

Linux was what I used, not because OSX was bad (like Windows is), but because I was more comfortable with it, it was more flexible, had more innovation happening (pretty much every 6 months) and, above all, it was fun to use. Part of that fun was really the do-it-yourself attitude Linux has had for years, though IMHO it is somewhat fading away with distros such as Ubuntu. It's also no secret that I like the OSX interface and have tweaked my desktop in the past to resemble it somewhat. At the time, what made me buy a Mac was not the OSX interface. I really bought the Mac for its beautiful hardware design and higher build quality, despite Linux not being treated as a first class citizen on it.

After years of owning other laptops, I was fed up! I recall my first ever laptop, a Sony Vaio back in 2000, which only lasted 3 months before the disk died. Coupled with my project manager's ignorance, it never got a working hard drive again! Years later I bought an IBM Thinkpad which worked fine, except that the plastic frame around the LCD started to crack one day as I opened or closed the lid, and over time the crack grew to the point where it was crippled as a portable device. My next laptop, an HP Pavilion, developed a random reboot feature (especially when compiling or transcoding) which I could never reproduce in order to get a replacement, and lastly there was the Acer, which had far too many problems to mention before it too died. Seriously people, what's a good laptop that's reliable? Is Dell reliable?

When I first saw Lion being previewed at one of the Apple events, I brushed it off as the same old OSX with an iPad-like icon interface, which they now call Launchpad. It wasn't till the recent WWDC event, when they showed off the full deal, that I got an urge to install it. And so I pondered, "How do I get this beast on here?". My Mac partition was small and had less than 2GB free. I hardly had room on my Linux partition either to get away with a resize. Besides, I knew Lion would not install with Linux lying around, as the installer wasn't smart enough to deal with it. I decided it was time to delete the Linux partition!

Now I'd like to say it was a sad move which I pondered for days like when you have to move to a new place, leaving your friends behind. It really wasn't! I don't miss leaving Ubuntu. Not a lot at least...

And I ask myself, why am I not missing Ubuntu as much as I missed Gentoo, or Debian before that? IMHO, Ubuntu isn't fun anymore. It was even somewhat frustrating to use with the new Unity interface being the default, and when I tried to install Gnome 3 over it, that didn't go too well. It's not just the GUI; even in the command line things aren't that much fun. It's just too easy! And you mostly don't need it. Don't get me wrong - this is all good, and Ubuntu is doing fantastic work in getting Linux to the mainstream. It's just not as fun as, say, Gentoo or even Fedora. But the reason I'm switching isn't because Ubuntu is not fun or because I'm frustrated with Ubuntu. I'm really not! If that was the case, I would have switched to another distro like Mint or Debian.

Lion had a few features which I thought were neat and fun to explore. One such feature I really like is full screen apps, each on its own desktop, being a built-in feature of the core OS. This is a usage pattern I was quite used to when working with multiple desktops on Linux: I'd always open several apps maximized, move them to their own desktops and switch between them using Ctrl + arrow keys. But it's better on the Mac...


  • You're not limited by a static number of multiple desktops. On Linux I sometimes run out of desktops

  • It's an app feature so the desktop is automatically created and destroyed as you click the new full screen button found on the titlebar

  • Apps are in true full screen (no titlebar or menubar)

  • ..and my favorite, it supports a 3 finger swipe in addition to the keyboard shortcut. It's really fun swiping between multiple desktops



Gnome 3 looks promising in this respect, as it too has the concept of dynamic desktops. Unfortunately it resets your desktops to always having just one, and you have to create them all over again every time, which is more annoying than having a static number of desktops.

The other killer feature I like in Lion is support for multiple versions of a document. When Apple introduced "Time Machine" back in the day, the UI looked cool, but it was not as practically usable as I had imagined, since you needed an external hard drive in order to do continuous backups. Unlike what many speculated to be the adoption of ZFS and its online snapshotting capabilities, it turned out to be a far less elegant method under the hood. I doubt it's as elegant as ZFS even with this feature, because the underlying filesystem is still HFS+, but it works quite well in a practical manner for apps that make use of this new API (Apple's own apps at the moment). What's really cool is that the same Time Machine-like UI is used to browse the different versions of a document, where you can even copy & paste objects between versions.

The automatic save and resume of an application's state between reboots is another interesting feature worth studying. Again, it only works for apps that use the special APIs, but essentially the app is hibernated and resumed as opposed to the whole OS. As a result it's a lot faster. The ability to access documents by file type, regardless of where they are stored in the filesystem, is another good usability feature which I've always wanted the Linux desktop to have. In a world where hard drives are large and there is too much clutter, filesystem organization is really a bottleneck. I don't think Lion nailed it either, but it's a good start. Vista tried and failed with WinFS. The approach of Google, Beagle (now Tracker) and Spotlight of giving you a search engine doesn't quite scale in my opinion. You'd think search would work, especially from Google, but the more I think about it, the more it seems to me that we think Google is awesome because it finds the information we're looking for and gives us an answer - not necessarily the needle in the haystack. For instance, I really find Gmail frustrating for finding a mail which I only vaguely remember. Desktop search breaks down, as opposed to web search, because it is a needle-in-a-haystack problem. But I digress!

There seem to be tons of other small technological as well as usability features of Lion and OSX which are pleasant to have around and interesting to study. So to summarize why I switched: it's not one single thing but many things. I'm looking forward to the MacOSX command line, and I've already started exploring it with the help of O'Reilly's book "Mac OS X for Unix Geeks". It was only recently, after arriving at OSX, that I really came to appreciate the Bonjour protocol and its implementation on Linux via Avahi. Most Linux users don't know that they can ping their neighbour's machine without DNS or an IP address, merely by using the machine's hostname with .local appended to it. When did you last drop into a command line on Ubuntu and try running avahi-browse -a or avahi-discover?
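
If you haven't, it really is just this (the hostname being whatever your neighbour's machine calls itself):

$ ping joes-computer.local
$ avahi-browse -a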

Trying to install LAMP on MacOSX, I realized how frustratingly fun it was. It was like going back to the Redhat 9 days of manually enabling apache modules. There is also MacPorts, which is akin to Gentoo's portage, where you download and compile apps, which is fun!

Having said all this, there are so many more little things that I like (such as being able to right click on a word and look it up, or have it read me the text via the excellent TTS, which can also be done on the command line using the say command) and some things that are annoying, like Finder (the file browser) not supporting tabs, cut & paste of files, or the ability to delete without first sending to the trash. And I hear printer configuration is also non-intuitive if you're coming from Windows/Linux, despite it using CUPS. I also find it using a bit more memory than usual on some apps, and as a result my 2GB is almost fully used, making it too slow for running Virtualbox.

All in all, I'm pleased with the move and look forward to learning more of its underlying intricacies and BSD origins in days to come. There sure is something about Lion!