tag:blogger.com,1999:blog-89479642024-03-14T02:23:21.758-05:00Geek with an attitudeHello my name is Bud and I'm a geek-a-holic.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.comBlogger150125tag:blogger.com,1999:blog-8947964.post-58007382418936951612020-12-13T21:13:00.008-06:002020-12-20T01:53:50.732-06:00 DIY Docker on Apple silicon M1<div class="separator"><div class="separator" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="281" data-original-width="500" src="https://1.bp.blogspot.com/-VVMY436xVtQ/X9aQFwHUTTI/AAAAAAAAUZw/rHeLkAWALVwIVIj82MshhUflPl9_9SKgwCLcBGAsYHQ/s320/9e9.gif" width="320" /></div></div><p>When Apple announced the transition to Apple's own ARM-based silicon, I was ecstatic! I've always enjoyed tinkering with ARM-based single-board computers such as the Raspberry Pi 4 and Pinebook Pro. But they always had sub-par performance, and here was Apple trying to transition their entire lineup!</p>
<p>Even though we all knew the performance should be good, based on how recent iPad Pros scored, I don't think many expected the M1 chip, Apple's first iteration, to outright beat Intel's highest-end CPUs while redefining "all-day battery life"!</p>
<p>So naturally, I found myself clicking buy on day 1, something I rarely do without first doing a ton of research. I ended up ordering the base MacBook Air, but bumped the RAM up to 16GB in order to better run Docker.</p>
<p></p>
<h2>Disappointment: Docker support not ready</h2>
<p>My machine arrived quickly, despite being shipped directly from China as a result of my memory upgrade. The first step was to set up the environment just the way I like using <a href="http://github.com/geekaholic/install-buddy" target="_blank">InstallBuddy</a>, a script I wrote a while back to help with this sort of thing. And there I hit my first snag, as I discovered that <a href="https://brew.sh" target="_blank">homebrew</a> didn't yet have native support. After jumping through a few hoops to get homebrew set up via Rosetta 2, and my go-to packages installed, I was on my way to downloading <a href="https://docs.docker.com/docker-for-mac/install/" target="_blank">Docker Desktop for Mac</a>.</p>
<p>I naively expected Docker to "just work", as it was a touted feature: briefly mentioned during Apple's transition announcement and later followed up in more detail during the <a href="https://appleinsider.com/articles/20/06/25/apples-federighi-and-joswiak-discuss-apple-silicon-ios-14-big-sur-and-more" target="_blank">AppleInsider interview</a>. Unfortunately, I soon came across an official <a href="https://www.docker.com/blog/apple-silicon-m1-chips-and-docker/" target="_blank">docker blog post</a> explaining that Docker Desktop wasn't quite ready yet but was being worked on.</p>
<h2>Hope: Apple support for Linux virtualization</h2>
<p>Impatient, I searched on and came across <a href="https://developer.apple.com/documentation/virtualization" target="_blank">Apple's developer documentation</a> for supporting Linux through virtualization. This was promising, and I knew it was only a matter of time before someone built an Objective-C or Swift wrapper around it.</p>
<p>A few days later, I came across this well-written article: <a href="https://finestructure.co/blog/2020/11/27/running-docker-on-apple-silicon-m1" target="_blank">Running Docker on Apple Silicon M1</a>, which has a nice step-by-step tutorial for getting it set up using a project called <a href="https://github.com/evansm7/vftool" target="_blank">vftool</a> and the <a href="https://cdimage.ubuntu.com/focal/daily-live/current/focal-desktop-arm64.iso" target="_blank">ARM64 version of the Ubuntu 20.04 image</a>. Following those steps, I was able to get docker running, albeit off a read-only live CD with a small <a href="https://en.wikipedia.org/wiki/Copy-on-write" target="_blank">COW</a> filesystem. The <a href="https://finestructure.co/blog/2020/11/27/running-docker-on-apple-silicon-m1-follow-up">second follow-up article</a> shows how to connect to the docker daemon from the docker cli running on macOS, giving something closer to what Docker Desktop for Mac has to offer.</p>
<h2>Contribution: Getting full read/write support with data persistence</h2>
<p>One of the downsides to running off the live CD was that the default COW filesystem could quickly run out of disk space, as it was only about 1GB in size. Since the COW filesystem is merely an overlay on top of a read-only squashfs filesystem, uninstalling packages won't help (in fact, it will eat into the COW space). Another downside was that in order to get Docker working properly, the vfs storage driver needed to be configured, which is known to be less performant.</p>
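<p>To make that 1GB limit concrete, here's a quick sketch of how you might watch the COW space disappear from inside the live session (an assumption on my part: Ubuntu's live session mounts the writable overlay at /, with the squashfs exposed under /rofs, so the exact layout may differ on your release):</p>

```shell
# "/" is the small writable COW overlay in the live session;
# its "Avail" column shrinks every time you install a package.
df -h /
```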
<p>Therefore, I wanted to see if there was a way to install Ubuntu onto a read-write filesystem. Below are the steps I followed in order to achieve that. Please first read the above-linked post to get vftool set up and the kernel and initrd images extracted from the live CD ISO from within macOS.</p>
<p>Below is how my directories are structured, so you will need to adapt to your situation.</p>
<div><img border="0" data-original-height="276" data-original-width="656" height="161" src="https://1.bp.blogspot.com/-POEwLh6n0wU/X9a2Z4sOXBI/AAAAAAAAUaM/s-L68_i-1CkoUpKJ3mZU1qKEq-4fVbWNQCLcBGAsYHQ/w382-h161/tree.png" width="382" /></div>
<h3>Step 1: Create a loopback disk image</h3>
<p>After testing a few disk sizes, I settled on a 10GB loopback disk image.</p>
<code>
dd if=/dev/zero of=disk.img bs=1m count=10240
</code>
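<p>As an aside, since dd writes 10GB of actual zeros, this takes a while. A sparse file is near-instant and only consumes real disk space as the guest writes to it. This is an assumption on my part that a sparse backing file works fine with this setup, so fall back to the dd command above if the VM misbehaves:</p>

```shell
# count=0 with a large seek extends the file to 10 GiB without
# writing any data, producing a sparse image almost instantly.
dd if=/dev/zero of=disk.img bs=1 count=0 seek=10737418240
```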
<h3>Step 2: Create a temporary VM to install Ubuntu</h3>
<p>The next step is to boot into a VM, similar to what the linked article explains. We use this VM to manually install Ubuntu onto the loopback disk.img.</p>
<code>
./vftool -k ./vmlinuz -i ./initrd-livecd -c ./focal-desktop-arm64.iso -d disk.img -m 4096 -a "console=hvc0"<br />
</code>
<p>On another terminal, connect to the VM.</p>
<code>
screen /dev/ttys002
</code>
<h3>Step 3: Partition and mount the loopback disk</h3>
<p>The disk should appear as /dev/vda.</p>
<code>
cfdisk /dev/vda<br /><br />
Partition type: gpt<br />
/dev/vda1 Linux filesystem
</code>
<p>Save and exit. Then format the partition as ext4</p>
<code>
mkfs.ext4 /dev/vda1
</code>
<h3>Step 4: Mount new partition and copy files to it</h3>
<p>Mount the partition and start copying files from /rofs, which is where Ubuntu mounts the read-only squashfs file system.</p>
<code>
mount /dev/vda1 /mnt<br />
cd /rofs<br />cp -axv . /mnt/
</code>
<h3>Step 5: Chroot onto loopback filesystem to perform post install tasks</h3>
<code>
mount -t proc none /mnt/proc<br />
mount -t sysfs sysfs /mnt/sys<br />
mount -o bind /dev /mnt/dev<br />
chroot /mnt
</code>
<p>Next, let's set up /etc/fstab so the system knows how to mount /. Replace the placeholder comment with an entry for the root partition.</p>
<code>
editor /etc/fstab<br /><br />
# Replace "# UNCONFIGURED FSTAB FOR BASE SYSTEM" with:<br />
/dev/vda1 / ext4 defaults 0 1
</code>
<p>Let's also configure the timezone (I picked UTC).</p>
<code>
dpkg-reconfigure tzdata
</code>
<p>Let's also create a user for us to log in with and be able to sudo.</p>
<code>
adduser bud<br />
addgroup --system admin<br />
adduser bud admin
</code>
<h3>Step 6: Making a new initrd</h3>
<p>The initrd we extracted from the live CD is not suitable for booting off the loopback disk.img, so we need to generate a new one. Still within the chrooted environment, run the following:</p>
<code>
mkinitramfs -c gzip -o /boot/initrd.img-$(uname -r)
</code>
<p>Next, we need to copy the newly generated initrd.img back onto our macOS filesystem. I did this by enabling SSH (Remote Login) within macOS.</p>
<div>
<img border="0" data-original-height="860" data-original-width="1318" height="248" src="https://1.bp.blogspot.com/-eXbnGkYXdwI/X9bAkjrg_EI/AAAAAAAAUaY/luYOyajLFAwHjNKTPTj1c8HxOXaekhJhgCLcBGAsYHQ/w380-h248/ssh-sharing.png" width="380" />
</div>
<code>
scp /boot/initrd.img-$(uname -r) \<br />
bud@192.168.86.28:~/Downloads<br />
</code>
<p>Then copied the file from Downloads to where the other files were.</p>
<h3>Step 7: Exit and unmount the chrooted environment</h3>
<p>
<code>
exit<br />
umount /mnt/sys<br />
umount /mnt/proc<br />
umount /mnt/dev<br />
umount /mnt
</code>
</p><p>Then shut down the VM properly by issuing a poweroff.</p>
<code>
sudo poweroff
</code>
<h3>Step 8: Ready to launch the new persistent VM</h3>
<p>We can now get rid of the CD-ROM image and boot directly off disk.img using the same kernel and the new initrd.img-5.4.0-56-generic image.</p>
<code>
./vftool -k ./vmlinuz -i ./initrd.img-5.4.0-56-generic -d disk.img -m 4096 -a "console=hvc0 root=/dev/vda1"
</code>
<p>From another terminal...</p>
<code>
screen /dev/ttys002
</code>
<p>We should now be able to log in using the account created within the chrooted environment.</p>
<h3>Step 9: Setting up SSH</h3>
<p>It's far easier to work within an SSH session than within the screen session, and SSH can also be used to let the docker cli running on macOS connect to the daemon running inside the VM (more on this later).</p>
<code>
sudo apt install openssh-server
</code>
<p>Obtain the IP address within the VM and try connecting to it.</p>
<code>
ip addr show dev enp0s1 | grep 'inet '<br />
ssh bud@192.168.64.13
</code>
<p>Copy your macOS account's SSH key to the remote VM in order to enable password-less login, which is also needed by docker cli when connecting from the Mac.</p>
<code>
ssh-copy-id bud@192.168.64.13
</code>
<h3>Step 10: Setting up Docker inside VM</h3>
<p>These steps are copied from the linked article, which in turn adapts Docker's official installation guide.</p>
<code>
sudo apt-get install apt-transport-https ca-certificates \<br />
curl gnupg-agent software-properties-common<br /><br />
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -<br /><br />
sudo add-apt-repository \<br />
"deb [arch=arm64] https://download.docker.com/linux/ubuntu \<br />
$(lsb_release -cs) \<br />
stable"<br /><br />
sudo apt-get update<br />
sudo apt-get install docker-ce docker-ce-cli containerd.io<br />
</code>
<p>To make it easy to run docker commands as my own user, I added myself to the docker group.</p>
<code>
sudo usermod -aG docker ${USER}<br />
su - ${USER}
</code>
<p>Now log out and log back in for the new group to take effect. Docker should now be running properly.</p>
<code>
docker run --rm hello-world
</code>
<h3>Step 11: Setting up Docker cli to connect to VM</h3>
<p>For a near Docker-for-Mac-like experience, we need to install the docker cli on macOS. Unfortunately, at the time of writing, this needs to be installed via Rosetta 2. I used homebrew to accomplish this, but you could also just <a href="https://docs.docker.com/engine/install/binaries/" target="_blank">download the binary directly</a> from Docker Inc.</p>
<code>
arch -x86_64 brew install docker
</code>
<p>We then create a docker context and switch to it to seamlessly connect to the docker daemon running inside the VM.</p>
<code>
docker context create myvm --docker "host=ssh://bud@192.168.64.13"<br />
docker context use myvm<br />
docker run --rm hello-world
</code>
<p>You should now be able to issue docker commands from within macOS.</p>
<h3>Step 12: Uninstall GNOME desktop (optional)</h3>
<p>Finally, I got rid of the GNOME desktop to reclaim more space, as everything is running via the terminal. When trying this, I ran into an issue with the flash-kernel package, so you might want to uninstall it first.</p>
<code>
sudo dpkg -r flash-kernel<br /><br />
sudo apt purge adwaita-icon-theme gedit-common gir1.2-gdm-1.0 \<br />gir1.2-gnomebluetooth-1.0 gir1.2-gnomedesktop-3.0 gir1.2-goa-1.0 \<br />gnome-accessibility-themes gnome-bluetooth gnome-calculator gnome-calendar \<br />gnome-characters gnome-control-center gnome-control-center-data \<br />gnome-control-center-faces gnome-desktop3-data \<br />gnome-font-viewer gnome-getting-started-docs gnome-getting-started-docs-ru \<br />gnome-initial-setup gnome-keyring gnome-keyring-pkcs11 gnome-logs \<br />gnome-mahjongg gnome-menus gnome-mines gnome-online-accounts \<br />gnome-power-manager gnome-screenshot gnome-session-bin gnome-session-canberra \<br />gnome-session-common gnome-settings-daemon gnome-settings-daemon-common \<br />gnome-shell gnome-shell-common gnome-shell-extension-appindicator \<br />gnome-shell-extension-desktop-icons gnome-shell-extension-ubuntu-dock \<br />gnome-startup-applications gnome-sudoku gnome-system-monitor gnome-terminal \<br />gnome-terminal-data gnome-themes-extra gnome-themes-extra-data gnome-todo \<br />gnome-todo-common gnome-user-docs gnome-user-docs-ru gnome-video-effects \<br />language-pack-gnome-en language-pack-gnome-en-base language-pack-gnome-ru \<br />language-pack-gnome-ru-base language-selector-gnome libgail18 libgail18 \<br />libgail-common libgail-common libgnome-autoar-0-0 libgnome-bluetooth13 \<br />libgnome-desktop-3-19 libgnome-games-support-1-3 libgnome-games-support-common \<br />libgnomekbd8 libgnomekbd-common libgnome-menu-3-0 libgnome-todo libgoa-1.0-0b \<br />libgoa-1.0-common libpam-gnome-keyring libsoup-gnome2.4-1 libsoup-gnome2.4-1 \<br />nautilus-extension-gnome-terminal pinentry-gnome3 yaru-theme-gnome-shell<br />
sudo apt autopurge
</code>
<h2>Conclusion</h2>
<p>This was a fun exercise, and if you're impatient waiting for official support from Docker Inc., or just want to try it for the fun of it, go ahead. Maybe you can extend the setup and tackle other problems I've yet to encounter. Otherwise, it might be best to just wait a bit longer, as official support will be coming sooner rather than later.</p>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com Lakefront Trail, Chicago, IL 60657, USA

Alternate VIM Reloaded
<p>A little over 6 years ago, or just 2 posts below :), I wrote about how I use VIM from the command line to invoke a GUI version of VIM, where each file would appear in a separate tab. I've stuck with that setup to this day, and VIM is still my main IDE whether I'm working on Linux or macOS.</p>
<p>Recently I bought a pine64-based <a href="https://www.pine64.org/pinebook-pro/" target="_blank">PINEBOOK Pro</a> laptop, which is powered by a low-cost ARM64 <a href="https://en.wikipedia.org/wiki/Single-board_computer" target="_blank">Single Board Computer (SBC)</a>. This came pre-installed with the <a href="https://manjaro.org/download/" target="_blank">Manjaro</a> KDE edition of Linux. Though this laptop is no slouch (I find myself using it more thanks to its 8+ hours of battery life), I wanted to keep it lean without loading too many GTK apps.</p>
<p>Since KDE is built on Qt, I decided to try neovim-qt, which is powered by Neovim, a somewhat newer rewrite of VIM that is pretty slick and fast. This is how I got neovim-qt working in a manner similar to vim / gvim.</p>
<p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-wt68jATduVM/Xy3qa7WgMjI/AAAAAAAAUMw/FGhdT0Ul9_QY53PBuWNcrqjPM1XfepkqACLcBGAsYHQ/s1095/neovim-pine64.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="974" data-original-width="1095" src="https://1.bp.blogspot.com/-wt68jATduVM/Xy3qa7WgMjI/AAAAAAAAUMw/FGhdT0Ul9_QY53PBuWNcrqjPM1XfepkqACLcBGAsYHQ/s640/neovim-pine64.png" width="640" /></a></div><br /><p></p>
<h2>Install packages</h2>
<p>First, install neovim-qt. On Manjaro, this would involve pacman.</p>
<code>
sudo pacman -Sy neovim-qt
</code>
<p>Next, the secret sauce ;): a project called <a href="https://github.com/mhinz/neovim-remote" target="_blank">neovim-remote</a>. Fortunately, it can be installed quite easily via pip. But first, I had to install pip.</p>
<code>
sudo pacman -Sy python-pip<br />
pip3 install neovim-remote
</code>
<p>This exposes a command called <code>nvr</code>, but because I use mvi with my VIM setup, I just aliased mvi to nvr so it behaves much like what I'm used to. I also had to add ~/.local/bin to my $PATH variable.</p>
<code>
echo "alias mvi='nvr --remote-tab'" >> ~/.bashrc<br />
echo 'export PATH="~/.local/bin:$PATH"' >> ~/.bashrc
</code>
<div>
<p>Finally, we need to start a neovim server, so I added a shell script to do that as well.</p>
<code>
echo "NVIM_LISTEN_ADDRESS=/tmp/nvimsocket nvim-qt" > ~/bin/start-nvim.sh<br />
chmod +x ~/bin/start-nvim.sh
</code>
<h2>Using it</h2>
<p>Usage is simple enough. First I'd launch neovim-qt as a server using the shell script.</p>
<code>
~/bin/start-nvim.sh
</code>
<p>Then open files using the mvi alias.</p>
<code>
mvi /etc/passwd /etc/hosts
</code>
<p>That's it! Check out my <a href="https://github.com/geekaholic/mydotfiles/blob/master/.nvimrc" target="_blank">~/.nvimrc</a> to see how I've customized it.</p>
</div>

Re-installations made easy with Install Buddy<div>
<p>
I love Linux! For those who know me, that's somewhat of an understatement.
That's because, over the years, I've been known to run almost every Linux
distro you can think of, on devices you probably didn't think of. I actually
enjoy the process of installing and configuring the system because every
distro and device/platform has its quirks and unique challenges to get it
working.
</p>
<p>
Over the years, I've compiled a list of useful software packages that I've
grown to like and can't live without. However, I didn't much enjoy manually
installing the same list of packages every time. I often found myself
comparing the newly installed system with the current one to figure out what
needed to be installed and configured. I got tired of this few years back
and so that's why I created Install Buddy!
</p>
<h2>Think bootstrapping, not configuration management</h2>
<p>
Configuration management systems are great for managing clusters of
machines, each converging on a system specification. Most have complex
features to address complex deployment scenarios - overkill for my hobby
requirements. In contrast, I wanted a simple tool to help me bootstrap a
freshly installed system and bring it up to speed.
</p>
<script src="https://gist.github.com/geekaholic/b86d9fb65f934d46026481b260e1f422.js"></script>
<p>
My design goals were to have a simple YAML config file that would work
across many Linux (and even other UNIX-like) systems, and to make use of the
underlying package management system to take care of the installation.
</p>
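<p>To give a flavor of the idea, here's a hypothetical sketch of what such a config could look like. Note that this is illustrative only: the keys and package names below are made up by me, and the real schema is whatever the embedded gist and the project's README define.</p>

```yaml
# Illustrative only -- not necessarily install-buddy's actual schema
packages:
  common: [git, vim, tmux, htop]     # installed everywhere
  darwin: [coreutils, watch]         # via homebrew on macOS
  debian: [build-essential]          # via apt on Debian/Ubuntu
```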
<h2>How I use Install Buddy</h2>
<p>
To give a real-world example, I recently bought a new Raspberry Pi 4 and
booted it with a fresh install of
<a href="https://www.raspberrypi.org/downloads/">Raspberry Pi OS</a>. These
are the steps I'd follow in order to bootstrap it with my favorite packages.
</p>
<div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-JYoVRz0agO8/XykJ39luwRI/AAAAAAAAUME/V784aKWBZ3UFjxRnBb7xoiWJTfydj4WQwCLcBGAsYHQ/s1200/2020-08-04-004426_2560x1440_scrot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="809" data-original-width="1200" src="https://1.bp.blogspot.com/-JYoVRz0agO8/XykJ39luwRI/AAAAAAAAUME/V784aKWBZ3UFjxRnBb7xoiWJTfydj4WQwCLcBGAsYHQ/s640/2020-08-04-004426_2560x1440_scrot.png" width="640" /></a></div>
<div>
Step 1 - git clone or download
<a href="https://github.com/geekaholic/install-buddy">Install Buddy</a> and
<a href="https://github.com/geekaholic/mydotfiles/blob/master/my-packages.yml">my-packages.yml</a>
from GitHub<br />
Step 2 - Install ruby runtime<br />
Step 3 - Install packages using <em>Install Buddy</em><br />
Step 4 (optional) - Copy other config files such as
<em>.bashrc, .vimrc</em>, etc from
<a href="https://github.com/geekaholic/mydotfiles">mydotfiles repo</a> to my
home directory.
</div>
<p>
And that's it! So far I've used this to set up several Macs running OSX (via
homebrew), laptops running Linux (Fedora, Debian/Ubuntu, Arch/Manjaro,
Solus, etc.), Raspberry Pis, an iPhone running iSH (via x86 emulation using
Alpine), and inside of Docker. In doing so, I've been able to fix numerous
bugs and add a few features along the way.
</p>
<p>
Hope I've piqued your interest in trying <a href="https://github.com/geekaholic/install-buddy">Install Buddy</a> out. If you do,
consider leaving a ★ on GitHub and a comment here with feedback, or even a <a href="https://github.com/geekaholic/install-buddy/fork">pull request</a>.
</p>
</div>
VIM Reloaded<div class="entry-content">
It’s been a really, really long time since I
last blogged, so bear with me while I try to get a little bit of my mojo
back :) I’ve been busy changing countries, cities, apartments, jobs –
the usual stuff life throws at you. But the one thing I’ve tried to
change and keep coming back to is my beloved editor – VIM. This post is
on making VIM a little better, like that <a href="http://www.sublimetext.com/">other one</a>
developers tend to use these days. Don’t get me wrong, I’m not
suggesting VIM is better (or worse). But after having used VIM for so
long, I don’t want to just give up on it. If you’re in the same camp
then read on.<br />
<h2>
Get me some of that WYSIWYG</h2>
I know what you’re thinking – VIM is awesome because it’s all
terminal. And there is no reason to give that up! I haven’t, especially
for those quick edits. However, when it comes to coding for any length of
time, my go-to editor has been one where VIM is wrapped in WYSIWYG
goodness. I’m talking about the likes of gVim alternatives such as
vim-gnome (Linux Gnome), vim-gtk (Linux) or macVim (OSX). Pick one
depending on the platform. I use vim-gnome on Debian and macVIM on OSX.<br />
<blockquote>
For Debian based…</blockquote>
<pre><code>$ sudo apt-get install vim-gnome
</code></pre>
<blockquote>
For OSX visit <a href="https://code.google.com/p/macvim/">macVIM site</a></blockquote>
And so the fun begins! First I’ll show you how I got tabs working in a
helpful manner where each file you opened from the command line would
open up as a new tab within the same session window. This was easily
accomplished using a specially crafted alias which was placed in the <code>bash_profile</code> (or <code>bashrc</code>) file.<br />
<pre><code>alias gvi='mvim --servername VIM --remote-tab-silent'
</code></pre>
Here I use macVIM but one could easily substitute gvim on Linux. Now
try opening a file or two and they should open in separate tabs.<br />
<pre><code>gvi /etc/passwd /etc/hosts
gvi ~/.bash_profile
</code></pre>
Switching tabs can be performed via the mouse, but I prefer using <code>command + shift + ]</code> and <code>command + shift + [</code>
keystrokes on OSX, which work out of the box on macVIM. However, on Linux,
to get similar functionality, I had to add a special key mapping to <code>~/.gvimrc</code>. Since I wanted to keep the same feel as on OSX, I substituted the <code>alt</code> key in place of the <code>command</code> key found on Macs.<br />
<blockquote>
~/.gvimrc</blockquote>
<pre><code>" Map tab switching
map <a-> gt
map <a-> gT
</a-></a-></code></pre>
All we are doing is making use of the <code>gt</code> and <code>gT</code> commands to switch between next and previous tab by mapping it to shortcut keys.<br />
<h2>
Add some color</h2>
While the default color scheme that comes with vim is pretty
straightforward, it lacked the vibrant and cheerful syntax highlighting
found in those other editors. Not a problem though, thanks to <a href="http://www.vimninjas.com/2012/08/26/10-vim-color-schemes-you-need-to-own/">Vim Color Schemes</a>. My favorite one, which I ended up using was <a href="https://github.com/tpope/vim-vividchalk">Vividchalk</a>. Simply download the color theme file of your choice (such as vividchalk.vim) to <code>~/.vim/colors/</code> and set it via .gvimrc.<br />
<blockquote>
~/.gvimrc</blockquote>
<pre><code>" Set color scheme
syntax on
set background=dark
colorscheme vividchalk
" Optionally set font and transparency
set guifont=Monaco:h13
" set guifont=Monospace\ Bold\ 10.5
set transparency=15
</code></pre>
<h2>
Better searching</h2>
Searching is an important part of editing and so there are few things we can do to improve VIMs default search capabilities.<br />
<pre><code>" Search related
set incsearch
set hlsearch
set ignorecase
set smartcase
</code></pre>
First option <code>incsearch</code> makes search incremental so that
as you type the search keyword (via /keyword in command mode), VIM will
start highlighting and transporting you to that location. <code>hlsearch</code> is useful in that it highlights all keywords matching the search criteria. The last two, <code>ignorecase</code> and <code>smartcase</code>
together provide a smarter way to search by making any keyword typed
all in lowercase to be a case insensitive search while still retaining
case sensitivity for mixed case (e.g. /FooBar won’t find foobar but
/foobar will find both FooBar and foobar).<br />
<h2>
Few more enhancements for coders</h2>
I’ll close off with a few more enhancements that can come in handy
when writing code. First is to turn on line numbers by default and it
looks great with <a href="https://github.com/tpope/vim-vividchalk">my color scheme</a>.
I also prefer seeing a vertical guide fixed to 80 characters in order
to keep me in check as to the length of my code. Along the same line,
it’s also a good practice to catch those trailing white spaces, which can
appear ugly when viewed through a visual diff tool. Finally, having auto
completion turned on may come in handy, especially if you suffer from
short term memory like I do :)<br />
<blockquote>
~/.gvimrc</blockquote>
<pre><code>" Show line numbers
set number
" Show 80 column guide
set colorcolumn=80
" Highlight trailing spaces
match ErrorMsg '\s\+$'
" Auto completion
setlocal omnifunc=syntaxcomplete#Complete
filetype plugin on
autocmd FileType html set omnifunc=htmlcomplete#CompleteTags
autocmd FileType javascript set omnifunc=javascriptcomplete#CompleteJS
autocmd FileType css set omnifunc=csscomplete#CompleteCSS
autocmd FileType php set omnifunc=phpcomplete#CompletePHP
" Ruby autocompletion
autocmd FileType ruby,eruby let g:rubycomplete_buffer_loading = 1
autocmd FileType ruby,eruby let g:rubycomplete_classes_in_global = 1
autocmd FileType ruby,eruby let g:rubycomplete_rails = 1
</code></pre>
To activate language-specific auto completion, use the <code>Ctrl-X Ctrl-O</code>
combination while in insert mode. Besides using language-specific
auto completion, vim also supports a smart auto completion which relies
on text already found in the document being edited, which can be
activated using <code>Ctrl-N</code> after typing part of the word. It is quite useful when you want to retype a previously declared variable or method name.<br />
<img alt="VIM screenshot" src="https://farm4.staticflickr.com/3915/14456394103_f4f5b19677.jpg" /><br />
There are a ton of other tips that can make VIM so much better for coders, such as <a href="http://vim.wikia.com/wiki/Folding">folding code blocks</a> or visually <a href="https://github.com/vim-scripts/ShowMarks">showing marks made via ma, mb, etc.</a> Chances are, if you can think of a feature you’d like to have, there’s probably a plugin or built-in to make it happen!<br />
I’ll leave you with this <a href="http://www.redditmirror.cc/cache/websites/ivanidris.net_7xk05/ivanidris.net/wordpress/index.php/2009/02/03/sharpen-the-vim-saw.html">great post</a> which has few more useful tips.<br />
<br />
~Happy VIMing~<br />
</div>
Peer-to-Peer Collaborative Development Using GIT<p>Wow, how time flies when you’re having fun! I first thought of writing this post back in January, when I was on a roll with writing blog posts. But it never materialized beyond the notes I collected in preparation. Fast forward several months, with a lot more experience on what I’m about to tell you, and you have this post. The notes I’m referring to are about a development style that came about as a result of optimizing how we at <a href="http://thinkcube.com">thinkCube</a> organize and work with source code in a <a href="http://en.wikipedia.org/wiki/Revision_control">revision control system</a>. Considering the wide use of such systems, such as SVN and GIT, I thought I’d share our development experience in the hope it will help you take another look at your own development style. But before I get into it, I’d like to very briefly touch on some background on the evolution of development styles around version control systems.</p>
<h2>Thou shalt not commit, Yet!</h2>
<p>Back in the days of <a href="http://en.wikipedia.org/wiki/Concurrent_Versions_System">CVS</a>, source code lived centrally on a server called a repository. One had to earn the right to read/write to this repository in order to ensure “world order” (others had to submit a patch via <a href="http://bugzilla.org">Bugzilla</a>). While this led to a centrally managed system of collaborative software development, it also created a new software development style of “Earn your commitership”, “Commit often” and “Communicate often”. Nothing much changed when everyone moved over to <a href="http://subversion.apache.org">SVN</a>, which started out as an improved CVS. This development style didn’t go down well with Linus Torvalds for his Linux work, and so he created GIT instead (and famously <a href="http://www.youtube.com/watch?v=4XpnKHJAok8">hates CVS/SVN</a>).</p>
<h2>Pray-Pull-Push style of development</h2>
<p>We started with CVS back in 2005, and then SVN, for managing our development at thinkCube. A little while after git came along, once it was stable enough to use and usable by mere mortals, we made the switch. At first, we had some trouble wrapping our heads around git, and so we just used it like SVN, always doing <code>commit/pull/push</code> operations as if they were atomic. But after a week or two, we realized the power of git was in its ability to let you commit locally and push when you were ready to share. And so, yet another development style arose, where git acted as a collaborative whiteboard for sharing code changes. This also meant developers needed push access to the central repository, in the same manner commit access was required with SVN/CVS.</p>
<p>The unfortunate development style this results in is that developers may occasionally push unfinished work upstream just to share it with the “developer next door”. The consequence may be felt by the poor developer who spent all night working on a feature, then pulls in order to push, only to find that his code now conflicts for no apparent reason!</p>
<p>Now I know a lot of you may be using this style of development and thinking, “If our developers do that, we punish them!” and so they don’t! Good for you! But my take on this is: if the system is fundamentally broken, then it’s better to fix it than to enforce tough rules. If I may digress for a bit: “Why are <a href="http://en.wikipedia.org/wiki/Auto_rickshaw">TukTuk</a> drivers and motorcyclists so careless?” Is it likely that only careless drivers pick these vehicles, or is it more likely the vehicle made them reckless? My advice is, if the tool is broken, then fix it! (and ban TukTuks :)</p>
<p>Therefore, last year we decided to adopt our current style of development, which is as follows.</p>
<h2>Fetch-Merge-Push style of development</h2>
<p><a href="http://en.wikipedia.org/wiki/Yan_Can_Cook">If Linus can do it, then so can you!</a> We didn’t invent this stuff but we did adapt it in a manner which scales for us. The idea is simple, stop devs from committing upstream as a means of sharing changes but instead get them to share peer to peer by fetching from each other. Our git repos are setup so that everyone has read access to clone the repo but only a couple of devs (usually just one dev) can push to a given repo. That means the dev who has push access usually will not need to <code>pull</code> prior to pushing because no one else can change it. This is what git was designed to do and yet it’s probably one of the least used features among git converts.</p>
<p>Ok, so let’s get technical, shall we?</p>
<p>First off, I’m assuming you know git basics and are already using it. If not, then <a href="http://digit.lk/09_dec_git">check out my git article on Digit</a>.</p>
<h2>Sharing your repo</h2>
<p>Let’s look at how you can share your git repo with another dev, P2P style. Suppose you have a repo as follows:</p>
<blockquote><p>/home/bud/repos/awesome.repo</p></blockquote>
<p>You can easily share awesome.repo with anyone on the local network using the git-daemon command, by first cd-ing to its parent directory.</p>
<pre><code>$ cd /home/bud/repos
$ git daemon --export-all --base-path=.
</code></pre>
<p>The above will share all your git repos under the current directory as read-only for others in the network to fetch. Git daemon will run in the foreground by default and so once you’re done sharing you can just Ctrl+C it.</p>
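<p>As an aside (my own sketch, not from the original post): if exporting <em>everything</em> under a directory feels too broad, stock git-daemon can serve repos selectively. Without <code>--export-all</code>, it only serves repos containing a <code>git-daemon-export-ok</code> marker file.</p>

```shell
# Sketch: selective sharing via git-daemon's marker-file mechanism.
# Uses a temp dir as a stand-in for /home/bud/repos so it runs anywhere.
set -e
REPOS=$(mktemp -d)
git init -q "$REPOS/awesome.repo"    # repo we want to share
git init -q "$REPOS/private.repo"    # repo we want to keep private
touch "$REPOS/awesome.repo/.git/git-daemon-export-ok"
# Without --export-all, only the marked repo is served. The daemon runs
# in the foreground (Ctrl+C to stop), so it is left commented out here:
#   git daemon --base-path="$REPOS"
```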
<h2>Fetching a shared repo</h2>
<p>In order to fetch from a fellow dev, you will first need to add him/her as a remote. Chances are you cloned the project from upstream in which case you have just one default remote called origin which points to your upstream repo.</p>
<blockquote><p>To add another remote for your friend joe for example</p></blockquote>
<pre><code>$ cd /home/bud/repos/awesome.repo
$ git remote add joe git://joes-computer.local/awesome.repo
</code></pre>
<p>From within your repo, you add a remote using the git remote command. The url above uses git’s special <code>git://</code> protocol that is understood by the git-daemon, while I’m relying on <a href="http://en.wikipedia.org/wiki/Multicast_DNS">mDNS</a> to resolve joes-computer.local automatically. If your network (or OS) doesn’t support this, you can just use the IP address.</p>
<blockquote><p>Finally to fetch joe’s changes over the network to your machine issue:</p></blockquote>
<pre><code>$ git fetch joe
</code></pre>
<p>The above command should give you some feedback as to the success of the fetch operation. Remember that fetch is safe since it only “fetches” as opposed to <code>pull</code> which fetches and then tries to merge. So while you could’ve used <code>pull</code> instead of <code>fetch</code>, I wouldn’t recommend it!</p>
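<p>Here’s a runnable sketch of this fetch-then-inspect habit. The “joe” repo is simulated with a plain local path instead of <code>git://</code>, so the commands work as-is:</p>

```shell
# Build a toy "joe" repo and a "mine" repo, then fetch and inspect.
set -e
WORK=$(mktemp -d)
git init -q -b master "$WORK/joe"
git -C "$WORK/joe" -c user.email=j@example.com -c user.name=joe \
    commit -q --allow-empty -m "joe: new cool feature"
git init -q -b master "$WORK/mine"
git -C "$WORK/mine" -c user.email=b@example.com -c user.name=bud \
    commit -q --allow-empty -m "my base"
git -C "$WORK/mine" remote add joe "$WORK/joe"   # path instead of git://
git -C "$WORK/mine" fetch -q joe                 # safe: touches no local branch
# Inspect what joe has before deciding whether to merge:
git -C "$WORK/mine" log --oneline joe/master
```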
<h2>Merging and deleting</h2>
<p>Now that you have a copy of the remote changes, what you’d want to do next is to see which branches they were working on. Usually joe will tell you, hey my latest changes are on the <code>new-cool-feature</code> branch.</p>
<pre><code>$ git branch -a
* master
experimental
remotes/joe/master
remotes/joe/new-cool-feature
</code></pre>
<p>Git branch will first show your local branches (master, experimental) followed by the remote ones. At this point you should check out the remote branch you’re planning to merge and make sure everything is working.</p>
<pre><code>$ git checkout -b joes-new-cool-feature remotes/joe/new-cool-feature
</code></pre>
<p>This creates a local branch called “joes-new-cool-feature” which tracks the remote branch <code>remotes/joe/new-cool-feature</code> and switches the current HEAD to it. Once you’re happy, you can switch back to master and merge.</p>
<pre><code>$ git checkout master
$ git merge joes-new-cool-feature
</code></pre>
<p>But… if you have any merge conflicts, you will have to resolve them! If you don’t, your master will remain in a state of CONFLICT. If that sounds like additional work, then do what I do instead.</p>
<pre><code>$ git checkout master
$ git checkout -b master-merge-joes-new-cool-feature
$ git merge joes-new-cool-feature
$ git checkout master
$ git merge master-merge-joes-new-cool-feature
$ git branch -d master-merge-joes-new-cool-feature
</code></pre>
<p>Wow, that’s a handful of commands to type, you say. Trust me, it beats wasting time trying to resolve someone else’s conflict! Above, we fork master as <code>master-merge-joes-new-cool-feature</code> in anticipation of a bumpy merge. If things go right, we then merge the merged branch back into master :) The last line just deletes the temporary branch, which we no longer need.</p>
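<p>A lighter alternative to the throwaway branch, sketched below with a self-contained toy repo (this is my own addition, assuming a reasonably recent git): attempt the merge on master and back out of a conflict with <code>git merge --abort</code>.</p>

```shell
# Create a deliberate conflict, then show that --abort restores master.
set -e
R=$(mktemp -d); cd "$R"; git init -q -b master .
g() { git -c user.email=demo@example.com -c user.name=demo "$@"; }
echo base > file; git add file; g commit -qm base
git checkout -qb joes-new-cool-feature
echo theirs > file; g commit -qam theirs
git checkout -q master
echo ours > file; g commit -qam ours
# The merge conflicts, so back out; master is left exactly as before.
git merge joes-new-cool-feature >/dev/null 2>&1 || git merge --abort
git status --porcelain   # prints nothing: the tree is clean again
```

If the merge happens to go through cleanly, nothing is lost either; the abort simply never runs.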
<p>Of course, as you go back and forth merging these micro commits with a dev, you will get into a comfort zone, realizing things won’t go wrong, in which case you can merge the remote directly.</p>
<pre><code>$ git merge remotes/joe/new-cool-feature
</code></pre>
<p>It all depends on how much you trust the other dev :) and how much of merge conflict resolution you’re prepared to take on. The branching approach is safer and if you’re the BOSS and you have a merge conflict you can simply abandon the branch and ask your peer dev to fetch from you and fix the conflict and let you know! (Which is what Linus would generally do)</p>
<p>One particularly useful technique to extract a good commit from a potentially conflicting set is <code>cherry picking</code>. If you know the commit’s SHA-1, you can use it to do a cherry-pick style merge.</p>
<pre><code>$ git cherry-pick 623a3dfb5e86f4da4e043f26b6f075f6e3be77ad
</code></pre>
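<p>A self-contained sketch of that flow (the branch name, files and messages are stand-ins for the examples above): find the SHA-1 of the one good commit and lift it onto master.</p>

```shell
# Branch with one good commit and one unfinished one; cherry-pick the good one.
set -e
R=$(mktemp -d); cd "$R"; git init -q -b master .
g() { git -c user.email=demo@example.com -c user.name=demo "$@"; }
echo base > a; git add a; g commit -qm base
git checkout -qb new-cool-feature
echo good > b; git add b; g commit -qm "good fix"
echo wip > c;  git add c; g commit -qm "unfinished work"
git checkout -q master
SHA=$(git log --format=%H --grep='good fix' new-cool-feature)
g cherry-pick "$SHA"
ls   # a and b are here; c (the unfinished work) is not
```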
<h2>Working remotely </h2>
<p>The issue with git-daemon is that it’s harder to use when you’re not on the same network and are hidden behind NAT. One technique is to get the router to <a href="http://en.wikipedia.org/wiki/Network_address_translation">DNAT</a> port 9418. Another option is to set up a VPN. A third option is to use a bit of SSH tunneling magic to get everything to work. I’ll cover that in a different post, perhaps.</p>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com2tag:blogger.com,1999:blog-8947964.post-74947514372358965992012-01-01T03:42:00.001-06:002012-01-02T10:55:17.081-06:00A New Desktop for a New Year<img src="http://farm8.staticflickr.com/7011/6610804679_62d58a3e30.jpg" alt="Simple, minimalistic Mint desktop" /><br />
<br />
<p>Compared to other new years where I would spend some time cleaning up my room or <a href="http://www.geekaholic.org/2007/12/christmas-came-early-for-me-this-year.html">upgrading my gear</a>, this year I did none of that! Instead I invested some time cleaning up my online space, starting with a brand new distro. I’m a <a href="http://www.geekaholic.org/2011/08/theres-something-about-lion.html">big fan of MacOSX’s Lion desktop</a> and wanted to bring some of that minimalistic simplicity to the Linux desktop. So if the above screenshot looks appealing, read on to see how you too can get a modern desktop that is simple and elegant. Unlike some of my previous desktop customization articles, I’ll keep this one to the bare minimum so that you can implement it fairly quickly.</p><br />
<h2>Installing the base OS</h2><br />
<p>IMHO, Gnome 3 is the next best thing when it comes to a modern desktop. I realize this is a controversial statement, given news of <a href="http://linux.slashdot.org/story/11/08/04/0115232/linus-torvalds-ditches-gnome-3-for-xfce">some ditching</a> and some <a href="http://www.zdnet.com/blog/open-source/linux-mints-cinnamon-a-gnome-3x-shell-fork">forking</a> the project. But when you have a project that can polarize a community that was once united, it means you’ve got true innovation - not just incremental tweaks.</p><br />
<p>Having said all that, I’ve started with Linux <a href="http://blog.linuxmint.com/?p=1889">Mint 12</a> as the base distro for my minimalistic desktop, even though Mint ships many of the traditional Windows-like UI elements. Now, I’m sure you could do this with Ubuntu 11.10, the base for Mint 12, but I like Mint for its focus on usability, where most of what I need works out of the box!</p><br />
<h2>Let the tweaking begin!</h2><br />
<p>Don’t worry, I’ll keep it to a minimum! The first thing I did was get rid of the bottom taskbar completely, because it’s so Windows 95-like! Fortunately, Gnome 3 comes with an “Advanced Settings App” (<code>gnome-tweak-tool</code> on the command line) which uses the same iPhone-like on-off toggles to do the job.</p><br />
<p><img src="http://farm8.staticflickr.com/7160/6610969247_67299031a2_m.jpg" alt="Gnome 3 Advanced Settings App aka Gnome tweak tool" /></p><br />
<blockquote><p>Goto Desktop section within Advanced Settings App.</p></blockquote><br />
<pre><code>Have file manager handle the desktop -> ON
Computer icon visible on desktop -> OFF
Home icon visible on desktop -> OFF
Network Servers icon visible on desktop -> OFF
Trash icon visible on desktop -> ON
Show mounted volumes on the desktop -> ON
</code></pre><br />
<p>From the above list if you decide you want to see the Computer and Home icons or perhaps not have any icons then go right ahead. I have my reasons for the above :)</p><br />
<blockquote><p>Goto Shell section and make sure every toggle is OFF</p><br />
<p>Goto Shell Extensions</p></blockquote><br />
<p>This is where we do most of the customizations. We will come back to this but for now:</p><br />
<pre><code>Menu Extension -> OFF
Media Player Extension -> ON
Smart Overview Extension -> ON
Monitor Status Extension -> OFF
Bottom Panel Extension -> OFF
User Themes Extension -> ON
noa11y Extension -> ON
Alt Tab Extension -> ON
Notification Extension -> ON
Shutdown Menu Extension -> ON
Window List Extension -> OFF
</code></pre><br />
<blockquote><p>Goto Theme section and select Mint-Z-Dark as the Shell theme</p></blockquote><br />
<h2>Installing tweaks</h2><br />
<p>As I’m running this distro on a 10” netbook, I wanted all the screen real estate I could get. Besides, running apps in full screen mode is all the rage these days! The other tweak, which comes from the same repository, is to overlay an icon when viewing all open windows in the <a href="http://en.wikipedia.org/wiki/Exposé_(Mac_OS_X)">Exposé view</a>. This makes it a lot easier to figure out which window preview belongs to which App.</p><br />
<blockquote><p>Install PPA repository and plugins</p></blockquote><br />
<pre><code>sudo add-apt-repository ppa:webupd8team/gnome3
sudo apt-get update
sudo apt-get install gnome-shell-extensions-autohidetopbar
sudo apt-get install gnome-shell-extensions-windowoverlay-icons
</code></pre><br />
<p>After logging out and logging back in (or Alt+F2 followed by r), head over to the “Advanced Settings App”</p><br />
<blockquote><p>Next to the “Shell Extensions” section you should find new toggles</p></blockquote><br />
<pre><code>Smart Overview Extension -> ON
Auto Hide Top Panel Extension -> ON
</code></pre><br />
<p>In order to get more space and simplify the window, I wanted to get rid of the menubar by integrating it into a global menubar. Unlike MacOSX’s global menubar, which is horizontal, this one is vertical and works perfectly with my limited screen width.</p><br />
<pre><code>sudo apt-get install gnome3-globalmenu
</code></pre><br />
<blockquote><p>Next I wanted to get rid of the scrollbar, similar to how it is in Ubuntu. I used an updated version of it from another PPA.</p></blockquote><br />
<pre><code>sudo add-apt-repository ppa:ayatana-scrollbar-team/release
sudo apt-get update
sudo apt-get install overlay-scrollbar
</code></pre><br />
<p>Ok almost done. The final step I did was to install a nifty applet that is great for laptops to manage power settings and screen resolutions, especially when you want to project. It adds itself right to the top menubar for easy access.</p><br />
<blockquote><p>Install Jupiter</p></blockquote><br />
<pre><code>sudo add-apt-repository ppa:webupd8team/jupiter
sudo apt-get update
sudo apt-get install jupiter
</code></pre><br />
<p>And we’re done! Enjoy your new desktop this holiday season and Happy 2012!</p><br />
<h2>Update (02-Jan-2012)</h2><br />
<p>After writing the article I wondered if it was possible to get the window titlebar to disappear so that Apps can run in full screen, similar to Lion. Though it’s not perfect, the following hacks will give you just that!</p><br />
<blockquote><p>Install window-buttons extension to mirror <code>minimize, maximize, close</code> buttons in the top menubar</p></blockquote><br />
<pre><code>sudo apt-get install gnome-shell-extension-window-buttons
</code></pre><br />
<p>Now enable the extension using “Gnome Advanced Settings” by visiting the “Shell Extensions” section.</p><br />
<pre><code>Window Buttons Extension -> ON
</code></pre><br />
<blockquote><p>Install maximus daemon to automatically remove the title bar as a window is maximized.</p></blockquote><br />
<pre><code>sudo apt-get install maximus
</code></pre><br />
<p>Restart Gnome 3 and you’re set. You can restore a maximized window using the buttons in the top right corner of the top menu bar.</p><br />
<hr /><br />
<h1>References</h1><br />
<ul><li><a href="http://www.webupd8.org/2011/11/autohide-top-bar-extension-finally.html">Gnome 3 auto hide top bar</a></li>
<li><a href="http://www.omgubuntu.co.uk/2011/12/ubuntus-overlay-scrollbar-gets-updated-heres-how-to-upgrade/">Ubuntu’s Overlay Scrollbar Get Updated</a></li>
<li><a href="http://www.webupd8.org/2011/11/install-gnome-shell-global-menu-in.html">Gnome 3 global menu</a></li>
<li><a href="http://www.webupd8.org/2011/05/how-to-remove-maximized-windows.html">How to remove maximized window buttons</a></li>
<li><a href="http://www.webupd8.org/2011/05/how-to-remove-maximized-windows.html">Get Unity like window button extension</a></li>
<li><a href="http://www.webupd8.org/2011/09/jupiter-applet-finally-available-for.html">Jupiter Applet finally available for Ubuntu 11.10</a></li>
<li><a href="http://www.flickr.com/photos/babytux/tags/screenshots/">High-Res screenshots over at flickr</a></li>
</ul>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com0tag:blogger.com,1999:blog-8947964.post-47619818837198935862011-12-31T10:11:00.004-06:002011-12-31T11:56:17.089-06:00Track Your New Year Resolutions With ii.do<div class="entry-content">I know, I know, it’s a bit of a cheesy title to promote my Open Source project but here me out - it really does work. Though I didn’t really write it to track my new year resolutions (I’ve never found them effective beyond a week), I did write it out of constant frustration with not finding a TODO productivity tool that stuck with me.<br />
<br />
Let me explain… I’ve tried EVERYTHING!<br />
<ul><li>Google Calendar & tasks</li>
<li>Sending myself Emails</li>
<li>Wiki (DokuWiki, MediaWiki, TWiki etc.)</li>
<li>Tomboy</li>
<li>Tomboy with UbuntuOne & Android App</li>
<li>EverNote</li>
<li>Remember the Milk</li>
<li>Gedit notes on my desktop</li>
<li>Post-it notes widget</li>
<li>Actual Post-it notes</li>
<li>Old-skool diary</li>
<li>Pieces of paper including backs of payment receipts</li>
</ul>From the above list, if anything came remotely close to sticking as a habit, it’d be the sticky notes & pieces of paper. I’m not suggesting the other methods suck, but I wouldn’t use them beyond a couple of weeks.<br />
<br />
Why was it that despite me spending hours on a computer, the best thing that had a chance of remotely working was old-skool pen & paper? And then it hit me. A good todo App should be:<br />
<ol><li>In your face!</li>
<li>Really simple to use (like pen & paper)</li>
<li>Did I mention in your face?</li>
</ol>Now my solution isn’t for everyone. At this moment, it is intended ONLY for fellow geeks who spend a lot of time in the command line (GNU & UNIX only), and I call it ii.do.<br />
<h2>What is ii.do</h2>ii.do, pronounced “I do”, but really a Roman numeral play on 2.do, is a command line todo list manager that uses a simple text file and simple Markdown syntax to track your todo tasks. The beauty of using Markdown syntax is in its resemblance to the natural way we jot down text on a piece of paper.<br />
ii.do is optimized around querying tasks as opposed to updating tasks. For entering and updating tasks, it uses a plain old vim text editor, which has syntax highlighting for Markdown built-in. If vi is not your thing, then it’s relatively easy to configure another editor by exporting the shell $EDITOR variable.<br />
<br />
The other main design goal I had was to make it into a standalone shell script which could stand (mostly) on its own. Except for standard shell commands like sed, grep and bash itself, it doesn’t demand much.<br />
<br />
But the main power of ii.do comes ONLY (and I repeat ONLY), if you modify your shell to :<br />
<ol><li>Define an easy alias (such as t) which can be used to summon ii.do from anywhere within the shell</li>
<li>You modify your $PS1 shell prompt to update it with the pending number of tasks (this is the in your face bit)</li>
</ol>The second point above is <em>important</em> if you plan on actually using it productively, for there is nothing like an App stalking you with a reminder of how many things you have left todo.<br />
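To make point 2 concrete, here is a minimal sketch of such a prompt counter. This is my own approximation, not ii.do’s actual <code>-S</code> output, and the pending-task pattern is an assumption based on the file format shown below.<br />

```shell
# Count '*' bullets not marked done with a leading x (assumed format),
# defaulting to ~/todo.markdown just like ii.do does.
pending() {
    grep -c '^\* [^x]' "${1:-$HOME/todo.markdown}" 2>/dev/null || true
}
# In ~/.bashrc you would then splice it into the prompt, e.g.:
# PS1='[\u@\h \W todo:$(pending)]\$ '
```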
<h2>Installing ii.do</h2>You can download a tarball of ii.do from <a href="https://github.com/geekaholic/ii.do/downloads">github</a>. Then just extract it to your home directory, make the shell script executable (just in case) and finally copy the sample todo.markdown to your $HOME:<br />
<pre><code>$ tar zxf geekaholic-ii.do-iido-xxx.tgz
$ mv geekaholic-ii.do-* ~/ii.do
$ chmod +x ~/ii.do/ii.do
$ cp ~/ii.do/todo.markdown ~
</code></pre><br />
Though ii.do is now usable, you should create an alias in order to make it more accessible and add it to ~/.bash_profile or ~/.bashrc<br />
<pre><code>echo "alias t='$HOME/ii.do/ii.do'" >> ~/.bashrc
</code></pre><br />
Finally the <em>most important</em> step of adding a counter to your SHELL prompt is semi-automated via ii.do!<br />
<pre><code>~/ii.do/ii.do -S "$PS1" >> ~/.bashrc
</code></pre><br />
You’re all set! You might want to log out and log back in, or do a <code>source ~/.bashrc</code><br />
<em>Note:</em> if you would rather relocate todo.markdown, for instance to your Dropbox folder, then use the -f option.<br />
<pre><code>echo "alias t='$HOME/ii.do/ii.do -f $HOME/Dropbox/todo.markdown'" >> ~/.bashrc
</code></pre><h2>Using ii.do</h2>Now the fun begins! Let’s start with the most basic.<br />
<pre><code>t -h
Version: 0.6.1
Usage: ii.do [-f todo_file.markdown] [-T topic_number] [options]
Options :
-e Open TODO file using $EDITOR
-n Count number of pending tasks. Can be filtered using -x, -X etc.
-X Filter to show only pending tasks
-x Filter to show only completed tasks
-i Filter to show only important tasks
-t Filter to show only topics with topic_number
-C Don't colorize output (useful for piping)
-H HTMLize the output
-S "$PS1" Will return modified PS1 prompt to contain pending task count
-h Show this help screen
By default, we expect a ~/todo.markdown to be in your $HOME if not overridden
by the -f option. Refer to http://github.com/geekaholic/ii.do for examples of
creating this file.
</code></pre><hr />To edit the TODO file using vi or $EDITOR:<br />
<pre><code>t -e
</code></pre><br />
Using markdown syntax to maintain todo.markdown is simple. You start out with a main heading called a topic.<br />
<pre><code># Weekly Activities
</code></pre><br />
or using the alternate style<br />
<pre><code>Weekly Activities
=================
</code></pre><br />
Next you start your list of tasks as a bullet <code>*</code> list<br />
<pre><code>* Come up with a BIG idea
* Implement BIG idea and be awesome
</code></pre><br />
You could further break up your topic into subtopics as follows:<br />
<pre><code># Weekly Activities
## Entertainment
* Watch a Movie
* Go bowling
Exercise
--------
* Go to gym at least 3 days a week
* Play some wii sports
</code></pre>As you might have guessed, the <code>---</code> underline is the alternate form for a sub-level topic. This way you can have multiple top-level topics followed by sub-level topics, with tasks at each level.<br />
<hr />Now that we’ve got the data entry part sorted, let’s see how we can query the task list.<br />
<pre><code> t
</code></pre><img alt="ii.do output" src="http://farm8.staticflickr.com/7008/6606634025_e29d008501.jpg" /><br />
<br />
Will show all your tasks using terminal colors.<br />
<hr />To filter tasks to show <code>only pending</code> or <code>only completed</code> ones:<br />
<pre><code>t -X
t -x
</code></pre><br />
To filter by topic, so that it only shows tasks belonging to one topic including its subtopics:<br />
<pre><code>t -t
1: # Weekly Activities
2: ## Entertainment
3: ## Exercise
4: # Home Work
t -T 1
</code></pre><br />
The above will show everything under topic 1, i.e. everything up to topic 4 (Home Work).<br />
<hr />To count the number of pending and completed tasks:<br />
<pre><code>t -X -n
t -x -n
</code></pre><hr />To mark a task as complete, place an <code>x</code> in front of it:<br />
<pre><code>* x Take out the trash
</code></pre><br />
To mark a task as important, place a <code>!</code> mark in front of it:<br />
<pre><code>* ! Go to gym at least 3 days a week
</code></pre><br />
To mark a task with a high priority, place the priority number in front of the task<br />
<pre><code>* (1) Finish history essay
</code></pre><hr />Finally, ii.do has two options that customize the output. The first is to turn off color, which is handy when you want to pipe ii.do’s output to other unix commands.<br />
<pre><code>t -C | grep '^*'
</code></pre><br />
The other option is to export the todo list as html<br />
<pre><code>t -H > ~/todo.html
</code></pre><img alt="ii.do html output" src="http://farm8.staticflickr.com/7145/6606925889_c86b0fc9a8.jpg" /><br />
<h2>Other Uses</h2>Besides tracking my todo list on a daily basis, I’ve recently found another use for ii.do - tracking my bookmarks. I know, you’re probably thinking of delicious or firefox/chrome bookmark syncing, but for me those solutions just don’t cut it. For one, I use about 3 browsers and finding old bookmarks can be a real pain. So now I just use an alias with a custom bookmarks.mdown<br />
<pre><code>alias bm="$HOME/ii.do/ii.do -f $HOME/Dropbox/bookmarks.mdown"
</code></pre>Another use was to keep track of lecture topics by marking them off as I taught them over a period of two months. I also use it to keep track of some interesting quotes I come across, just for inspiration.</div><br />
<br />
<h2>See also</h2><br />
Watch a talk I gave to introduce ii.do at <a href="http://www.refreshcolombo.org">RefreshColombo</a><br />
<br />
<iframe width="340" height="315" src="http://www.youtube.com/embed/9GrFj5gUMAI" frameborder="0" allowfullscreen></iframe>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3tag:blogger.com,1999:blog-8947964.post-67221586951546882082011-10-22T09:24:00.002-05:002011-10-22T09:28:05.963-05:00Remembering Steve JobsThe passing of Steve Jobs came in as an instant shock that morning as I was reading the news on my phone. While it wasn't as much of a surprise as when MJ passed away, mainly because I had seen that <a href="http://cdn.technoreview.net/wp-content/uploads/2011/09/steve-jobs-health.jpg">one photo of Steve</a> wearing a black gown coming back from the hospital, something inside me felt empty. When many flooded the social media and TV with messages of condolences and looking back at his achievements, I just watched not sure how I should express the loss.<br />
<br />
So when, about a week later, I was asked if I could do a talk on Steve Jobs for <a href="http://www.refreshcolombo.org/">Refresh Colombo</a>, I immediately and almost impulsively said YES! But I still wasn't sure what I wanted to say. One thing I did know was that I didn't want to recap his life or accomplishments as if I knew the guy.<br />
<br />
Soon after accepting the talk, the next thing I almost instantly realized was that I needed to get Chanux in as a co presenter, not because we've recorded so many <a href="http://sinhalenfoss.org">episodes of a podcast</a> together but due to <a href="http://chanux.wordpress.com/2011/10/06/dear-steve-jobs">his reaction to Steve's death</a>, which surprised me even more than Steve's death itself.<br />
<br />
Why the majority of the people reacted the way they did will probably take a book to investigate rather than a simple blog post, but I suspect it's complicated. So instead, I asked myself what it is about Steve Jobs that I'm mostly going to miss. The keyword here being mostly, I realized I was going to miss his persona, his insight, his principles and his approach to doing things. And so that is what I decided the talk should be about - what made Steve great, or rather insanely great!<br />
<br />
Chanux and I started doing brain dumps of Steve quotes which captured his philosophy, just off the top of our heads. The way I saw it, if we couldn't really remember a particular quote and had to research it on the Internet, then it hadn't really had much of an impact on us personally. And so, except for looking for some great images to suit the slides, we didn't go looking for his quotes. Consequently, this probably means we might not have gotten the quotes with verbatim accuracy.<br />
<br />
What follows is our presentation slides which we delivered last week. By highlighting these tidbits, we hope that it will inspire you to think differently about what you are currently doing before it is too late. Because our time on this planet is quite limited and even if we believe we're coming back it doesn't matter when it takes a new form factor with a completely new UI and Operating System!<br />
<br />
<div style="width:425px" id="__ss_9803976"><strong style="display:block;margin:12px 0 4px"><a href="http://www.slideshare.net/geekaholic/remembering-steve" title="Remembering steve" target="_blank">Remembering steve</a></strong> <iframe src="http://www.slideshare.net/slideshow/embed_code/9803976" width="425" height="355" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe> <div style="padding:5px 0 12px">View more <a href="http://www.slideshare.net/" target="_blank">presentations</a> from <a href="http://www.slideshare.net/geekaholic" target="_blank">Buddhika Siddhisena</a> </div></div>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com1tag:blogger.com,1999:blog-8947964.post-47308409228036705402011-09-03T02:08:00.001-05:002011-09-03T02:23:14.272-05:00Getting Down With Markdown<p>Recently I've been looking for an alternative to <a href="docbook.org">docbook</a>, which I've used for most of my tutorial handouts and internal developer documentation at <a href="thinkcube.com">Thinkcube</a>. But the more I used it docbook the more I wanted a simpler solution which didn't require me to make sure my XML was in order.</p>
<br /><p>Naturally, at first I thought I'd try <a href="latex-project.org">Latex</a> since it has a pretty good rep with geeks and has even <a href="http://openwetware.org/wiki/Word_vs._LaTeX">surpassed usability expectations</a> set forth by some of the mainstream word processors :). What I loved about Latex was that you could concentrate on the content first and formatting later. Its legendary ability to output desktop-publishing-quality documents and convert to a variety of formats such as html, pdf or odt was a killer.</p>
<br /><p>Just as I was about to dive into Latex, <a href="twitter.com/chanux">Chanux</a> suggested <a href="daringfireball.net/projects/markdown">Markdown</a> as an alternative. Hmm, Markdown, I pondered... I even liked the sound of it. It turns out Markdown is even better! You could think of it as a simplified wiki syntax but a better description would be to call it a WYSIWYG wiki syntax.</p>
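<br /><p>For a taste of the syntax (my own illustrative snippet, not from the article), this plain text is already valid Markdown and compiles to headings, emphasis and lists:</p>

```markdown
# A heading

Some *emphasis*, some **strong text**, and a [link](http://example.com).

* a bullet item
* another item
```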
<br /><p>I've always endorsed the <a href="en.wikipedia.org/wiki/KISS_principle">KISS</a> philosophy. There is nothing more simple and satisfying than writing a text file using <a href="vim.org">vim</a> and tracking its progress via <a href="git-scm.com">git</a>. After briefly going through the syntax, I realized this was exactly what I needed. I also realized that I had already used Markdown without really thinking about it, as part of using <a href="github.com">github</a> for a pet project. Everything about Markdown was all good, and the whole <a href="http://www.zdnet.com/blog/violetblue/when-software-offends-the-pantyshot-package-controversy/509">controversy</a> around the names of Markdown's html compilers was exactly the kind of celebrity gossip it needed to grab attention!</p>
<br /><p>It was around this time, I was due to create a note for a tutorial for the <a href="http://www.icter.org/UCSCConf/index.php/icter/ICTer2011/schedConf/workshops">ICTer workshop</a> that Dr. Ajantha from UCSC and I were to deliver. By now, I had decided on Markdown with <a href="https://github.com/chobie/upskirt">upskirt</a> (yes, this is one of the controversial names) to create the notes, but what about the slides? Could I use Markdown for that as well? After a little looking around, I found a wonderful system called <a href="https://github.com/adamzap/landslide">Landslide</a> which enabled me to compile Markdown syntax into a beautiful html5 slideshow presentation. After a little playing around I managed to build the slides as well as the note from a single Markdown source file! How cool is that? I will write a separate post soon on the HOWTO details, but for now enjoy the <a href="http://geekaholic.github.com/slides">slides</a>, if that's your cup of tea. My Markdown adventures don't end there: this post too was written in Markdown and converted to html using <a href="octopress.org">octopress</a>.</p>
<br />geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com2tag:blogger.com,1999:blog-8947964.post-27702977585285438032011-08-10T18:34:00.005-05:002011-08-10T22:31:51.603-05:00There's something about Lion!<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://2.bp.blogspot.com/-eC6np1G_DHk/TkM-hK-AB-I/AAAAAAAAIn4/mQofKNoZV1E/s1600/tumblr_liu3bstunX1qa9omho1_500.jpg"><img style="float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 218px; height: 320px;" src="http://2.bp.blogspot.com/-eC6np1G_DHk/TkM-hK-AB-I/AAAAAAAAIn4/mQofKNoZV1E/s320/tumblr_liu3bstunX1qa9omho1_500.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5639419897999001570" /></a>
<br />So it's been a little over a month since I switched to MacOSX Lion as my primary desktop, after running Ubuntu and before that Gentoo on this Macbook. So why'd I finally switch? Well, I'm sure it comes as no surprise to existing mac owners who mostly run OSX anyway.
<br />
<br />I'm different! I've owned two Macs to date, first a MacMini G4 (PPC) since 2006 and now an aluminum Macbook since 2009, and both primarily ran some flavor of Linux. Sure, I had OSX lying around in another partition, but I'd only boot into it once in a while, just to update or check out an interesting app or two.
<br />
<br />Linux was what I used, not because OSX was bad (like Windows is) but because I was more comfortable with it; it was more flexible, had more innovation happening (pretty much every 6 months) and above all it was fun to use. Part of that fun was really the do-it-yourself attitude Linux has had for years, though IMHO it is somewhat fading away with distros such as Ubuntu. It's also no <a href="http://www.geekaholic.org/2005/12/obsessed-with-mac-on-linux.html">secret</a> that I like the OSX interface and have tweaked my desktop in the past to resemble it somewhat. At the time, what made me buy a Mac was not the OSX interface. I really bought the Mac for its beautiful hardware design and higher build quality, despite Linux not being treated as a first class citizen on it.
<br />
<br />After years of owning other laptops I was fed up! I recall my first ever laptop, a Sony Vaio back in 2000, which only lasted 3 months before the disk died. Coupled with my project manager's ignorance, it would later never be able to have a hard drive! Years later I bought an IBM Thinkpad which worked fine, except that the plastic frame around the LCD started to crack one day as I opened or closed the lid, and over time the crack grew to the point it was crippled as a portable device. My next laptop, an HP Pavilion, developed a random reboot feature (especially when compiling or transcoding) which I could never reproduce to get a replacement, and lastly there was the Acer, which had far too many problems to mention before it too died. Seriously people, what's a good laptop that's reliable? Is Dell reliable?
<br />
<br />When I first saw Lion being previewed at one of the Apple events, I brushed it off as the same old OSX with an iPad-like icon interface, which they now call Launchpad. It wasn't till the recent WWDC event, when they showed off the full deal, that I got an urge to install it. And so I pondered, "How do I get this beast on here?". My Mac partition was small and had less than 2GB free. I hardly had room on my Linux partition either to get away with a resize. Besides, I knew Lion would not install with Linux lying around, as the installer wasn't smart enough to deal with it. I decided it was time to delete the Linux partition!
<br />
<br />Now, I'd like to say it was a sad move that I pondered for days, like when you have to move to a new place and leave your friends behind. It really wasn't! I don't miss Ubuntu. Not a lot, at least...
<br />
<br />And I ask myself, why am I not missing Ubuntu as much as I missed Gentoo, or Debian before that? IMHO, Ubuntu isn't fun anymore. It was even somewhat frustrating to use with the new Unity interface being the default, and when I tried to install Gnome 3 over it, that didn't go too well. It's not just the GUI; the command line isn't that much fun either. It's just too easy! And you mostly don't need it. Don't get me wrong - this is all good, and Ubuntu is doing fantastic work in getting Linux to the mainstream. It's just not as fun as, say, Gentoo or even Fedora. But the reason I switched isn't that Ubuntu is not fun or that I'm frustrated with it. I'm really not! If that were the case, I would have switched to another distro like Mint or Debian.
<br />
<br />Lion had a few features which I thought were neat and fun to explore. One feature I really like is full screen apps, each on its own desktop, being a built-in feature of the core OS. This is a usage pattern I was quite used to when working with multiple desktops on Linux. I'd always open several apps maximized, move them to their own desktops and switch between them using control + arrow keys. But it's better on the Mac...
<br />
<br /><ul>
<br /><li>You're not limited to a static number of desktops. On Linux I sometimes run out of desktops</li>
<br /><li>It's an app feature, so the desktop is automatically created and destroyed as you click the new full screen button found on the titlebar</li>
<br /><li>Apps are in true full screen (no titlebar or menubar)</li>
<br /><li>...and my favorite: it supports 3-finger swipe in addition to the keyboard shortcut. It's really fun swiping between multiple desktops</li>
<br /></ul>
<br />
<br />Gnome 3 looks promising in this respect, as it too has the concept of dynamic desktops. Unfortunately it resets you back to a single desktop, and you have to create the rest all over again, which is more annoying than having a static number of desktops.
<br />
<br />The other killer feature I like in Lion is support for multiple versions of a document. When Apple introduced "Time Machine" back in the day, the UI looked cool, but it was not as practically usable as I would have imagined, since you needed an external hard drive to do continuous backups. Unlike what many speculated would be the adoption of ZFS and its online snapshotting capabilities, it turned out to be a far less elegant method under the hood. I doubt it's as elegant as ZFS even with this new feature, because the underlying filesystem is still HFS+, but it works quite well in practice for apps that make use of the new API (Apple's own apps at the moment). What's really cool is that the same Time Machine-like UI is used to browse the different versions of a document, where you can even copy & paste objects between versions.
<br />
<br />The automatic save and resume of an application's state between reboots is another interesting feature worth studying. Again, it only works for apps that use the special APIs, but essentially the app is hibernated and resumed as opposed to the whole OS. As a result it's a lot faster. The ability to access documents by file type, regardless of where they're stored in the filesystem, is another good usability feature which I always wanted the Linux desktop to have. In a world where hard drives are large and there is too much clutter, filesystem organization is a real bottleneck. I don't think Lion nailed it either, but it's a good start. Vista tried and failed with WinFS. Google, Beagle (now Tracker) and Spotlight's approach of giving you a search engine doesn't quite scale in my opinion. You'd think search would work, especially from Google, but the more I think about it, the more it seems to me that we think Google is awesome because it finds the information we're looking for and gives us an answer - not necessarily the needle in a haystack. For instance, I really find Gmail frustrating for finding a mail which I only vaguely remember. Desktop search, as opposed to web search, breaks down because it is a needle-in-a-haystack problem. But I digress!
<br />
<br />There seem to be tons of other small technological as well as usability features of Lion and OSX which are pleasant to have around and interesting to study. So to summarize why I switched: it's not one single thing but many things. I'm looking forward to the MacOSX command line, and I've already started exploring it with the help of O'Reilly's book "Mac OSX for Unix Geeks". It was only recently, after arriving at OSX, that I really came to appreciate the Bonjour protocol and its implementation on Linux via avahi. Most Linux users don't know that they can ping their neighbor's machine without DNS or an IP address, merely by using the machine's hostname with .local appended to it. When did you last drop into a command line on Ubuntu and try running <em>avahi-browse -a</em> or <em>avahi-discover</em>?
<br />
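For anyone who wants to try the .local trick on a Linux box, here is a minimal shell sketch. It assumes avahi-daemon is running and avahi-utils is installed; the neighbour's hostname "mini" is purely hypothetical.

```shell
# mDNS names are simply a machine's hostname with ".local" appended.
# "mini" is a hypothetical neighbour's hostname; substitute a real one.
host="mini"
mdns_name="${host}.local"
echo "$mdns_name"

# With avahi-utils installed, you could then run, for example:
#   ping -c 1 "$mdns_name"          # resolves via mDNS, no DNS server needed
#   avahi-browse -a -t              # one-shot list of services advertised on the LAN
#   avahi-resolve -n "$mdns_name"   # map the .local name to an IP address
```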
<br />Trying to install LAMP on MacOSX, I realized how frustratingly fun it was. It was like going back to the Redhat 9 days of manually enabling Apache modules. There is also MacPorts, which is akin to Gentoo's Portage, where you download and compile apps, which is fun!
<br />
<br />Having said all this, there are so many more little things that I like (such as being able to right-click on a word and look it up, or have the text read to me via the excellent TTS, which can also be done on the command line using the say command) and some things that are annoying, like Finder (the file browser) not supporting tabs, cut & paste of files, or the ability to delete without first sending to the trash. And I hear printer configuration is also non-intuitive if you're coming from Windows/Linux, despite it using CUPS. I also find some apps using a bit more memory than usual, and as a result my 2GB is almost fully used, making it too slow for running Virtualbox.
<br />
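As a side note on the say command: a minimal sketch of using it from the shell (the guard is only so the snippet degrades gracefully on systems without say; the quoted sentence is just an example).

```shell
# `say` is the stock macOS text-to-speech tool; guard for other systems.
text="There sure is something about Lion"
if command -v say >/dev/null 2>&1; then
    say "$text"     # speak the text aloud
    say -v '?'      # list the installed voices
else
    echo "$text"    # no TTS available; just print the text
fi
```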
<br />All in all, I'm pleased with the move and look forward to learning more about its underlying intricacies and BSD origins in the days to come. There sure is something about Lion!
<br />
<br />
<br />geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3tag:blogger.com,1999:blog-8947964.post-15856632855808665802011-07-31T13:50:00.006-05:002011-07-31T15:55:29.478-05:00Why read when you can listen?There's an old quote, "Reading makes a full man". I'm not quite sure who said it, but I know it's got to be at least a couple of years older than myself, because I was introduced to the concept early on by my dad. Despite the early infusion of such words of wisdom, I really never turned out to be a good reader.<br /><br />For one, I could never read on a moving platform for more than 60 seconds without feeling nauseous. Forget about reading in bed, either: I last no more than 40 - 60 minutes (my record might be about 90) before the book starts slapping my face multiple times as I loop over the same paragraph of text, over and over, before giving up and falling asleep! <br /><br />When I do get into the mood for reading, it still seems to take me much longer to finish a book when compared with, oh I don't know, my sister! She's got a ton of books, almost all novels, and I remember seeing her finish them fast.<br /><br />So early on as a kid, I realized my reading time was precious and best spent on things that I ought to really care about. Things that add real value as opposed to mere entertainment. As a result, I hardly read fiction or even science fiction for that matter, but instead resorted to science-fact books on astronomy, GW-BASIC programming and my favorite - books on magic. These were probably early signs of my geekiness. <br /><br />Eventually I did get around to reading fiction (and sci-fi) in limited quantities and thoroughly enjoyed it. I got "what the fuss was about", but still felt it was a waste of effort which I could substitute with watching the movie.<br /><br />Now don't get me wrong! I love books as much as the next guy. I have a shelf full of books and a few more overflowing around the house. 
The problem is, I have way too many books that I've started to read but never finished. Fortunately, almost all of them are computer books. More recently though, I've had better success actually finishing books, thanks to the Kindle, because I can carry it around.<br /><br /><br />And then I decided to try out <a href="http://www.audible.com">audible</a>. They had a couple of deals which they advertised often on Amazon (Amazon owns it), but they only made me consider it after listening to a sample of a book I owned and loved.<br /><br />My first audio book which I purchased and downloaded was <a href="http://en.wikipedia.org/wiki/Contact_(novel)">"Contact" by Carl Sagan</a>. It was one of the few sci-fi books I already owned and had read, only because I fell in love with the movie. It was also one of those rare stories in which I felt the movie was as good as (if not better than) the book, even though the two had considerable differences in the story line. After hearing a sample narration of the audio version by <a href="http://en.wikipedia.org/wiki/Jodie_Foster">Jodie Foster</a>, I wanted to download it immediately. Contact was my catalyst (aka killer book) for getting on the audio book bandwagon.<br /><br />Since then I've picked up several books, fiction and non-fiction. While I won't mention them all here, I will mention one great book I just finished listening to today, and which made me want to write this post. The audio book is none other than "The hitchhiker's guide to the galaxy (HHGG)" by Douglas Adams, narrated by Stephen Fry. Unlike Contact, I had never read the book, though I owned it and always meant to get to it. Having obsessively listened to it this past week, even falling asleep to it, I admit this was probably the best narration I have ever heard of anything so far, ever! (not that that's saying a lot given my geek background, but I do have good taste!)<br /><br />Before I conclude this post, which I didn't expect to be this long, I'd like to throw in some food for thought. 
Was my dad right about the "Reading makes a full man" quote, or is reading overrated? Should reading even be considered an optimal means of consuming knowledge or gathering information? I mean, think about it! We live in an age where information is available at the tip of Google, but it's mostly just text which needs to be read. The more you think about it, the more you realize that reading is the "last mile" bottleneck between information and your brain.<br /><br />I am not saying reading and writing must die! I do feel reading fast (skimming) might be more efficient than listening for discovering the relevance of information. Writing, and reading what you've written, might also be more useful for gathering thoughts. What I am debating is whether, once the message has been devised (through iterative writing and reading), delivering that concise message need always be done via text to be read.<br /><br />Why then are we still fixated on reading? Is reading a legacy practice that's been passed down through generations, one that made perfect sense when it was the only viable means of recording information? If so, given today's rich multimedia options, why aren't we switching our primary medium? <br /><br />The answer may well be the generation gap. Perhaps we will find out once the kids who are now growing up in a world of podcasts, youtube and interactive games decide what to do with it when they rule the world. What do you think? Should I stop writing and just stick to <a href="http://sinhalenfoss.org">podcasting</a> after all?geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com2tag:blogger.com,1999:blog-8947964.post-1268400413309513272011-07-29T12:22:00.002-05:002011-07-29T13:04:35.160-05:00It's true, I'm on a LionGreetings fellow bloggers, and no, I'm not dead. Blogging for me has gotten to a point where it's a post or two per year! 
So at this point it doesn't matter what I say, because I doubt any of my old followers are going to read this any time soon.<br /><br />As usual, I would have blamed busyness (a result of %s/y/i/) and twitter, but honestly speaking it was the absence of the mood which killed it. Lately I've been doing so much writing at work that the last thing I want to do is come home and do more of it. So instead, I'd prefer to pretend to hack on some C/C++ or Ruby or PHP, play some PS3 and call it a day.<br /><br />Today was different: it was our celebration of the International <a href="http://www.sysadminday.com">Sysadmin's day</a>. So unlike my typical day, I went to a bar and talked geek with fellow sysadmins, all the while munching away on bites and sipping on some <a href="http://www.gnu.org/philosophy/free-sw.html">free beer</a> (free for now, IOU Deep). During our conversation, amidst the noisy band that was playing, one of the things I came out and confessed is the topic of this long-awaited post.<br /><br />And since I'm in my geeky little mood, I thought I'd share my confession - I run Mac OSX <a href="http://en.wikipedia.org/wiki/Mac_OS_X_Lion">Lion</a>! Furthermore, I run Mac OSX Lion, wait for it....., at the expense of Linux. <br /><br />Now, for the random reader this might not be such a surprise. Apparently you can't go to a conference in the US without seeing 90% of the devs on a Mac. But I'm not the type to join a monoculture for the sake of it. Those who know me know I've had this Macbook since the days of Mac OSX Leopard, but 99% of the time it was running Ubuntu.<br /><br />So what happened? Why'd I switch? My answer isn't the stereotypical "I'm a Mac fanboy", "Linux is frustrating" or "I want it to just work" sort you'd get. It's much deeper than that. <br /><br />Stay tuned....<br />(P.S. 
Don't worry, it's not 42, and I'll try to get to the post soon!)geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com4tag:blogger.com,1999:blog-8947964.post-72532440089879279042010-11-21T23:50:00.002-06:002010-11-22T01:16:45.632-06:00The Linux Speed Boost!Oh my, it has been a while since I visited my blog. While there were a few worthy posts which I should have blogged about, that never happened. Recently, when I came across a <a href="http://www.omgubuntu.co.uk/2010/11/linux-to-get-a-lot-faster-due-to-new-patch/">post on OMG Ubuntu</a> about a new kernel patch which supposedly speeds up Linux, I just had to try it out. It's been a while since I've compiled my own kernel (these days I rely on stock Ubuntu), but after seeing the results myself, I thought it was very much blog-worthy to share with you.<br /><br />First off, for the impatient or the unmotivated, let me <a href="http://www.phoronix.com/scan.php?page=article&item=linux_2637_video&num=2">point you to another post</a> on Phoronix, which contains a video showing the night-and-day difference this patch brings. If you're not impatient, then you could wait for 2.6.38, which will hopefully include this, considering <a href="http://marc.info/?l=linux-kernel&m=128979084506774&w=2">Linus's supportive comments</a> regarding the patch.<br /><br />Ok, now for the gory details on getting this patch up and running. I did this on Ubuntu 10.10, but it should work the same on Debian and Debian-like distros, as well as other popular distros with minor tweaks. <br /><br /><dl><br /><dt>Get the kernel</dt><br /><dd>Download kernel 2.6.37 from <a href="http://kernel.org">kernel.org</a>. 
At the time of writing, 2.6.37 had not been officially released and was at RC2.<br /><br /><pre><br />cd /usr/src<br />wget http://www.kernel.org/pub/linux/kernel/v2.6/testing/linux-2.6.37-rc2.tar.bz2<br />tar jxf linux-2.6.37-rc2.tar.bz2<br /></pre><br /></dd><br /><br /><dt>Get the patch</dt><br /><dd>The <a href="http://marc.info/?l=linux-kernel&m=128978361700898&w=2">patch published by Mike Galbraith</a> can also be copied from <a href="http://paste-bin.com/view/raw/6e4a38d4">here</a>. Save it as /usr/src/kernelboost.patch.<br /></dd><br /><br /><dt>Dry-run and apply the patch</dt><br /><dd>Go to the kernel directory and dry-run the patch to make sure it applies cleanly. If everything is ok, go ahead and apply it for real ;)<br /><br /><pre><br />cd /usr/src/linux-2.6.37-rc2<br />patch --dry-run -p1 < /usr/src/kernelboost.patch<br /><br />#if everything is ok<br />patch -p1 < /usr/src/kernelboost.patch<br /></pre><br /></dd><br /><br /><dt>Configure and compile the kernel</dt><br /><dd>This is the generic kernel-compilation step. The configure step can be performed using your current config file. 
Make sure you enable the patch by answering Y to CONFIG_SCHED_AUTOGROUP.<br /><br /><pre><br />cp /boot/config-2.6.35-22-generic /usr/src/linux-2.6.37-rc2/.config<br />make oldconfig<br />make<br />make modules_install<br />make install<br /></pre><br /><br />If you're on Ubuntu, I'd recommend <a href="https://wiki.ubuntu.com/KernelTeam/GitKernelBuild">installing the kernel the Debian way</a> instead of using the make, make modules_install and make install steps.<br /><br /><pre><br />apt-get install kernel-package fakeroot build-essential ncurses-dev<br />cd /usr/src/linux-2.6.37-rc2<br />sed -rie 's/echo "\+"/#echo "\+"/' scripts/setlocalversion<br />make-kpkg clean<br />CONCURRENCY_LEVEL=`getconf _NPROCESSORS_ONLN` fakeroot make-kpkg --initrd --append-to-version=-custom kernel_image kernel_headers<br /><br />cd ..<br /></pre><br /></dd><br /></dl><br /><br />You should find two .debs in /usr/src for the new custom kernel. Install them using dpkg -i or by double-clicking on the deb from the file manager. Once you reboot into the new kernel, everything should be all set. Just to double-check:<br /><br /><pre><br />cat /proc/sys/kernel/sched_autogroup_enabled<br />1<br /></pre><br /><br />If it prints 1, then it's enabled. You can echo 0 > /proc/sys/kernel/sched_autogroup_enabled to disable the automatic task grouping on the fly.<br /><br />I noticed a significant improvement when trying to play <a href="http://store.steampowered.com/agecheck/app/7670/">BioShock from Steam</a> using <a href="http://www.codeweavers.com/products/cxgames/">CrossOver Games</a>. With the previous kernel, I had a lot of lag, especially with Compiz turned on. For instance, when moving the mouse to look around, the motion was noticeably jerky. Now things are smoother and the game is much more playable.<br /><br />Your experience may be different depending on how you use the computer. 
Chances are that if you tend to run CPU-hogging applications, such as games, or watch HD movies, you will notice an improvement.<br /><br />Till next time a blog-worthy event that excites me happens~geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com10tag:blogger.com,1999:blog-8947964.post-22781911464088543772010-02-08T04:22:00.008-06:002012-05-15T07:04:15.604-05:00Splitting a git repoIt's been almost a year since I last blogged, and what can I say, micro-blogging killed blogging for me. Even micro-blogging has gotten to a point where I don't do it as much as I used to. Perhaps it's the end of web 2.0, or perhaps I'm getting too old for this :)<br /><br />Anyhow, getting back to the subject of this quick post: it seems splitting a git repo into two separate git repos is somewhat obscure and required a bit of googling around. Fortunately I came <a href="http://bytebaker.com/2009/05/18/refactoring-my-personal-git-repository">across this great blog post</a>, but wanted to summarize it in one place (the author made me look at several pages to put it together).<br /><br />Say I have a git project called foo.repo which has a subdirectory called bar that I now want to make into its own separate git project called bar.repo.<br /><br /><b>Current state</b><pre><br />foo.repo/<br /> .git/<br /> bar/<br /> abc/<br /> xyz/</pre><br /><br /><b>Target state</b><pre><br />foo.repo/<br /> .git/<br /> abc/<br /> xyz/<br /><br />bar.repo/<br /> .git/<br /></pre><br /><br /><dl><br /><dt>Step 1: Clone the existing repo as the desired new repo</dt><br /><dd><br /><pre><code>$ git clone --no-hardlinks foo.repo bar.repo</code></pre><br /></dd><br /><dt>Step 2: Filter-branch and reset to exclude the other files, so they can be pruned:</dt><br /><dd><pre><code><br /> $ cd bar.repo<br /> $ git filter-branch --subdirectory-filter bar HEAD -- --all<br /> $ git reset --hard<br /> $ git gc --aggressive<br /> $ git prune</code></pre><br /></dd><br /><dt>Step 3: Create new
empty repo on git server</dt><br /><dd><pre><code><br /> $ mkdir /var/git/bar.git<br /> $ cd /var/git/bar.git<br /> $ git --bare init</code></pre><br /></dd><br /><br /><dt>Step 4: Back on the local machine, replace remote origin to point to the new repo</dt><br /><dd><pre><code><br /> $ cd bar.repo<br /> $ git remote rm origin<br /> $ git remote add origin git@git-server:bar.repo<br /> $ git push origin master</code></pre><br /></dd><br /><dt>Step 5: Remove the bar directory from the original foo.repo</dt><br /><dd><pre><code><br /> $ git filter-branch --tree-filter "rm -rf bar" --prune-empty HEAD</code></pre><br />or, supposedly faster (I haven't tried it):<br /><pre><code>$ git filter-branch --index-filter "git rm -r -f --cached --ignore-unmatch bar" --prune-empty HEAD</code></pre><br /></dd><br /></dl><br /><br /><h2>Reference</h2><br /><ul><br /><li><a href="http://bytebaker.com/2009/05/18/refactoring-my-personal-git-repository/">Refactoring my personal git repository</a></li><br /><li><a href="http://stackoverflow.com/questions/359424/detach-subdirectory-into-separate-git-repository">Detach subdirectory into separate GIT repository</a></li><br /><li><a href="http://toolmantim.com/articles/setting_up_a_new_remote_git_repository">Setting up a remote GIT repository</a></li><br /></ul>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com13tag:blogger.com,1999:blog-8947964.post-60272121688241353102009-06-15T15:28:00.001-05:002009-06-15T22:50:27.847-05:00Fedora 11: First impressionsFedora 10 (F10) was one of the best Fedora releases I've come to use. I liked it so much that I made it the default OS on my Mac Mini PPC. So naturally I was looking forward to the release of Fedora 11 (F11). But with a couple of release dates slipping, the wait was something of a torture. This of course is to be expected with Fedora, I was told, because all good FOSS community software follows the philosophy "IT WILL BE RELEASED WHEN IT IS READY!" 
(oh and "RELEASE OFTEN", but that's beside the point :).<br /><br />Now, before I give my first impressions, I must warn that if you're a die-hard Fedora fanboy, then please STOP reading.<br /><br />....<br /><br />Even if you're not, please note that these impressions are based on the Fedora Live CDs (both the standard and KDE-based ones) and I have yet to download the DVD release. I did not install the Live CDs but merely ran them live, and only for a short period of time (a little over an hour combined). I tested the Live CDs on both my older Acer TravelMate 4200 laptop as well as the relatively newer Macbook Aluminium Unibody.<br /><br /><h3>F11 on the Acer notebook</h3><br />The first thing I noticed was that <a href="http://kernelnewbies.org/Linux_2_6_29#head-e1bab8dc862e3b477cc38d87e8ddc779a66509d1">Kernel ModeSetting (KMS)</a> finally worked on this Intel GMA 950 GPU. If I recall correctly, with F10 the Intel drivers were broken with KMS. Unfortunately I found the KMS boot splash somewhat boring. Compared to F10, which really showed off the power of KMS with an awesome solar system animation (I saw it on an ATI GPU), F11 had a much simpler Fedora logo that filled up as the system booted. I'm sure there may have been a logical reason for simplifying the animation, but my first impression was that it needed more sex appeal.<br /><br />Once the system had booted and transitioned nicely to the login manager (thanks to KMS), followed by the desktop (after auto login), I found myself further disappointed aesthetically, this time due to the dull-looking wallpaper and theme. IMHO, F11's wallpaper is several steps back when compared to the beautiful wallpapers of F10, F9 or even F8. 
The desktop theme plays a key part in the first impression, so perhaps they should have stuck with the wallpaper they had during the beta release.<br /><br /><a href="http://www.flickr.com/photos/babytux/3629199965/" title="F11 beta desktop wallpaper"><img src="http://farm3.static.flickr.com/2396/3629199965_fe9518c38a.jpg" width="500" height="400" alt="F11-beta-desktop" /></a><br /><small>The F11 beta wallpaper wasn't so bad</small><br /><br />One thing I did like visually over the previous releases was the notification popup balloon.<br /><br /><a href="http://www.flickr.com/photos/babytux/3614488264/" title="f11_popup by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3624/3614488264_9dfbbf5a32.jpg" width="440" height="141" alt="f11_popup" /></a><br /><br />Beyond the looks, F11 felt like a very solid distro. Before F10, with earlier releases, I felt somewhat on unstable and shaky ground. F10 changed my perception, and F11 takes it a little further. <br /><br />While pretty much everything worked on my Acer notebook, there were a few noteworthy bad first impressions with the out-of-the-box experience. First up was touchpad tapping being turned off by default. Now, while I'm aware this is the default behavior on Windows, it is nevertheless a bad idea. Why disable a feature that most find useful? Other desktop-focused distros such as <a href="http://www.ubuntu.com">Ubuntu</a> or <a href="http://www.linuxmint.com">Mint</a> seem to have tapping turned on by default. Having said that, it's pretty easy to turn tapping on using the Gnome mouse preferences dialog. Unfortunately the same cannot be said of the KDE 4 Control Center. 
I was unable to find this setting on the Fedora KDE spin.<br /><br /><a href="http://www.flickr.com/photos/babytux/3614488258/" title="f11_mousepref by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2470/3614488258_1e0b461c15.jpg" width="404" height="500" alt="f11_mousepref" /></a><br /><br />Another default setting that I didn't understand was the 3D desktop (aka Compiz) being turned off. Considering Linux supports compositing on the Intel GPU quite well, I don't understand why Fedora does not want to give this cool first-impression feature (actually, for me Compiz is beyond cool, it's a necessity). The KDE spin did not even have the "Turn on desktop effects" helper application, and the one that is part of the KDE 4 Control Center was functional but slow.<br /><br />As far as the applications went, I was thrilled to see <a href="http://www.mozilla.com/en-US/firefox/all-beta.html">Firefox 3.5 beta</a> in there. Even though this was a beta release, most people have reported that it was stable and visibly faster than the previous version. Unfortunately, if you're on the KDE spin, you won't find Firefox at all! I'm also glad to see <a href="http://pidgin.im">Pidgin</a> back as the default IM instead of the cool but not so functional <a href="http://live.gnome.org/Empathy">Empathy IM</a>, the default on F10. The absence of <a href="http://openoffice.org">OpenOffice</a> on the LiveCD struck me as a definite minus, and the other office suites (neither Gnome Office nor KDE Office) are simply worthy default replacements.<br /><br />A complaint most often heard from the Windows crossover crowd is the lack of basic codecs for playing multimedia formats. Given the legal nature of shipping these codecs, most desktop distros try to simplify the "finding and installation" process by providing helper software. As far as my tests went, F11 was not friendly in helping me find and install some of the basic codecs such as mp3 or xvid. 
In contrast, Ubuntu seemed to do this reasonably well.<br /><br />Nevertheless, I saw improvements in functionality and stability with <a href="http://www.packagekit.org">PackageKit</a> - Fedora's answer to unifying package management in a consistent manner across distros. While it's exciting to see where this project is heading, there are still some fundamental usability issues that need to be fixed. For example, something as simple as the lack of a proper progress indicator when updating the package database can be frustrating to watch and wait on, especially over a low-speed broadband Internet connection. Another annoyance is the multiple dialog boxes that can appear during an install, all of them waiting for the package database to be released. What would have been better is a single dialog showing the task queue, with the current task highlighted and its progress indicated.<br /><br /><a href="http://www.flickr.com/photos/babytux/3613845277/" title="f11_waitingfortasks by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3629/3613845277_61c9b1e457.jpg" width="500" height="313" alt="f11_waitingfortasks" /></a><br /><br /><h3>F11 on the Macbook</h3><br />Next up was to test how F11 fared with my Linux-hating Macbook. As expected, I ran into many problems, most of which are not unique to F11 but are seen in pretty much all current Linux distros. Two things that did not work with F11 at all but partially worked with Ubuntu/Mint were the touchpad and wireless. On F11 the mouse cursor would not move at all, while on Ubuntu it can be used as a single-buttoned mouse (no tap support, but click works). The wireless also worked on Ubuntu using the free Broadcom driver, with the option to install the proprietary one. Not so in F11, and I had to use the wired connection to fix this. Sound did not work either, but neither did it on other distros. 
The lack of the <a href="http://www.technologeek.org/projects/pommed/">pommed package</a> in the F11 repo was another frustration in trying to get the touchpad and some multimedia keys working. But this is not to say it's impossible, and if you google around, you should be able to find a few HOWTOs. I was a bit disappointed that even with the latest Nouveau drivers that came with F11, it was still unable to boot via KMS or even auto-load the driver under Xorg. F11 seemed to fall back to the vesa driver instead.<br /><br /><h3>Conclusion</h3><br /><br />Overall, I'd recommend F11 for anyone fond of RPM distros, those thinking of upgrading from a previous version, or anyone who wants a bleeding-edge but rock-solid distro with the (almost) latest kernel and set of packages. Personally I prefer deb-based ones, but with YUM and PackageKit getting better, the difference is likely to disappear. I'm definitely going to upgrade to (or install from scratch) F11 on the Mac Mini PPC, because Fedora supports PPC quite well. If you're going the Fedora route, I really couldn't recommend the KDE spin, due to the lack of some important pieces and the poor integration. I would rather install the standard Gnome-based one and install KDE on it.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3tag:blogger.com,1999:blog-8947964.post-23746667485380831912009-06-06T10:07:00.002-05:002009-06-15T00:30:38.503-05:00Lights, Camera, Action!<a href="http://www.flickr.com/photos/babytux/3600872414/" title="UCSC.tv launch by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2422/3600872414_d9ca0363d8.jpg" width="500" height="417" alt="UCSC.tv launch" /></a><br /><br /><a href="http://www.sinhalenfoss.org">SinhalenFOSS</a>, the audio podcast we started a little over a year ago, is now available as a <a href="http://www.sinhalenfoss.tv">vidcast</a>. A couple of months back, I got a call from Dr. 
Ajantha and I was thrilled to hear of <a href="http://www.ucsc.tv">UCSC.tv</a>, the latest venture by <a href="http://ucsc.cmb.ac.lk">UCSC</a>. I was even more thrilled when Dr. Ajantha invited us to produce the SinhalenFOSS podcast on it.<br /><br />Despite our enthusiasm, we soon learned how hard it was to produce the show as a vidcast. We were used to recording the audio podcast on our own time, sometimes in the car, sometimes at home, at the office or even on the road. We would sometimes answer phone calls from loved ones in the middle of the recording and edit that out. Our podcast was pretty much a basement operation by 3 sweaty guys :)<br /><br />So I guess the most difficult task was to make us presentable on video. To that end, the UCSC TV crew has done a good job of making us up! The next challenge was time management. While our audio show was roughly 1 hour long, it wasn't strictly 1 hour long. Sometimes we went on and on for 1:45 hours, and at times we were done in 45 min. With the vidcast, we now have to be aware of time and wrap up when we see the 2-minute sign. To further complicate time management, we have to break the show up into 2 parts of 25 minutes each but make it appear to be one show on the audio podcast.<br /><br />While all this might seem like a list of complaints, it's actually been a blessing. We now have the opportunity to show visuals of news sites and software screens when doing reviews, as opposed to just talking about them. The sound quality has also improved vastly due to the studio setting.<br /><br />Unfortunately the vidcast is still not in line with the audio podcast, as the video post production <br />takes time and there are many shows on UCSC.tv. They're also still on a pilot run and looking for a partnership to broaden the reach of the online TV channel. 
As a result, at least initially, shows are likely to have a delay when they air.<br /><br />In addition to Sinhalenfoss, I'm also producing another geek talk show appropriately named <a href="http://tv.ucsc.cmb.ac.lk/mediadetails.php?key=5a531d660eeb202919ce&title=Geek+Katha+Epi1">Geek Katha</a>. On this show, we talk about gadgets and technology in Sinhala. Producing a gadget/tech talk show is a challenge and takes a lot of preparation, especially with the style of live recording we do. Once we start recording, it's a continuous recording for 25 minutes. The <a href="http://tv.ucsc.cmb.ac.lk/mediadetails.php?key=5a34242b7ebd586a5bfa&title=Geek+Katha+Epi3">PS3 show</a>, which will hopefully air next week, took a considerable amount of time to produce. But I enjoy producing both shows, and so far the response to both has been quite good.<br /><br />So finally I ask you to head on over to the <a href="http://www.ucsc.tv">ucsc.tv web site</a>, watch the shows and provide your valuable feedback and encouragement to take this effort forward. The success of this new venture depends on you.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com0tag:blogger.com,1999:blog-8947964.post-5926310239782306022009-05-01T12:56:00.006-05:002009-05-01T14:09:13.179-05:00Birthday gift<a href="http://www.flickr.com/photos/babytux/3492063236/" title="pcfix_dsc03406 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3395/3492063236_7705dd0c34.jpg" alt="pcfix_dsc03406" height="375" width="500" /></a><br /><br />Kanchana, my wife, had an old P4 machine from back in the days before we met. It was a <a href="http://www.pchouse.lk">PC House</a> branded G-Max. Anyway, she'd been wanting to fix it up and give it to her sister to use, so we figured it would be a good B'day gift for her. 
So last weekend, about this time, I was sweating it off trying to put it back together in time for April 25th, her B'day.<br /><br />Well, it began a couple of days earlier, the 23rd to be exact, when we picked it up from her parents' place, cleaned it up and tried plugging it in. For better or worse, nothing happened - nothing worked. At least this thing wasn't going up in smoke.<br /><br />I removed the casing, which was slightly more difficult than usual as a result of the case design. You couldn't pull out the motherboard because the CPU fan hits the CDROM, and you couldn't pull out the CDROM for the exact same reason. So I removed the power supply, slid the motherboard out and lifted it up once the CPU fan was in the clear.<br /><br />The fun part was cleaning off the layers of dust and the few cockroach eggs that could be seen on the chassis. After blowing at the board frantically, I knew I needed some sort of compressed air. Not having it, I improvised with my deodorant spray, which did the trick. After letting the deodorant alcohol dry, I went about setting up the system on top of a piece of wood. To my pleasant surprise it turned on, after shorting the jumper pins that would otherwise be connected to the power switch.<br /><br />As expected the CMOS battery was dead and needed replacing, which I did. I also thought it would be good to upgrade the CDROM to a DVD ROM/Writer which I had lying around. This proved quite difficult, as it was impossible to get the CDROM out from the back, and getting it out from the front was no easier. There was no straightforward way of getting the front blue panel off the casing, and I struggled for at least half an hour. Half an hour and a bucketful of sweat later, I just ripped it off, damaging just one rivet in the middle. 
Fortunately this wasn't a big deal, as I was able to upgrade the CDROM and reattach the panel.<br /><br />Timing was good, as <a href="http://releases.ubuntu.com/9.04/">Ubuntu Jaunty 9.04</a> had been released just a day earlier, so I slapped that on while wiping out Windows XP and Red Hat 9. Despite the machine being several years old, everything worked quite well, even with the 512MB of RAM it had. The machine booted in under 20 seconds thanks to the bootup optimizations found in the new Ubuntu. Unfortunately 3D (Compiz) did not work, even though it had built-in Intel 945 graphics.<br /><br />Since she was into media, I installed Inkscape, Skype, Elisa, Audacity, Sinhala stuff and the various codecs needed to get by. To top it off, we bought her a 17" ViewSonic flat panel display which cost only 13K - not bad IMHO.<br /><br />Overall she was quite happy and has somewhat gotten used to working on Linux. A couple of days ago she commented that it was becoming easier to use - easier than Windows XP.<br /><br /><a href="http://www.flickr.com/photos/babytux/3491385907/" title="pcfix_dsc03411 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3399/3491385907_d18cfbee84_t.jpg" alt="pcfix_dsc03411" height="75" width="100" /></a><a href="http://www.flickr.com/photos/babytux/3491385917/" title="pcfix_dsc03401 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3545/3491385917_dafd99e02e_t.jpg" alt="pcfix_dsc03401" height="75" width="100" /></a><a href="http://www.flickr.com/photos/babytux/3491385943/" title="pcfix_dsc03408 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3650/3491385943_ef0f9bf2b8_t.jpg" alt="pcfix_dsc03408" height="100" width="75" /></a>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com5tag:blogger.com,1999:blog-8947964.post-77206997269765041682009-04-17T19:28:00.001-05:002009-04-17T19:28:21.005-05:00Pulling along ...<p>Considering it's been exactly 1 year since my 
last post, I pondered a bit on a title for this post. "The comeback", "Still alive and kicking" or "I'm back!" sprang to mind. But then I realized I'd be fooling myself and you if I even remotely meant this blog was going to be regularly updated. Hence a title which better reflects the current state of my blog and, to an extent, my life - pulling along a day at a time. <br /><br />Now don't get the wrong impression - I'm not depressed! As some of you already know, my life's gone through a few transformations. Hmm, let me see: first there was the marriage, then moving to our own (rented) place, losing my iPod touch (oh the drama), moving again to a bigger house, moving again to a new office, moving away from<br />Gentoo, moving back to Gentoo, buying a new iPod touch 2G, getting a new MacBook, buying a few PS3 titles including Guitar Hero 3, and buying an actual guitar and taking up lessons.<br /><br />Ok so that last para covers the highlights that come to my mind at this sweaty early hour of 1:42 AM. For the juicy details leading up to now, check out my Twitter timeline. Basically I've been a geekaholic all along, you just didn't see it here. <br /><br />Besides the random stuff, I've been at least (somewhat) keeping up with the SinhalenFOSS podcast, to the point where we have on average 2 shows a month, which is what we planned initially before going for a weekly format that we found difficult to stick to - pulling along. <br /><br />My usual old reason for blogging less was microblogging, and it still holds true. The one thing that will replace microblogging is telepathy, and I suspect we're a couple of decades away from seeing that. <br /><br />zzzzzzzzzz ....<br /><br />5.45am<br />Looks like I fell asleep while typing on the iTouch. Anyway, I conclude by asking you to watch this space while I pull my way back into blogging. 
<br /></p><br /><p>Posted with <a href='http://lifecast.sleepydog.net'>LifeCast</a></p><br />geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3tag:blogger.com,1999:blog-8947964.post-7335912240273124282008-04-17T13:46:00.004-05:002008-04-17T15:42:29.990-05:0024hrs Sinhala Blog Marathon Starts Today!The <a href="http://sinhalablogs.com/">Sinhala Bloggers Union</a> is organizing a 24hrs Blog Marathon starting from 8pm local time today, the 18th, to the 19th. There are currently over 30 participating blogs, including our sister (or brother, depending on where you're from :) site <a href="http://sinhalenfoss.org">sinhalenfoss.org</a>. Each blogging site will be trying to reach a crazy difficult target of 96 posts! In other words, we're talking about a blog post appearing from each site every 15 min! Multiply that by 30+ sites and boy, I hope the net doesn't crash :P<br /><br />The easiest way to catch all the fun is to constantly reload <a href="http://marathon.sinhalabloggers.com/">http://marathon.sinhalabloggers.com/</a>, the official blog aggregator (syndicator) for the event. But it doesn't stop there. 
These guys have created a <a href="http://apps.facebook.com/sinhalablogs/">custom Facebook application</a> for the FB junkies, a <a href="http://twitter.com/BlogMarathon">Twitter channel</a> for Twitter addicts like myself, a <a href="http://tinyurl.com/5h8q35">Yahoo Pipes mashup</a> for the web services oriented, and even IM-based updates.<br /><br />If you're interested in blogging in Sinhala Unicode, it might not be too late to enroll yourself by <a href="http://www.sinhalabloggers.com/contact-us">contacting them</a>.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com1tag:blogger.com,1999:blog-8947964.post-68829624052928053712008-04-14T03:17:00.006-05:002008-04-14T04:32:41.329-05:00Best Last place to listen to Music<a href="http://www.flickr.com/photos/babytux/2412406449/" title="lastfm by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3122/2412406449_791ea0f307_m.jpg" width="240" height="197" alt="lastfm" /></a><br /><br />If you're a music lover also inclined to discover new (& old) music, then <a href="http://www.last.fm">last.fm</a> is the place to be. <a href="http://www.last.fm/user/budks/">I signed up</a> a couple of months back and have really been enjoying music surfing. That's right! Similar to channel surfing, last.fm is about discovering music based on what you like and what others who are like you like. There is plenty of research in this field, <a href="http://blog.last.fm/2007/10/01/lastfm-community-enables-music-research">some of which</a> is headed by the folks at last.fm, but what it essentially means is a good trip down music lane :)<br /><br />You can sign up for a free account and search based on an artist or a tag such as <a href="http://www.last.fm/tag/pop">pop</a>, <a href="http://www.last.fm/tag/rock">rock</a>, <a href="http://www.last.fm/tag/jazz">jazz</a> or <a href="http://www.last.fm/tag/female%20vocalist">female vocalist</a>. 
In any case, your search result becomes a virtual radio station which goes off on all sorts of tangents, discovering similar artists, tags and so on. When you hear a song you really like, you can express your pleasure by marking it as loved, or display your hate by banning it, and you shall never hear it again. You can also skip tracks, unlike with web streaming radio, but cannot seek within a song, since it isn't meant to be a "music on demand" sort of service but more of a personalized radio-like service.<br /><br />The killer feature is <a href="http://www.last.fm/help/">scrobbling</a>, which you can get by downloading and installing the <a href="http://en.wikipedia.org/wiki/Free_and_open-source_software">Free/Open Source</a> <a href="http://www.last.fm/download/">last.fm client</a> on your computer or a portable device such as an iPod touch/iPhone or <a href="http://en.wikipedia.org/wiki/Nokia_N800">N800</a>. What scrobbling essentially does is track the music you play and upload that information to the last.fm server against your profile, both to improve the kind of music you hear and to provide user listening patterns and <a href="http://www.last.fm/charts">charts</a>. <br /><br />The iPod touch/iPhone client, for those who've managed to <a href="http://www.geekaholic.org/2007/11/hacking-ipod-touch-part-1.html">jailbreak</a> their device, is simply awesome. I am so hooked on listening to music wirelessly while lying in bed, marking tracks I like, checking out lyrics and information on the artist using the integrated wiki lookup, as well as checking artist tour dates and locations (not that I'll ever go, but it's fun). Unfortunately the desktop clients seem to lack some of these finer features for the time being.<br /><br />So if you're a music lover, go check it out. It's Web 2.0 done for music, where you can form your own little friend network and discover music together. 
If you're really hooked on it, it might even be worth dishing out a couple of extra bucks for the value-added features the network provides.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com0tag:blogger.com,1999:blog-8947964.post-54697700132340714132008-04-09T23:56:00.002-05:002008-04-10T01:05:38.955-05:00Sinhalen FOSS BlogcastRecently I've been trying to start a Blog and Podcast as a means of reducing the language barrier when it comes to learning and playing around with Free and Open Source Software, aka FOSS. So a couple of months back I started the <a href="http://sinhalenfoss.wordpress.com">Sinhalen FOSS Blog</a> at Wordpress.com. The idea of the blog is really to introduce various aspects of FOSS such as HOWTOs, command-line tips/tricks etc. and provide a forum for other Bloggers to contribute related articles. In this regard, I am grateful to <a href="http://lakwarus.wordpress.com/">Lakmal</a> and Kanchana for their contributions, and hope more of you will take up the challenge of <a href="http://si.wikipedia.org/wiki/Wikipedia:Sinhala_Font_Guide">installing Sinhala Unicode</a> and getting into the habit of remembering to write in Sinhala :). Yes, it can be a pain, especially with the current state of input methods. Yes, there are always complexities in writing proper Sinhala. But as far as I'm concerned, writing something is better than not trying at all. In this regard, I'd have to thank my mom, who is my personal Sinhala specialist (she's got a degree in it), and <a href="http://dhananjayapa.blogspot.com/">Dhananjaya</a>, who is an upcoming singer/songwriter (not to mention our accountant).<br /><br />So the next natural evolution from text is to get into audio. Actually that's not really the case; for about a year now I've wanted to start podcasting. I myself am (me me me) a huge fan of podcasts and listen to over 10 podcasts while on the road on my iPod. 
Besides, I've always had a thing for talking, so finally, with the help of 2 more geeks, <a href="http://chanux.wordpress.com/">chanux</a> and <a href="http://seejay.net/">seejay</a>, I've managed to get over the inertia of getting into podcasting.<br /><br />With the launch of the podcast, I've decided to get it its own domain, so from now on both the Sinhalen FOSS Blog and Sinhalen FOSS podcast will live at <a href="http://www.sinhalenfoss.org">http://sinhalenfoss.org</a>. We'd like to call it a BlogCast because it's a fusion of both those elements. Talking about the show, it's going to be bi-weekly (hopefully) at first, and depending on the feedback and our time availability we might go into a weekly format. The show will roughly be an hour long, where we will talk informally about news events in the FOSS world, reviews, gossip and whatnot in Sinhala.<br /><br />So go <a href="http://www.sinhalenfoss.org/?p=4">check out the pilot episode</a> and give us feedback on how to improve and what you'd like to hear more or less of.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com0tag:blogger.com,1999:blog-8947964.post-82235283560724847712008-04-01T06:20:00.007-05:002008-04-01T23:03:49.674-05:00Good bye Gentoo - Hello Leopard<strong>The following post was an <font color="orange">April Fools Hoax</font> :) While bits of it are true - I did finally move away from Gentoo as my primary distro - it's got more to do with a lack of time and laziness to reinstall Gentoo. Hopefully I'll do another post on my move to Kubuntu (Ubuntu + kubuntu-desktop package). I did find a few annoyances, but one should expect that when running beta software. 
I am happily using Kubuntu with full desktop effects that are far superior to those of Apple or Microsoft.<br /><br />I did buy the Mighty Mouse and Apple keyboard as mentioned; I wanted a good portable Bluetooth device, and they worked effortlessly (almost :) on the latest Ubuntu 8.04.<br /><br />No Leopards were harmed in creating this hoax, as the desktop you're seeing on my Acer laptop is actually a full-screen screenshot sent to me by Siraj using his Mac notebook, which I loaded before taking the picture.<br /> </strong><br /><br />It's been a little over 3 years since I switched from Debian to Gentoo. I still remember my blog post titled "<a href="http://www.geekaholic.org/2005/02/goodbye-debian-hello-gentoo.html">Goodbye Debian, Hello Gentoo</a>" like it was yesterday. Gentoo was a gr8 distro for me, since my machine ran like a speeding bullet as a result of being able to optimize it for my old P4 notebook at the time. Actually, before loading Gentoo on my notebook, I ran it on my AMD64 barebone system as a 64-bit OS. I later installed Gentoo on my Mac Mini PPC as well and was loving it.<br /><br />But then came the painful reality of upgrading packages, and I would spend a lot of time on the PPC machine and the AMD64 machine downloading and compiling packages. You see, unfortunately, in most cases downloading source code takes up more space than the compiled binary, and compiling it on an old PPC system or even an old AMD64 (2GHz equivalent) was a killer. But I must say I enjoyed it - until of course the system was still half broken and I had to do a revdep-rebuild to rebuild those packages.<br /><br />Eventually I gave up running Gentoo on those systems and started running Ubuntu or Debian. Till recently my Apple PPC G4 was running Debian, but when I upgraded to Leopard (aka Mac OSX 10.5) it replaced the boot loader, so now it's just running on that. 
My AMD64 is running Ubuntu 64 with a Leopard-like theme by following the <a href="http://www.howtoforge.com/mac4lin_make_linux_look_like_a_mac_p6">instructions at Mac4Lin</a>.<br /><br />So basically, till last weekend I was only running Gentoo on my main notebook and, funnily enough, it was the same installation I did 3 years ago on my HP P4 notebook. You see, when I got this new Acer, I was too lazy to re-install Gentoo, so I copied the files across and re-emerged everything. And that's how Gentoo is. You can rebuild the whole system and have it like a new one. But for things to work right, sometimes you end up recompiling a couple of times, only to find that by the time you're done it's time to upgrade again :) But more recently I had a couple of apps that just crashed, like wvdial and X with Compiz.<br /><br />So last weekend, I carefully backed up my data and started installing Ubuntu. Since I was already using xorg 7.3 and whatnot, I thought I'd install Ubuntu+1 (aka 8.04 beta). It's one thing to switch from Gentoo to Ubuntu, but another to also switch from KDE to GNOME. I've been a KDE user for over 6 years! So I did the apt-get install kubuntu-desktop. Oh boy, Kubuntu is not so ready. Compared to Ubuntu I find it buggy and not polished at all.<br /><br />After spending the whole weekend on it, I managed to get something which sort of resembles my favorite desktop - Leopard.<br /><br /><a href="http://www.flickr.com/photos/babytux/2377722787/" title="compiz-awn by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2209/2377722787_153d1f1d33_m.jpg" width="240" height="150" alt="compiz-awn" /></a><br /><br />But it was such a pain. Whenever I rebooted the machine, Kubuntu would not properly start desktop effects. I spent a couple of hours trying to figure this out and even went on IRC #kubuntu, but no luck. Finally I nailed it down to some sort of a dbus error. And that was the last straw! I've had it with spending time messing around with my OS. 
The reason I left Gentoo was to have an easier time enjoying the desktop, and now it's the same thing with Kubuntu!!!<br /><br />All this happened yesterday, and I was quite frustrated with Kubuntu and Linux in general. I know Linux has come a long way and it's still a great OS, but when all you want is to "just do it", Linux tends to fail more often. So, feeling sweaty and sticky by then, I went for a shower. While showering it occurred to me that I was really more comfortable with the Mac OSX GUI than with anything else. If you've been following my blog, you would remember that I wrote several posts, from even before Compiz/Beryl was invented, on making the <a href="http://www.geekaholic.org/2005/12/obsessed-with-mac-on-linux.html">desktop look more like Mac OSX</a>. I remember a couple of years back I managed to install a hackint0sh version of <a href="http://www.geekaholic.org/2005/11/trip-to-mac-osx-on-intel.html">OSX Tiger on one of the IBM Thinkpads</a>, so yesterday I started looking into it. I had the ISO for Tiger but wanted to install Leopard instead. Luckily I had the ISOs for Leopard too :) since I had upgraded my Mac Mini with it, so after following the instructions at <a href="http://dailyapps.net/2007/10/hack-attack-install-leopard-on-your-pc-in-3-easy-steps/">this site</a> I managed to get OSX Leopard running on my Acer. <br /><br />Pretty much everything works great: sound, 3D acceleration, networking, Wi-Fi and even Bluetooth. I recently bought an Apple Mighty Mouse, since I wanted a Bluetooth wireless mouse, and it works great. It was so much easier to configure on OSX than on Linux.<br /><br />So it looks like, for the time being until I have a bit more time on my hands, I am leaving the OS I spent the last 8-9 years using for an OS I've aspired all other OSes should become. 
Apple has always been very innovative, and I think the desktop OS battle will be won by them in the end.<br /><br />I am still hoping to use a lot of <a href="http://www.opensource.org">Open Source</a> software from within Mac OSX. When I get more time I might even look into the wonderful <a href="http://www.opensource.apple.com/darwinsource/">Darwin system</a>, which is a truly Open Source kernel based on <a href="http://en.wikipedia.org/wiki/Berkeley_Software_Distribution">BSD Unix</a>. I am also very keen on getting into <a href="http://developer.apple.com/cocoa/">Apple's Objective-C based Cocoa</a> development using <a href="http://developer.apple.com/tools/xcode/">Xcode</a>, since it will provide me with the means to develop <a href="http://developer.apple.com/iphone/">iPhone apps using the SDK</a>. I also hope to buy an iPhone once the 3G version is available. Wooh! Can't wait!<br /><br />Anyway, I wrote a lot, and I'm quite excited and hope to bring you guys more articles on using Mac OSX!<br /> <br /><a href="http://www.flickr.com/photos/babytux/2379397701/" title="Leopard on my Acer by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2273/2379397701_8b2217eb49.jpg" width="500" height="375" alt="Leopard on my Acer" /></a>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com4tag:blogger.com,1999:blog-8947964.post-56566665038266976022008-03-15T12:49:00.003-05:002008-03-15T14:04:45.289-05:00Wicked Shell Programming Workshop @ UCSCWe had a nice couple of sessions on shell programming today at the <a href="http://www.opensource.lk">LSF lab</a> @ <a href="http://ucsc.cmb.ac.lk">UCSC</a>. <a href="http://www.sayura.net/anuradha/">Anuradha</a> started off with a great introduction that set the stage for the rest of the day. I followed with my invent-as-I-go style of presentation, which for the most part I believe (hope) made sense. 
Then Sapumal wrapped up with an advanced evening session bringing it all together.<br /><br />There was a good geek crowd that filled the small LSF room, where geeks were seen on chairs, couches, on top of tables and on their feet. All of the speeches and some interludes were recorded and could go up on YouTube if and when someone gets around to it. Overall it was a good first day, and more will follow tomorrow.<br /><br />I have a session in the evening and will try to pop in and out between my usual Linux lectures at UCSC and the workshop. In the meantime, enjoy these few pics I took from day 1.<br /><br /><a href="http://www.flickr.com/photos/babytux/2334918275/" title="Shell workshop 2008 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3284/2334918275_5e65a3812d_s.jpg" width="75" height="75" alt="Shell workshop 2008" /></a><a href="http://www.flickr.com/photos/babytux/2334919623/" title="Shell workshop 2008 by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2239/2334919623_193cf1e3a2_t.jpg" width="100" height="86" alt="Shell workshop 2008" /></a><a href="http://www.flickr.com/photos/babytux/2335745712/" title="Shell workshop 2008 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3199/2335745712_5b35821e70_t.jpg" width="100" height="78" alt="Shell workshop 2008" /></a><a href="http://www.flickr.com/photos/babytux/2334910665/" title="Shell workshop 2008 by Bud's photo blog, on Flickr"><img src="http://farm3.static.flickr.com/2028/2334910665_9d14c7f21d_s.jpg" width="75" height="75" alt="Shell workshop 2008" /></a><a href="http://www.flickr.com/photos/babytux/2334915707/" title="Shell workshop 2008 by Bud's photo blog, on Flickr"><img src="http://farm4.static.flickr.com/3147/2334915707_1b32e4cfff_t.jpg" width="100" height="77" alt="Shell workshop 2008" 
/></a>geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3tag:blogger.com,1999:blog-8947964.post-39213750148639860212008-02-24T07:30:00.002-06:002008-02-24T07:45:44.541-06:00Our first IEEE paper has been published!<a href="http://www.ucsc.cmb.ac.lk/index.php?option=com_content&task=view&id=98&Itemid=1">Dr. Ajantha</a> came back from the <a href="http://www.icact.org/">"International Conference on Advanced Communication </a>Technology" after submitting what appears to be our (that is, mine and <a href="http://wathsalav.blogspot.com/">Wathsala's</a>) first paper at an IEEE conference. The subject of the paper, which you can <a href="http://babytux.org/publications/11e-02.pdf">download and read here</a>, is a Next Generation Proxy caching system which fuses the ideas of cached content and bandwidth utilization with web 2.0/x.0 trends.<br /><br />These ideas are the result of what we learned by implementing <a href="http://bassa-blog.blogspot.com">Bassa</a>, an Open Source Next Generation Proxy Server (NGP), at <a href="http://ucsc.cmb.ac.lk">UCSC</a>. We are very excited about continuing our research and development to expand its scope.geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com0tag:blogger.com,1999:blog-8947964.post-45909938545634969242008-02-23T07:10:00.000-06:002008-02-24T07:30:37.109-06:00Just another day @ the Sri Lanka Customs OfficeOh boy, what a day it was! I spent a good 5 hrs at the Sri Lankan Airport cargo office on Friday trying to clear a <a href="http://www.memoryx.net/kttmd133256.html">tiny 3" piece of memory</a> which I had ordered for my Toshiba L1 laptop.<br /><br />I didn't really expect this to be held by customs in the first place, considering its size, weight, cost and the fact that it was a 256MB chip (quite outdated by today's memory standards). 
I was also misled by FedEx's tracking comments, which gave no indication of the shipment being held at customs but stated that it was on its way for delivery. It was only when the courier guy came and handed me a letter that I knew what had happened. What was more amusing was the next update on FedEx's tracking site - Goods delivered.<br /><br />I called up FedEx, and they said that in order for them to clear it, I had to apply for a VAT number, which seemed like an unnecessary hassle. They did suggest I go to customs and try to sort it out myself - thank you FedEx, for getting me FedUp!<br /><br />So I made my way to Katunayake, which turned out to be a real Katu (needle) trip :) Granted, having to go through a strip search of the vehicle and get two passes, one for the person and another for the vehicle, is understandable considering the current security situation. What really ticked me off was the series of events that happened after I got in.<br /><br />To be continued ...geekaholichttp://www.blogger.com/profile/16681603430019235684noreply@blogger.com3