curl request and return headers only

The UNIX command line tools just keep on giving. In web development I often find myself wanting to quickly debug a URL: to see whether it’s alive or what the response is. Often I do not want to download the whole content (a large file, for example). Before I learned the following, I would use Chrome’s Developer Tools. That is, until I learned how to do it more quickly and efficiently with good old curl:

curl -I

Which returns something like:

HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Sat, 27 Jun 2015 17:27:17 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive

It’s especially handy when setting up and testing temporary or permanent HTTP redirects. Doing that in a browser can be cumbersome due to caching.
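For redirect testing there are a few related flags worth knowing. A quick sketch (the URL is just a placeholder):

```shell
# Headers of the first response only (curl -I sends a HEAD request):
curl -I http://example.com/old-page

# Follow redirects and print the headers of every hop in the chain
# (-s silences the progress bar, -L follows Location headers):
curl -sIL http://example.com/old-page

# Just the final status code - handy in scripts:
curl -sIL -o /dev/null -w '%{http_code}\n' http://example.com/old-page
```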

Faster SSH workflow with multiplexing

I was reading The Art of Command Line (great stuff) and tried the SSH configuration tips. With the below config I noticed a considerable speedup in various SSH- and Git-related workflows. My ~/.ssh/config now includes:

Host *
  ControlMaster auto
  ControlPath /tmp/%r@%h:%p
  ControlPersist yes

Speed improvements I noticed:

  • I push my code to the remote often. Thanks to the keep-alive options, the connection is kept open and subsequent pushes do not incur the penalty of establishing a new connection.
  • The same applies to server provisioning and maintenance. Once the initial connection is established, it is kept alive, and sessions opened in a new terminal tab or window begin instantly.
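One thing to be aware of with ControlPersist yes: the master connection stays open indefinitely. OpenSSH’s -O flag lets you inspect and tear down the shared connection when needed (the host name below is a placeholder):

```shell
# Is there a live master connection for this host?
ssh -O check myserver.example.com

# Cleanly shut the shared master connection down:
ssh -O exit myserver.example.com
```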


Speeding up bundle install with in-memory file system

On some of the servers I work with, due to cheap hard drives in a software RAID configuration, I’ve found that bundle install can be extremely slow (taking half an hour to complete). This obviously became unacceptable during deploys.

I thought that it might have something to do with how bundler writes a lot of small files during the installation of the gems. So I decided to try putting the deploy bundle directory (where all the gems are being installed) onto the in-memory filesystem. On Ubuntu this is /dev/shm.

It works flawlessly. The install time improved from half an hour down to a few seconds. After the bundle install is complete, however, we do not want to leave the installed gems in memory, as they would be purged on restart. So we just copy the directory back to disk. Strangely enough, copying the whole directory from /dev/shm does not thrash the disk much, and it only takes up to a minute for a few hundred MB of gems.

It’s cool to find and utilize such a simple part of Linux to work around a slow-hardware problem, while for everything else the server does, it remains perfectly usable and more than capable.
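To see this for yourself: on most Linux systems /dev/shm is a RAM-backed tmpfs mount, and a quick small-file benchmark makes the difference tangible. A sketch (paths and counts are arbitrary):

```shell
# Confirm /dev/shm is a tmpfs (RAM-backed) mount:
df -h /dev/shm

# Rough feel for small-file write speed: create 1000 tiny files there.
dir=$(mktemp -d /dev/shm/bundle_test.XXXXXX)
time for i in $(seq 1 1000); do echo "gem $i" > "$dir/file_$i"; done
ls "$dir" | wc -l   # should print 1000
rm -rf "$dir"
```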

Here’s the Capistrano 3 lib I use in my deploys to integrate this speedup:

namespace :bundler_speedup do
  task :symlink_to_shm do
    on roles(:all) do
      bundle_shm_path = fetch(:bundle_shm_path)
      # Make sure the bundle dir exists
      execute "if [ ! -d #{shared_path}/bundle ]; then mkdir #{shared_path}/bundle; fi"

      # TODO: what if #{shared_path}/bundle is a symlink - meaning an interrupted install from a previous run?

      cmds = []
      # Copy the bundle dir to /dev/shm/
      cmds << "cp -r #{shared_path}/bundle #{bundle_shm_path}"
      # Move the shared bundle dir aside and symlink the shm dir in its place
      cmds << "mv #{shared_path}/bundle #{shared_path}/bundle.old"
      cmds << "ln -s #{bundle_shm_path} #{shared_path}/bundle"
      # We're ready to do a fast in-memory bundle install now...
      execute cmds.join(' && ')
      info "shared/bundle was copied to /dev/shm for in-memory bundle install"
    end
  end

  task :remove_from_shm do
    on roles(:all) do
      bundle_shm_path = fetch(:bundle_shm_path)
      cmds = []
      # Copy the shm bundle back into shared under a temporary name
      cmds << "cp -r #{bundle_shm_path} #{shared_path}/bundle.new"
      # Remove the symlink and move the on-disk dir into place
      cmds << "rm #{shared_path}/bundle"
      cmds << "mv #{shared_path}/bundle.new #{shared_path}/bundle"
      # Remove the in-memory bundle and the backup
      cmds << "rm -rf #{bundle_shm_path}"
      cmds << "rm -rf #{shared_path}/bundle.old"
      # Bundle is persisted and in place
      execute cmds.join(' && ')
      info "shared/bundle was restored from the bundle install within /dev/shm"
    end
  end
end

before 'bundler:install', 'bundler_speedup:symlink_to_shm'
after 'bundler:install', 'bundler_speedup:remove_from_shm'

namespace :load do
  task :defaults do
    set :bundle_shm_path, -> { "/dev/shm/#{fetch(:application).gsub(' ', '_').downcase}_bundle" }
  end
end

In a Rails project, place it in lib/capistrano/tasks/bundler_speedup.rake. Capistrano should auto-load this for you.

This code is released under the MIT license.

Effort – Personal To-do and Project manager

I have open sourced a Rails app that I’ve been personally using for years. The code is available on GitHub under the MIT license. From the README:

I’ve modeled this app for my own personal use, note keeping and personal project management, loosely after Basecamp. The single most important point for me is to have to-do lists that work in a particular way – that’s why I’ve built this for myself.

I am open-sourcing it to see if somebody finds it useful and can maybe build on it. Let’s see what happens.

This is a standard Rails 4 app, built the “Rails way”. Test coverage is minimal, just enough for the purposes of this app at this stage.

Effort to-do lists
To-do lists – the most important and most used part of this project.

Installing Windows 8.1 on 2009 Mac mini

Today I was busy refurbishing an old 2009 Mac mini, software-wise. It’s such a nice device and it still runs well, apart from the dead DVD-ROM drive. Until now it was running Windows XP, which is no longer supported by Microsoft, so it was time to upgrade. I bought a fresh copy of Windows 8.1 and did a native install, without Boot Camp. It took quite a few attempts to figure out which combination of disk formatting and architecture (x86/x64) this old Mac mini can handle.

I followed this guide, but figured out through trial and error what works for this particular machine. The biggest difference is that I ended up installing the x86 (32-bit) version of Windows on an MBR-type disk partition scheme. The other combinations mentioned in the article resulted in the machine not being able to boot.

My guide

  1. Power on the Mac, hold down Alt to be able to select the startup disk.
  2. The Windows installation DVD should be in your Mac’s drive or an external DVD drive (both will work).
  3. Select this DVD to boot from, do not select any UEFI options.
  4. Once the installer starts, fire up the command line (Shift+F10) and prepare the disk with diskpart (careful: clean wipes the disk, which convert mbr requires; this is fine here since we are installing fresh):
diskpart
select disk 0
clean
convert mbr

This is where we diverge from the guide linked above: we convert the disk to the older MBR partition scheme, as this is what our 32-bit Windows needs to work on this Mac. Once this is done, you can exit the command line and continue with the installer as normal.

The only thing that did not work for me after Windows was installed is the built-in sound card. I ended up using an external one that was lying around.

Windows 8.1 is surprisingly snappy on the mere 2 GB of RAM this Mac mini has, and overall the machine is a joy to use for some office work, which is its purpose.

Disclaimer: follow this guide at your own risk.

Sketch replaced Photoshop in my web-design workflow

I am really happy that I stumbled upon an article (I can’t find it now) comparing Sketch to Photoshop. It convinced me to give Sketch a try for the web-design part of a project I was working on. I downloaded the trial version, went through a few tutorials and quickly saw the potential to greatly improve my workflow. Sketch is clearly a tool that was developed with the web in mind from the start. It paid off: once I started designing the website, its UI, logo and typography, I was impressed at how much faster I could accomplish things compared to Photoshop.

Better tool opens up time for experimentation

What I did not necessarily expect at the beginning is that the time and effort Sketch saves me in getting an idea or vision into the computer as a graphic design frees up time for experimentation and fine-tuning. It’s so much easier to select all the elements of the same type on a webpage design and adjust the border, shadow or size in one go (just one of Sketch’s many features). As a result, I produce work that I am more satisfied with.

Vectors are everywhere

In Sketch, everything is a vector. Vector graphics is not something I had much experience with. In the past I used Illustrator only on a few occasions, never becoming comfortable enough with it to make it my first stop when I needed a logo or a similar graphic asset for a project. So this time I gave myself a specific task to finish, unrelated to any work project I was busy with. The whole point was to push myself to improve my vector skills and produce something tangible, not just play aimlessly with shapes and colors. I created this African mask illustration and made it available on Envato (which is also something I had wanted to try for a while).

African Mask


Learning this new tool has given me a faster and more effortless web-design workflow, and I have Sketch to thank for it.

Script to update PhpBB 3.0.x to 3.1.x

Recently I had to upgrade a dozen PhpBB boards to the latest version. Previously I would do this by hand, which would take days. This time, while reading through the update notes, I noticed that it is possible to update the database through the console (I assume this was only introduced in 3.1). That was the prerequisite I needed to automate everything. I ended up with the following script:

#!/usr/bin/env bash
set -e

echo "Upgrading PhpBB install in '$1' to 3.1 with files from '$2'..."

cd "$1"

echo "Deleting old files..."
shopt -s extglob
rm -r !(config.php|images|files|store)
shopt -u extglob

echo "Copying in 3.1 files..."
rsync -a --exclude='.git' "$2"/ "$1"/

echo "Migrating the database..."
php ./bin/phpbbcli.php db:migrate --safe-mode

echo "Removing the install dir..."
rm -r install

echo "Done."

The usage is as follows (assuming you named the script phpbb-update-to-3.1):

phpbb-update-to-3.1 /www/forum-to-update /tmp/phpbb-3.1-files-for-upgrade

The second parameter is the path to where you have prepared your 3.1 files, according to this guide. You should put your custom styles in this folder too.

If you have custom folders or files within your current install, make sure you add those to the rm -r !(config.php|images|files|store) line in the update script. This rm deletes everything except the entries listed within the parentheses.
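Since this rm is destructive, it’s worth previewing what the extglob pattern matches before running it for real. A sketch in a throwaway directory (the file names are made up):

```shell
#!/usr/bin/env bash
tmp=$(mktemp -d)
cd "$tmp"
mkdir images files store
touch config.php common.php index.php

shopt -s extglob
# echo expands the same pattern rm would, without deleting anything:
echo !(config.php|images|files|store)   # common.php index.php
shopt -u extglob

cd / && rm -rf "$tmp"
```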

This saved me a lot of time and prevented mistakes. Enjoy!

Bash: Underscore a String

I find it weird that a Google search did not immediately turn up a ready-made script for this. This is what I arrived at after some digging:

#!/usr/bin/env bash
read -r input
str="$(echo "$input" | tr '[:upper:]' '[:lower:]')"
printf '%s' "${str// /_}"

Use it with stdin:

echo "Underscore Me" | underscore

I use this in TextMate when I need to convert a normal English string into a parameter/key.
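On bash 4 and newer, the tr step isn’t needed either: the ${input,,} expansion lowercases the string, so the whole thing can be done with parameter expansion alone. A variant sketch:

```shell
#!/usr/bin/env bash
# bash 4+ only: ${input,,} lowercases, ${str// /_} swaps spaces for underscores
read -r input
str="${input,,}"
printf '%s' "${str// /_}"
```

Usage is the same: echo "Underscore Me" | underscore prints underscore_me.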

Automate everything

Automating everything, even the smallest things (scripts), is a lesson I keep learning while programming.

Let’s take MySQL and setting up replication, for example. For years I would rely on googling and then copy-pasting the same few SQL commands to configure the master and the slave and start the replication. Fast-forward to now: while crafting the Percona Docker container, I finally decided to encapsulate the process into two simple-to-use-and-remember commands.

Now, I can just exec into my MySQL container and type:


That gets me the SQL to execute on the master, tailored to the particular configuration. Once I execute that SQL on the master, I finish the whole “process” and start the replication with something like:

start_replication mysql-bin.000001 107

Where the two arguments are the binlog file and position from which the replication should start.

When I first used this in production, the experience was night and day compared to looking up the correct queries and modifying them to fit my setup. Thus: automate everything. It pays off every time you need it afterwards.
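The actual commands baked into my container are not reproduced here, but to illustrate the idea, here is a hypothetical sketch of such a start_replication helper that only prints the SQL it would run on the slave (the host, user and password are placeholders, and the function body is my own illustration, not the container’s code):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: print the replication SQL for a given binlog position.
start_replication() {
  local binlog_file="$1" binlog_pos="$2"
  cat <<SQL
CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='secret',
  MASTER_LOG_FILE='${binlog_file}',
  MASTER_LOG_POS=${binlog_pos};
START SLAVE;
SQL
}

start_replication mysql-bin.000001 107
```

A real version would pipe this into the mysql client on the slave instead of printing it.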