Dave Rapin

Mobile and Web Development

Learning VIM… Finally!

I had the opportunity to work with a bunch of TDD vim hackers over the last 4-5 months at Nulogy, and after some initial resistance I decided to jump in and learn vim. My motivation is to see if it really lives up to the hype, and more importantly because it’ll make me look like a genius to the layman watching my screen (i.e. Gary Bernhardt).

I’ve tackled vim a few times over the years, but never fully committed to it. Since I believe you need to fully commit to something to do it well (including learning anything), I decided to improve my touch typing first so I could really see the benefits of vim.

Last week I achieved my typing goal of 75 words per minute, inspiration courtesy of Steve Yegge. I’ve since upped my goal to 90 wpm, but I think I’m at least quick enough now to really immerse myself into learning vim.

I’m starting with Peepcode’s Smash Into Vim, this Yehuda post, and of course the built-in vimtutor.

Rock on!

Code School - Rails Best Practices Reviewed

Yesterday I ran through the “Rails Best Practices” course over at Code School.

I really like this interactive format. You watch a video / screencast covering a topic or set of topics, then you’re required to code up some exercises reviewing the video’s material before you can move on to the next topic.

Overall the format worked really well. The gamification (points etc.) didn’t make a difference for me, but it might be a motivator for some people. I do think they’re on to something here, and from the looks of it (and from the word “marketplace” in their tagline) they’ll be refining this into a platform for use with other third-party content.

The online editor was actually really well done. It’s not vim or emacs obviously, but it’s not super kludgy like you’d expect, so it works well enough for the small amount of material you’re covering.

I think there’s an opportunity to add some social features so students can help each other when they get stuck on the harder topics.

As for the content of the “Rails Best Practices” course itself, you can get the same material from http://rails-bestpractices.com/; however, the Code School environment was enjoyable enough.

Handling Customer Addresses for Relational Purchasing

A problem I often run into whenever I build an ordering system (like an ecommerce store) is how best to store addresses for customers and orders.

Given the following conditions for an order placement application:

  • Customers can register and supply separate billing and shipping addresses.
  • Orders need to store customer data as a snapshot from when the order was placed, in case the customer data is changed or removed in the future (orders should maintain historical integrity).

We have several different ways of accomplishing this with a relational database (document and KV stores are a different story).

  1. Store all address information within the customer and order tables themselves. This is perhaps the easiest solution, even though it’s not the most normalized. So you’d have fields like billing_city and shipping_city inside both the customers and the orders tables. The downside is that you’ve created duplicates of the same fields, which uses a little more storage space (usually not an issue) and requires more work to maintain if you ever need to change their schema (again, a pretty rare occurrence for address fields, which are well-known entities). The upside is that it’s very simple to work with from a code perspective.

  2. Store addresses in their own table and associate them to orders and customers via polymorphic composite keys. For this to work you’ll need a composite key of three fields: address_type, addressable_type, addressable_id. So the shipping address for a customer would be something like “Shipping”, “Customer”, 1232, and the billing address for an order could be “Billing”, “Order”, 2873, etc. The downside is that it’s a rather fancy association and will add complexity to your ORM code as you override some methods (since no ORM I know of is built to handle this oddball relationship out of the box). The upside is that it’s very normalized, and you can add new address types and new addressable classes on the fly.

  3. Store addresses in their own table, but simplify the association by using many-to-one foreign keys. For this to work we just have a key in the address table for each association. So in this case we have billing_customer_id, shipping_customer_id, billing_order_id, and shipping_order_id. The downside is that it’s not very normalized / DRY, and you won’t be able to add new address types or addressable classes on the fly like you could with the polymorphic associations. The upside is very simple (almost all convention-based) ORM code, since you’re dealing with belongs_to type relationships.

  4. Use an Address class to define your address fields, but serialize it to text fields wherever it’s used. So you’re ditching the relational style just for the addresses. For this to work you’d have two text fields in your orders and customers tables: billing_address and shipping_address. Then you just serialize your address objects to these fields (YAML, XML, JSON, or whatever). The upside is the same simplicity as solution #1, but without all of the redundancy in your schema. The downside is the potential complexity of the code needed to edit and manage the address information and get proper validations to work.
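To make the composite key in solution #2 concrete, here’s a plain-Ruby sketch of the lookup. In Rails this would be a polymorphic belongs_to :addressable plus an extra address_type column; the in-memory hash below is purely illustrative, using the example rows from above.

```ruby
# Each address row is located by the three-part composite key:
# (address_type, addressable_type, addressable_id).
addresses = {
  ["Shipping", "Customer", 1232] => { city: "Toronto" },
  ["Billing",  "Order",    2873] => { city: "Ottawa"  },
}

def address_for(addresses, address_type, addressable_type, addressable_id)
  addresses[[address_type, addressable_type, addressable_id]]
end

address_for(addresses, "Shipping", "Customer", 1232)  # => { city: "Toronto" }
```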

My preferred solution is #4. I think it’s worth the added complexity at the view level when using Rails 3 since it’s not too much extra work (although it could be a little cleaner).
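Solution #4 can be sketched in plain Ruby as a small value object that round-trips through JSON. In Rails you’d wire this up with something like serialize or a custom attribute type; the field names below are illustrative assumptions, not a fixed schema.

```ruby
require "json"

# A sketch of solution #4: an Address value object that serializes to a
# text column (JSON here; YAML or XML work the same way).
class Address
  FIELDS = %i[street city region postal_code country].freeze
  attr_reader(*FIELDS)

  def initialize(attrs = {})
    # Accept both symbol and string keys so JSON.parse output works as-is.
    FIELDS.each { |f| instance_variable_set("@#{f}", attrs[f] || attrs[f.to_s]) }
  end

  # Dump to the string you'd store in e.g. the billing_address text column.
  def to_json(*)
    FIELDS.to_h { |f| [f, public_send(f)] }.to_json
  end

  # Rebuild the value object when reading the column back out.
  def self.from_json(str)
    new(JSON.parse(str))
  end
end

billing  = Address.new(street: "1 Main St", city: "Toronto", country: "CA")
stored   = billing.to_json             # goes into customers.billing_address
restored = Address.from_json(stored)   # comes back out when you load the row
```

The nice property here is that the same class backs both the customer’s current addresses and the order snapshots, so the historical-integrity requirement falls out for free.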

TDD, BDD, and False Assumptions

After watching another Test First presentation related to the Ruby world I figured out what it is that bothers me the most about the TDD Religion.

It seems that every time a TDD evangelist speaks about non-test-driven / traditional development, they paint a completely exaggerated and unrealistic picture of what it means to not use TDD. It usually goes something like this: “You spend a year creating a specification, then another year coding until you’ve built this monolithic application, then you manually go through all of the functionality you built for another year, fixing bugs, etc.” Seriously? I know we’ve all got horror stories, but come on… who in their right mind has ever worked like this, even before all of the test-first buzz back in 2000? This is a fallacy, and even coding in Fortran sounds better than being involved in this fantasy process.

I don’t actually have a problem with TDD as a practice, or BDD as a practice, or even EDD (experiment-driven development) as a practice. What I do have an issue with are the religious zealots who think it solves all of their problems and will criticize anyone who doesn’t share the same beliefs. Really, it doesn’t.

Here’s how Joe the Programmer, who’s never bothered with TDD, actually performs his work on a daily basis. He thinks about the big picture. He breaks it up into small, accessible problems (basic problem solving). He dives right in and starts building out a solution to one of these small problems. Then he manually tests his small solution to make sure it works, which provokes more thought on how it fits into the big picture. Once he’s happy with it, he tackles the next small problem. All the while he’s constantly reevaluating the big picture, identifying new problems, speaking with the client, etc.

Sounds a lot more reasonable than “code for a year” doesn’t it? It almost sounds like it would work really well in most scenarios. It doesn’t help sell the latest tickets to your speech on TDD though, because honestly, how much would TDD actually improve his process?

Please, before you tell everyone how amazing the latest test / behaviour / experiment driven development methodology or tool is, watch this presentation from Rich Hickey on Hammock-Driven Development first and let it sink in. http://clojure.blip.tv/file/4457042/

Better Authentication

Since my last post on authentication and single sign-on I came across an incredibly clever little tool called PasswordMaker.

What makes it so clever is that it changes practically nothing about the normal flow of entering a password and stores nothing locally (so it doesn’t matter if you change browsers or computers). You type the same password for everything, and it instead submits a unique and incredibly strong password for every site. This is done by creating a one-way hash. One-way hashing is also how we encrypt passwords on the backend of websites before storing them in the database. So basically your original password is getting hashed twice for most websites.

How it works:

You install the PasswordMaker extension for your browser of choice. You go to sign up for a new website service (or change your password for an existing one). You type your typical password — let’s say it’s “b@ng3r5”. You should still pick something fairly strong (a mix of characters, numbers, symbols, etc.), but even if you didn’t, you’re much better off than most. When you submit the sign-up form, the PasswordMaker extension creates a hash using the password you’re entering combined with the domain of the website. In other words, it’s creating an encrypted version of your real password. This encrypted password is what’s submitted to the website. It may end up being something like “4#ae2!9ljh2vk*8c$21h7wh%s$lz” for example.

You come back to the site another day and are asked to log in. You type the same typical password, “b@ng3r5” in this case. When you submit the login form, PasswordMaker performs the hashing operation again, using the same password and the same domain. This means it will come up with exactly the same hash as it did when you signed up. The site’s server sees your encrypted password, i.e. “4#ae2!9ljh2vk*8c$21h7wh%s$lz”, which it then feeds into its own authentication process (usually it performs another one-way hash using your password and a random string it generated when you initially signed up, and compares that against the encrypted value it has stored against your account).
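The flow above can be sketched in a few lines of Ruby. PasswordMaker itself supports several configurable hash algorithms and character sets, so the HMAC-SHA256 and Base64 formatting below are my own illustrative assumptions — the point is just that the same master password plus the same domain always derives the same site password.

```ruby
require "openssl"
require "base64"

# Derive a per-site password by one-way hashing the master password
# together with the site's domain (the domain acts as the salt).
def site_password(master_password, domain, length: 20)
  digest = OpenSSL::HMAC.digest("SHA256", master_password, domain)
  # Encode the raw hash bytes into printable characters and trim to length.
  Base64.strict_encode64(digest)[0, length]
end

# The same inputs always yield the same derived password...
site_password("b@ng3r5", "example.com")
# ...while a different domain yields a completely different one.
site_password("b@ng3r5", "othersite.com")
```

Because the derivation is deterministic, nothing needs to be stored anywhere — signup and login both recompute the same site-specific password on the fly.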


Some sites still store passwords in clear-text. You’re way safer if one of these sites is compromised, since your password was already encrypted before it was sent to the site. Using the same password for everything is way safer now than it was without this encryption… it’s probably still a good idea to rotate passwords, but it’s not as big a deal as it was without the pre-encryption. We’re practically faking single sign-on.

I still think there may be some potential problems that you need to keep in mind.

If a clever hacker compromises a site that’s storing passwords in clear-text, they could still potentially crack your password, since it will stick out like a sore thumb among the rest of the clear-text passwords. Said hacker will know that yours is the only one that’s been encrypted, and he may guess that it was encrypted using PasswordMaker. He would then know that your salt (part of the string being used to generate the hash) is the domain of the site, and he can use that information to run dictionary attacks with the domain until he gets the same encrypted result.

Obviously this is pretty unlikely and not worth the effort, since there are so many other passwords requiring no effort at all — but still, using a strong password to begin with will make this practically impossible. The only way I see this happening is if someone is specifically targeting you and the added effort is really worth it… So maybe one chance in a googol?

I highly recommend you check out this tool. It has extensions / plugins for every major browser.