Dasherize HoneySQL Columns

Well…Korma didn’t work out… There are lots of things to like about it, but I honestly can’t deal with the fact that when you join on a belongs-to relationship, it simply merges all of the columns into a single map (:id-2?).

Anyway, I’ve moved on to HoneySQL for now, which still features the_kind_of_attribute_names only a database could love. Let’s kick that to the curb.
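The fix is simple enough to sketch here. HoneySQL only generates SQL, so the renaming really happens wherever you execute the query; one minimal approach (assuming clojure.java.jdbc for execution, with db-spec standing in for your connection) is to hand its :identifiers option a dasherizing function:

(require '[clojure.java.jdbc :as jdbc]
         '[clojure.string :as str]
         '[honeysql.core :as sql])

(defn dasherize [s]
  ;; "user_id" -> "user-id"
  (str/replace s \_ \-))

;; :identifiers is applied to every column name in the result set
;; (options-map style of newer clojure.java.jdbc versions).
(jdbc/query db-spec
            (sql/format {:select [:*] :from [:locations]})
            {:identifiers dasherize})
; => ({:user-id 1, :latitude 30.02, :longitude -116.992})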

Dasherize Korma Columns

A quick tip for the Korma SQL library for Clojure:

By default, Korma will return column names with underscores (e.g. :this_kind, unmodified from the database convention), while Clojurists are likely more comfortable with :this-kind of dash-separated keyword. (Some people call this “kebab-case” vs. “snake-case”). To wit:

(defentity locations
  (belongs-to user))

(select locations)
; => ({:user_id 1, :latitude 30.02, :longitude -116.992})

We can convince Korma to automatically translate these names using Korma’s own prepare and transform functions, which are given the chance to mutate attribute names and values on the way to / from the database, as Conan Cook points out in his Tasty Korma Recipes post. But it gets a bit cumbersome to inline this code in every entity definition.
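For concreteness, here’s roughly what that inline version looks like; the key-renaming helpers are my own sketch, not something Korma provides:

(require '[clojure.string :as str]
         '[korma.core :refer :all])

(defn- map-keys [f m]
  (into {} (for [[k v] m] [(keyword (f (name k))) v])))

(defn dasherize-keys [row]    ; {:user_id 1} -> {:user-id 1}
  (map-keys #(str/replace % \_ \-) row))

(defn underscore-keys [row]   ; {:user-id 1} -> {:user_id 1}
  (map-keys #(str/replace % \- \_) row))

(defentity user)

(defentity locations
  (belongs-to user)
  (prepare underscore-keys)   ; runs on values headed into the database
  (transform dasherize-keys)) ; runs on rows coming back out

(select locations)
; => ({:user-id 1, :latitude 30.02, :longitude -116.992})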

Switching to MELPA Stable: Why, How, What I Learned

Background

Writing my first Emacs package a couple of months ago left me more cognizant of how Emacs’ packaging system is put together and raised questions about how I use its capabilities. I had been installing all of my packages from MELPA, but now, as a fancy-schmancy package author, I’d become intensely aware that MELPA builds its packages based on the latest commit in a project’s repository¹. Suddenly I’d become paranoid about exactly what I pushed to the master branch of my project and worried about leaving things in a broken state.

Generally speaking, I keep the master branch on my projects in a functional state, and—yes—I could adopt a development methodology whereby work is always done on a development or feature branch and QA’d before being merged into master. But even if I have the inclination and discipline to manage my projects this way, all of my other Emacs packages are getting built from whatever happened to be pushed to master in their own project when the MELPA bot decided to make the rounds. I’ve run my fair share of beta software, but I don’t need every commit as it happens (cutting vs. bleeding edge).

As it turns out, there’s a new kid on the block—MELPA Stable—and she’s come to solve this exact problem.
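The rest of the post covers the mechanics, but the heart of the switch is just pointing package.el at the MELPA Stable archive. A minimal sketch (archive URL per stable.melpa.org; verify it before relying on it):

;; Register MELPA Stable as a package archive (alongside the default GNU ELPA).
(require 'package)
(add-to-list 'package-archives
             '("melpa-stable" . "https://stable.melpa.org/packages/") t)
(package-initialize)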

Talking Atheism on the Nerd Absurd Podcast

The lovely folks from the Nerd Absurd podcast had me on their show a couple months back¹, along with my coworker Aaron, to discuss atheism and what it’s like growing up as an atheist in America. I founded—and for several years ran—The Atheists and Freethinkers Society at UT Dallas, an experience which left me with no lack of stories (both entertaining and infuriating) about growing up as part of America’s most-hated minority.

This was my first podcast appearance, so thanks a ton to Nick and Virginia for taking a chance on me! We had a lot of fun doing it and ended up running long enough to make a double episode.

Give a listen to Nerd Absurd episodes 116 and 117, Atheism Part 1 and Part 2! And if that strikes your fancy, make sure to subscribe to their show!


  1. Wish I’d remembered to post this ages ago but no harm done…

The Parable of the Starchart

In my sophomore year of college I lived on campus with two roommates. Like so many other college roommates, we had a bit of a problem keeping the kitchen clean.

We weren’t especially messy people on our own, but there’s something unfortunate that happens in shared spaces where responsibility is divided. Tasks without a single, clear owner tend to get ignored while everyone waits to see if the others will tackle them. Questions about fairness arise, and perceived slights set off chain reactions of retribution.

At some point I decided we needed a way to track our individual contributions. I wrote our names across the top of a piece of paper and picked up some gold star stickers (childhood nostalgia, anyone?). The starchart was born.

Private Ruby Gem Versioning

TL;DR: Don’t use version specifiers when referencing gems from git in your Gemfile. Use tags or refs instead.

Most of the time when declaring our Ruby projects’ dependencies via a Gemfile, we pull our gems from a source like RubyGems.org:

source 'https://rubygems.org'
gem 'rails', '4.1.0'

In this case you nearly always want to specify a version number for each gem to keep bundle update sane and safe.

But when you have your own branch of a public gem, or an internal project that can’t live on the public web, the typical solution is to pull it from git via a :git or :github specification:

# my (fictitious) fork of rails
gem 'rails', '4.1.0', git: 'https://github.com/camdez/rails.git'

Now, one would probably think that both of these Gemfiles precisely reference a single release of the rails gem (for their respective sources), but there turns out to be a nasty catch:

The RubyGems version always refers to the version of rails pushed as 4.1.0, but the GitHub version refers to the latest commit containing a Gemspec declaring version 4.1.0.

If you’re like most people, happily developing on master, then once you’ve released version 4.1.0 of your gem, every commit you push afterwards (changes which will eventually become 4.1.1, 4.2.0, or even 5.0.0; you’ll decide later) gets pulled into any project whose devs have referenced version 4.1.0 of your gem via git, every time they bundle.

I nearly got burned by this because I was about to (unknowingly) push a pre-release version of a gem into production, but I happened to notice a difference in gem SHAs in the Gemfile.lock for two branches with the same Gemfile, making it clear that I didn’t have an immutable reference to a release.

So, what can you do? Well, first of all, stop using version specifications on your gems pulled via git. They’re misleading.

Next, you could go full paranoia and start referencing particular git commits via the :ref specifier. Another approach would be to change your development workflow and only merge releases into master¹, but I find that to be a lot of ceremony for simple projects, and it could still support the bad habit of using unsafe version specifiers. Personally, I think the best approach is to tag all of your releases and reference them via their tags:

# my (fictitious) fork of rails
gem 'rails', git: 'https://github.com/camdez/rails.git', tag: 'v4.1.0'

This still gives you the ability to modify what commit is considered to be the release², but affords dramatically more stability than the moving target that a version specification provides.
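(For completeness, the full-paranoia :ref variant mentioned above looks nearly identical; the SHA below is a made-up placeholder.)

# my (fictitious) fork of rails, pinned to an exact commit
gem 'rails', git: 'https://github.com/camdez/rails.git', ref: '4aded43'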


  1. Probably best done with git merge --no-ff so that there are never any non-release-ready commits on the branch.

  2. Obviously this should be used sparingly (if ever), like git push --force.

Automatically Installing Your Emacs Packages

The interactive list-packages command in modern Emacsen is handy for finding and trying out new packages, but once we’ve found packages we want to use, how can we have them automatically installed on all machines where we use Emacs? There’s a decent Stack Overflow question on the topic, but I want to dig into the various answers a bit and provide a slightly cleaner (IMHO) code snippet.

First let’s define that list of packages we want installed by adding a defvar form to our .emacs:

(defvar my/packages
  '(abc-mode
    ;; ⋮
    zygospore))

Now, the obvious part of this problem is using package-installed-p to check if each named package is installed and then installing the missing ones. The less obvious part is that we may need to call package-refresh-contents to get the package repository data (especially on the first load) or else we’ll get errors. High-level, there are four ways I know of to approach this problem:

  1. Always call package-refresh-contents.
  2. Check if package-archive-contents is non-nil.
  3. Test if package-user-dir exists.
  4. Refresh if any packages are missing.

Number one works but is extremely inefficient—I don’t want to wait for a package repo fetch every time I start Emacs.

Two and three have a fatal flaw: just because we have some package data doesn’t mean that we have the latest data. So it’s entirely possible that we don’t know about a recent package that the user wants to install—errors again.

The fourth method is the way to go. Surprising no one¹, @bbatsov shows up with the right answer. But I do think this code could be a hair cleaner, so here’s my take:

(require 'cl-lib)

(defun my/install-packages ()
  "Ensure the packages I use are installed. See `my/packages'."
  (interactive)
  (let ((missing-packages (cl-remove-if #'package-installed-p my/packages)))
    (when missing-packages
      (message "Installing %d missing package(s)" (length missing-packages))
      (package-refresh-contents)
      (mapc #'package-install missing-packages))))

(my/install-packages)

Closing notes:

  • The function² exists so that you can invoke it manually (M-x my/install-packages RET) if you’ve changed my/packages at runtime.
  • Keep in mind that re-evaluating my/packages after changing it will not do anything because that’s how defvar works. Temporarily change defvar to setq and you’ll be good to go (see the snippet after this list).
  • cl-lib is required for cl-remove-if. How Elisp has made it this far without a filter function, I don’t know.
  • I’m not entirely sure if nil punning is idiomatic elisp or if there’s a more appropriate way to check for empty lists. Anyone know?
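For example, if you decide mid-session that you want another package (magit is just a stand-in here), something like this will pick it up:

;; defvar won't overwrite an existing binding, so re-bind with setq,
;; then run the installer again.
(setq my/packages
      '(abc-mode
        magit      ; the newly added package
        ;; ⋮
        zygospore))
(my/install-packages)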

  1. Bozhidar Batsov is killing it in the Emacs community right now and you should probably be following him.

  2. Rather than including what is currently the function body at the top level of the .emacs file.

Goal Refactoring Insights

In IOI#1 I mentioned “goal factoring”. In a nutshell, the idea is to apply a consistent, rational process to the goals you’re thinking of setting for yourself to ensure they’re appropriately sized, likely to succeed, and the best path to the underlying thing that you want to accomplish.

I’m currently working on my own goal definition / factoring process, but I wanted to offer a tip of the hat towards these two templates which I’ve repeatedly pored over:

The two key ideas I borrowed from those:

  1. After defining a goal, ask yourself, “How surprised would I be if I failed to achieve this?” If you’d be shocked, you’re done. If not, refine your plan by asking yourself how you’re most likely to fail, and then reworking your goal to prune that particular vector of failure. You should repeat this process until you’re confident that you’ll achieve your goal.

    This doesn’t mean that you have to write less ambitious goals, only that when you have ambitious goals you have to lay out the process by which you’ll get there.

  2. “[F]or each goal, brainstorm some other ways of achieving it. Try to find a better plan than the one you currently have.”

    I really like this idea and I think I’d prefer to amend it to say “write down three different ways of achieving the same goal” to force our hand to actually run through this exercise and not dismiss it assuming we’ve already chosen the best path. Playing this antagonistic role to yourself (i.e. trying to beat your own idea) is a great way of breaking out of optimistic biases, and helps us to avoid false dichotomies (well, n-chotomies).

If this topic interests you, stay tuned. I’m working on coalescing everything I know about goal setting into one handy dandy template. I think it’ll be good. ;)


  1. I don’t know exactly where the template came from (or even how I found it), but I know it—like the other template—connects back to CFAR, so there are probably some shared influences.

Résistez à La Résistance

I awoke this morning to a text message from an old friend with a rather sharp tongue:

Lol. You’ve gotta stop it with these Facebook posts.

I knew exactly what it meant. It stung a bit because it was precisely what I worry about: I have a tenuous relationship with Facebook because there’s a disconnect between the people I know & consider (Internet) friends and the interests I have. I vacillate between thinking I should post content I’m interested in (and let the audience self-select) and thinking that I should tailor my posts to my audience.

I’m currently in the first mode. I share what I like, and hopefully I’m opening the door to new connections—maybe I’m entirely wrong in my assumptions of what people like! At least for me, it’s better this way. But I can’t pretend that I’m not a bit self-conscious about what I put out. I fixate on how it must be uninteresting to the majority of people, how boring (not to mention uncool) tech can seem from the outside, and how my interest in the process of self-improvement can seem awfully self-centered.

—and suddenly my alarm went off. 7AM. It slowly dawned on me that there never was a text message from my sharp-tongued friend.

It seems so incredibly devious that my mind would latch onto a pre-existing concern, then take a jab at me in the guise of an individual whose words it knew would sting. And I don’t tend to be an undermining person. The unconscious brain is quite a thing…

I couldn’t help but think of Steven Pressfield’s book The War of Art¹, which discusses and personifies the (internal) forces which hold us back from doing the things we want to do with our lives:

RESISTANCE IS INTERNAL

Resistance seems to come from outside ourselves. We locate it in spouses, jobs, bosses, kids. “Peripheral opponents,” as Pat Riley used to say when he coached the Los Angeles Lakers.

Resistance is not a peripheral opponent. Resistance arises from within. It is self-generated and self-perpetuated. Resistance is the enemy within.

RESISTANCE IS INSIDIOUS

Resistance will tell you anything to keep you from doing your work. It will perjure, fabricate, falsify; seduce, bully, cajole. Resistance is protean. It will assume any form, if that’s what it takes to deceive you. It will reason with you like a lawyer and then jam a nine-millimeter in your face like a stickup man. Resistance has no conscience. It will pledge anything to get a deal, then double-cross you as soon as your back is turned. If you take Resistance at its word, you deserve everything you get. Resistance is always lying and always full of shit.


  1. I wish I could recommend this book wholeheartedly but I can’t. The beginning sections are great, but near the end of the book it really goes off the rails with odd ideas regarding the artist’s connection to the gods, or some such garbage. I’d probably tear off the back half before offering it to a friend. I can, however, wholeheartedly recommend the book that led me to Pressfield in the first place: David Mack’s amazing hand-painted / collage / decoupage / strange loop comic masterpiece, Kabuki: The Alchemy, which holds a special place in my collection of books.

Items of Interest #2

  • Momo Loves You – need your daily dose of adorableness? Check out my puppy daughter’s Tumblr.
  • A Better Way to Say Sorry – a great procedure for how to apologize and have it actually mean something.
  • Lessig interviews Jack Abramoff – direct questions and answers with America’s most infamous political lobbyist who “pleaded guilty in 2006 to charges of fraud, tax evasion, and conspiracy to bribe public officials”. Every American should listen to this interview to understand the reality of American politics, straight from the horse’s mouth.
  • Rubocop – a fantastic static code analyzer for Ruby which will help you catch common errors and style guide violations. If you write Ruby, you should be using this. If you use Emacs, run it via flycheck-mode.
  • A Radical View – a neat graphical layout of Chinese characters by radical.
  • 12 Rules for Learning Foreign Languages in Record Time – a round-up of some of the most important language learning techniques by someone who really knows what he’s talking about—polyglot Benny Lewis.
  • Beyond the Semantic Web – lecture from Doug Lenat who has been working tirelessly for the last 30 years on Cyc, an attempt to build a complete ontology of everyday knowledge. Incredibly ambitious.