What The Code of Hammurabi Can Teach Us About Hardware and Software Liability

The Information Security community has been debating software liability for decades. Some say software is too complex to make guarantees about, while others say that unless we hold its creators and builders responsible, nothing will improve.

Nassim Taleb’s Skin in the Game is what got me thinking about this topic.

As it turns out, the most ancient law that we know of, from roughly 4,000 years ago, has a lot to say about this. The Code of Hammurabi, dated to around 1754 BC, is our earliest known set of official laws, and it comes from the Babylonian civilization in Mesopotamia.

What’s remarkable about it is not just that it tells you not to steal or kill, but that it focuses on things like making charges against people without proof, and building and selling things without standing behind them.

It basically says that you need to be an upstanding citizen in all your affairs. And if you are lazy, selfish, or otherwise do things that cause other people harm, then the law will punish you.

Here are some standout examples related to creating things that other people use.

228: If a builder build a house for some one and complete it, he shall give him a fee of two shekels in money for each sar of surface.

This is how much you can charge for software.

229: If a builder build a house for some one, and does not construct it properly, and the house which he built fall in and kill its owner, then that builder shall be put to death.

I can think of a few software products that fell in and killed their owners.

232: If it ruin goods, he shall make compensation for all that has been ruined, and inasmuch as he did not construct properly this house which he built and it fell, he shall re-erect the house from his own means.

Yep, you’re on clean-up duty. And you’re paying for the damages.

233: If a builder build a house for some one, even though he has not yet completed it; if then the walls seem toppling, the builder must make the walls solid from his own means.

Something something beta software.

235: If a shipbuilder build a boat for some one, and do not make it tight, if during that same year that boat is sent away and suffers injury, the shipbuilder shall take the boat apart and put it together tight at his own expense. The tight boat he shall give to the boat owner.

If your software starts sinking before a year has elapsed, you need to tighten up that codeboat for free.

That’s pretty potent stuff for 4,000 years ago.

Nassim Taleb points out how deeply these concepts have penetrated our societies over time, giving examples like:

  • Caesars couldn’t just rule from the throne; they fought on the front lines in order to earn that respect back home
  • Roman commanders would be held accountable for losses
  • Bridge builders had to sleep under their bridges for a period of time, with their families, to show that they stood behind (under) their work

I do think there is a fundamental difference between a building and a piece of software. The former has quite a small number of variables in it, and it’s much easier to blame the builder when it goes wrong.

With software, it’s so new and ridiculously complex that everyone (including the builder) is surprised it works at all.

But I think there’s still something to this. Especially as we start talking about building an Internet of Things upon which we’ll stack our new society.

At some point, the software becomes a lot more like a building. Not because it’s easier to build, but because it can actually fall down and kill people.

The central concept here is what Taleb calls Skin in the Game, which seems to be largely what the Code of Hammurabi was about. You couldn’t just claim things, or take things, or make things.

To the extent that your actions affected others, there were consequences.

This ties into my Slow-motion IoT Train Crash analogy. I’m not saying Hammurabi’s code maps perfectly to building software in 2018, but it will likely become a lot more similar once software failures become matters of public safety.