The Cake Is a Lie

Jason C. McDonald (codemouse92)

Posted on January 31, 2017

Originally posted on indeliblebluepen.com.

Allow me to turn your world on its ear for a moment – programming is not a science. Sure, we call it computer science, but there is nothing inherently scientific about the process.

“We are relying on mechanical devices!” you may insist. “Computer science is built on physics.” On the one hand, this is true, in the same way that painting is based on the science of light and color. Dyes and materials are developed using science, but the act of painting is itself an art. There is no inherent “right way” to paint. The sweeping panoramas of Monet cannot be compared to the portraits and scenes of Leonardo da Vinci, and neither can be compared to the abstract works of Picasso or the impressionistic works of Van Gogh. Proponents and schools of each style exist, lending volume and popularity to each but doing little to disestablish the others.

All of computer science is built on binary mathematics, but even that falls apart when we explore other bases, such as the brave new frontier of base-3 quantum computing.

There are no hard-and-fast laws of programming that we can fall back on. Unlike Newtonian physics, which, while incomplete, still holds firmly true at everyday scales, every assumption in programming can break, or inexplicably cease to work, when probed too deeply.

The New Hacker's Dictionary describes the enigmatic “schroedinbug”, a programming construct which works as expected until someone observes that it should never have worked at all, whereupon it promptly stops working. Similarly, the “heisenbug” changes its behavior every time it is observed. While these sound like they were taken from the works of Lewis Carroll, I can affirm from my years in this field that these, and many other strange things, are very real. Programming offers a Wonderland all its own, with seeming impossibilities grinning out at us like Cheshire Cats at every turn. Spend any amount of time among hackers, and you will learn that we are all indeed mad.
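To make the heisenbug concrete, here is a small sketch of my own (not an example from the Dictionary): two threads racing to update a shared counter. Whether any updates are lost depends entirely on timing, so the classic act of observing the bug, say, adding a print() inside the loop or stepping through with a debugger, can change the scheduling just enough to make it vanish.

```python
import threading

# A classic breeding ground for heisenbugs: a data race. Two threads
# increment a shared counter without a lock. "counter += 1" is really
# a load-add-store sequence, so updates can be lost depending on how
# the threads happen to interleave.
counter = 0

def increment(times):
    global counter
    for _ in range(times):
        counter += 1  # not atomic: load, add, store

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 400000; what you actually get depends on your interpreter
# version and the whims of the thread scheduler.
print(f"Expected 400000, got {counter}")
```

Fittingly, whether this snippet misbehaves at all depends on the Python version and platform you run it on; the bug observes you right back.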

This is not to claim that programming is illogical. Like Wonderland, there is a stable and reliable form of logic: everything spoken comes true, in an irritatingly literal fashion. Computers will not bend to your will until you learn to drive all ambiguity from your thought processes, and even once you have, these mechanical monsters will defy your dearly-held logical assumptions.
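If you would like to watch a mechanical monster defy a dearly-held assumption right now, floating-point arithmetic will happily oblige (a trivial illustration of my own):

```python
# The machine is irritatingly literal: "0.1" means the nearest
# representable binary fraction, not the decimal value we had in mind.
print(0.1 + 0.2 == 0.3)  # False
print(0.1 + 0.2)         # 0.30000000000000004

# Even associativity, a "law" of grade-school arithmetic, fails:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False
```

Every step here is perfectly logical and perfectly literal; it is only our assumptions that break.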

Yet, there remains an instability in programming. As Gerald Weinberg once noted...

"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization."

This is oft cited by hackers as ‘Weinberg's Second Law’.

The instability is of our own making, and I humbly submit that it came from an incomplete acknowledgement, if not an outright denial, of the inherent artistic character of programming. For decades, we have built our digital solutions, and then foolishly stamped them with the same authority as Newton's Laws. We do things this way because this is the way we do things.

Computers don't handle infinite loops well, and that would seem to include the circular logic of their human programmers, perhaps on some inexplicable point of principle.

Because we've declared our ideas as law at a very early stage, everything else we built on top of those ideas is now unstable. When something breaks down towards the binary level, the entire structure comes crumbling down around our ears.

Perhaps this is why we have all of our computer science students build tiny programs for their homework. From experience, I can say that our methods do NOT scale up! When we're making tiny LEGO models, we can deny that the table we're building on is held together by Scotch tape and rubber bands. As soon as we try to make anything significant, that table falls apart.

Alan Kay, who coined the term “object-oriented programming”, made a similar observation.

"When you blow something up [by] a factor of a hundred, it gets by a factor of hundred weaker in its ability, and in fact, what will happen to this dog house; it would just collapse into a pile of rubble.”

He goes on to pinpoint our field's habit of denial.

“The most popular [response] is to say, ‘Well, that was what we were trying to do all along.' Put more garbage on it, plaster it over with limestone, and say, ‘Yes, we were really trying to do pyramids, not Gothic cathedrals.'”

And yet, we keep building new layers on top of this. We keep trying to make this whole mess easier for humans to understand and use, through a process called “abstraction”. We add layers that bring the process of coding closer to English and, correspondingly, further away from binary. In his article, The Law of Leaky Abstractions, Joel Spolsky pointed out:

“Abstractions fail. Sometimes a little, sometimes a lot. There's leakage. Things go wrong. It happens all over the place when you have [them].”
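As a small illustration of that leakage (my own, not one of Spolsky's): Python's list presents itself as a generic sequence, but it is backed by a contiguous array, and the implementation pokes through the moment you time an insert at the front against an append at the back.

```python
import timeit

# A list abstracts away its storage, until you measure it. Inserting
# at the front shifts every existing element (O(n)); appending at the
# back is amortized O(1). The underlying array leaks through the clock.
front = timeit.timeit("lst.insert(0, None)", setup="lst = []", number=50_000)
back = timeit.timeit("lst.append(None)", setup="lst = []", number=50_000)

print(f"insert at front: {front:.3f}s")  # typically far slower...
print(f"append at back:  {back:.3f}s")   # ...than appending
```

The point is not that lists are badly designed; it is that the simple “sequence” story only holds until the abstraction springs a leak.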

All of this is not to say that we should throw everything out the window and start from scratch. We got a lot of things right; otherwise, the machine in front of you right now would not be functional at all. What we need to learn instead is to take nothing for granted. The only laws we have in programming are those of our own invention. Innovation begins when we start to ask the dangerous question, “Why do we do it this way?”, and then demand a better answer than “Because we've always done it this way.”

Hand-in-hand with this goes the problem of Best Practice, a mythical beast on par with any law of computer science. I do not wish to suggest that standards are inherently bad; we need something to keep this smoldering pile of rubble organized, so we can continue to clean it up and not make it worse. However, there is a vast difference between standards and Best Practice.

For one thing, standards change from one group to the next, and for good reason. Taking the earlier analogy of painting, if you are a student of the school of realism, there are certain techniques that you must follow to achieve the style you aim for. No one else can study, learn from, or improve upon your work unless everyone is working off the same style.

However, this standard becomes completely irrelevant when you decide instead to join the school of impressionism. You adopt a completely different set of standards to fit your new goal. Similarly, if you are pioneering a new style of painting, you are now defining the standards entirely yourself.

It is also worth noting that, even within a given set of standards, deviation is not only common but necessary for defining one's own style. Comparing any of the greats within a particular school, one can find considerable differences. Bear in mind the old adage: “one must know the rules in order to know when to break them.”

The same holds true in programming, which is why Best Practice is such a divisive and dangerous concept. It suggests that there is a unified standard to which all programmers must hold true, and that deviation from that standard is an unspeakable crime. This is the fastest way to stop innovation, because it asserts that we already know all of the answers. Indeed, some proponents of this view have gone so far as to claim that we've discovered all there is to discover in software, and that the only thing left to do is to put a few extra layers of polish on our trophy cases and call it good.

Yet, even in our comfortable world of binary-based programming, we continue to find new things to do with this machine. If we were to be so bold as to tear down every assumption we've made in search of where we went wrong, we would then be tasked with rebuilding the entire world of computing in a more stable fashion. With new frontiers like quantum computing on the horizon, we will undoubtedly continue to discover new wonders, new possibilities, and new irritants in this inherently human-built digital universe.

This renewed journey of discovery starts with the simple, yet surprisingly challenging, first step of acknowledging that everything we've assumed about programming is at best unstable, and at worst completely wrong. This is not a field of science that we can someday know everything about, though that is the bait dangled in front of many programmers to keep them silently following our long-held assumptions without question. In short, as fans of the massively popular video game Portal would assert: the cake is a lie.

I, for one, am no longer seeking the cake. There are new and better methods yet to discover, and creative ways to improve the ones we have. That is where our time and efforts are best spent.
