Atalasoft Community

Why All the Assembly Language

Let's be frank, most of my blog entries have referenced assembly language in some way or another.  Why?  Why do I, in an environment in which I have a compiler and a high-level debugger and a bunch of other slick tools, talk so much about assembly language and how it relates to the code that I'm working with?

Part of it is that assembly was the second language I ever learned (6502 on an Apple II), and once I started really getting it, I approached it with a near-drunken vigor, giddy with its power compared to my other option: Applesoft BASIC.  Writing thousands and thousands of lines of it changed the way that I thought about programming problems.  I developed what we would now call design patterns and I used them.

I recall a lecture in my college assembly language class in which the professor spoke about John von Neumann and how he would routinely write self-modifying code.  He spoke of this with a high degree of awe, whereas I looked at it as ordinary.  On any given day that I was writing code targeting the Apple II, I would write self-modifying code.  It was a design pattern I had developed that allowed me to shave precious machine cycles off inner loops: where I could get away with direct addressing instead of indirect addressing, I would patch the memory-access instruction with the address I needed before execution reached it.
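That pattern can be sketched in 6502 assembly.  This is an illustrative reconstruction, not code from back then; the labels src_lo, src_hi, and dest are hypothetical.  The trick is to store the source address directly into the operand bytes of an absolute-indexed LDA, which costs 4 cycles per load, rather than paying for indirect indexed addressing at 5 or more:

```
        lda src_lo      ; low byte of the source address
        sta loop+1      ; patch the operand of the LDA below (low byte)
        lda src_hi      ; high byte of the source address
        sta loop+2      ; patch the operand (high byte)
        ldy #0
loop:   lda $ffff,y     ; $ffff is overwritten above; absolute,Y = 4 cycles
        sta dest,y      ; versus lda (zp),y at 5+ cycles per load
        iny
        bne loop        ; copy 256 bytes
```

One cycle per load sounds trivial, but across a tight 256-iteration loop on a 1 MHz machine it adds up fast.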

If you write code like this for years, it changes you.  Mind you, I don't write code like this anymore.  I haven't had the need to, because CPUs are typically fast and flexible enough for the problems I need to solve without my having to bend over backwards.  In fact, I've only written assembly language once in the past five years or so, for a work project that lets some high-level code identify certain processor features so as to best take advantage of them, and even then it wasn't really more than a few dozen lines.

No, I like the assembly point of view because it is helpful in building higher-level models that map nicely onto hardware.  This is nice because if the model is solid and maps nicely onto hardware, it will either (a) run well right out of the chute or (b) be trivially optimized into something that will run very well if need be.

I honestly don't write code with efficiency in mind all the time.  I usually write for readability, extensibility, usability etc., but if efficiency comes along for the ride with the others, terrific.  If not, I at least have a direct path there if I have to.

I'd like to live in a world where every line of code is crafted to the bare metal, but life is too short and project time is too precious.  Still, if that's your criterion, you need to have a path to better performance if it becomes an issue.  That type of coding is really reserved for target platforms that have tiny address spaces, and even there the ground is shifting.  When I look at PDAs that are comparable to high-end desktop machines from the late 1980s, I know that hand-crafted code will be done less and less.

The final reason why I keep assembly in my trick bag is debugging.  Perhaps because I'm really good at it, I tend to get assigned the really hard bugs: the ones that only happen once in a while, the ones with no debugging symbols, and so on.  If you can't read assembly language, you're dead in the water, or you're grasping at straws instead of debugging methodically.

If I were asked, "So what should I learn?" I'd say learn the following:
  1. An 8-bit processor, if only to find out how blessed you are for not having to work under such spartan conditions, and if you ever have to code on an embedded system, you'll already be ahead.
  2. A modernish 32-bit or 64-bit processor.  Personally, I love the 68K instruction set.  It's pretty dang nice in terms of implementing high level languages and it's very generous as far as registers are concerned.  I abhor x86 (but I can still read it and write it with a manual by my side).  Whenever I look at x86 code, it astounds me that it's possible to make it run as fast as it does.  Tip of the hat to Intel and AMD.

Published Thursday, April 06, 2006 6:37 PM by Steve Hawley

