Last month, I started writing about what I know best: software and software development. That column touched a little on the theoretical side of software, and on the main reason software is buggy: it’s created by humans.
At their heart, programs like Outlook, Quicken, Windows and OS X are long sequences of ones and zeros (binary digits, or “bits”) that tell the Intel microprocessor in your computer or laptop what to do. And “what to do” is very basic: add two numbers, compare a number with zero, load or store a number in memory, or jump to some other location in memory. Out of these basic steps (called the “instruction set” of the computer), complex software is built.
Of course, humans aren’t well suited to writing long strings of bits. Some of the earliest programs ever written were translators: they turned a human-readable version of an instruction (like “ADD r0, 1” to add one to whatever number is currently in register 0) into the actual bits that the computer understands. Programming using these symbolic representations of the computer’s instruction set was called programming in “assembly language,” and it was a big step forward in programming productivity. The programs that translated assembly language were called “assemblers.”
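To get a feel for what an assembler does, here’s a toy sketch written in Python (one of the high-level languages I’ll mention in a moment). The two opcodes and the bit layout are invented purely for illustration; no real processor works exactly this way, and real instruction sets are far larger.

    # A toy “assembler” for an imaginary machine. The opcodes and
    # the bit layout are made up for illustration; they don't match
    # any real processor.
    OPCODES = {"ADD": 0b0001, "CMP": 0b0010}  # invented 4-bit opcodes

    def assemble(line):
        """Translate a line like 'ADD r0, 1' into a 16-bit word of bits."""
        op, rest = line.split(None, 1)
        reg, value = rest.replace("r", "").split(",")
        # Pack the opcode, register number, and value into one number.
        return (OPCODES[op] << 12) | (int(reg) << 8) | (int(value) & 0xFF)

    print(format(assemble("ADD r0, 1"), "016b"))  # prints 0001000000000001

The last line shows the payoff: a phrase a human can read goes in, and the string of bits the imaginary machine understands comes out.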
But different computers from different manufacturers had different instruction sets. So, assembly language programs couldn’t easily be moved to newer, faster, cheaper computers. If you manufactured computers, one way to keep your customers happy was to make sure your newer, faster, cheaper computers had “backward-compatible” instruction sets, which let old programs run on new machines. But that wasn’t always possible.
Another approach to making programs portable was to create “higher-level” languages with statements that were independent of a particular machine-level instruction set. The first widespread, high-level language was FORTRAN (FORmula TRANslating system), designed for engineering and scientific calculation. Although FORTRAN is still in use (and has seen great improvements over time), high-level languages have proliferated. Now we have C, Java, Perl, Python, JavaScript, Ruby, Erlang, Swift, Go, COBOL, Ada, C#, BASIC, Pascal and many others. The programs that translate from high-level languages into machine instructions are called “compilers.”
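You can peek at a translation step like this in Python itself. Python doesn’t compile straight to machine instructions the way a C compiler does; it first translates your code into its own machine-independent “bytecode,” which its built-in dis module will happily display:

    import dis

    def add_one(x):
        return x + 1

    # Print the lower-level instructions Python translates add_one into.
    # This bytecode is the same whether you run it on an Intel chip or
    # an ARM chip (though it does change between Python versions); that
    # machine independence is what makes high-level code portable.
    dis.dis(add_one)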
Programmers wrote software (assemblers and compilers) to make the job of writing software easier. Another important class of software those intrepid programmers created to help them was the “debugger.” Debuggers let programmers step through a program one line of code at a time, which matters because, as I said before, programs are written by humans and sometimes don’t behave exactly as their creators expect. Unexpected behaviors, or “bugs,” as they’re called (after a moth that caused an unexpected behavior in an early computer), have many sources. Most stem from human fallibility. For example, a programmer may not understand exactly what a high-level language statement does. Or a programmer may simply forget a necessary step in a computation. Some bugs are more insidious. For example, an assembler or compiler may mistranslate a statement into its corresponding bits, so the unexpected behavior in your program is caused by a bug in one of your tools.
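Debuggers, by the way, are easy to try for yourself. Python ships with one, called pdb, and a one-line change drops you into it (the average function here is just a stand-in):

    def average(numbers):
        total = sum(numbers)
        breakpoint()  # pause here and drop into Python's debugger, pdb
        return total / len(numbers)

    print(average([3, 5, 10]))

At the debugger prompt, typing “p total” prints the value of total, “n” executes the next line, and “c” lets the program continue. Stepping like this works beautifully when a bug shows up every time you run the program.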
Worse, some bugs may only appear under certain circumstances, so that most of the time, the program seems to work properly. That kind of bug, an intermittent bug, is the worst kind. Programmers rely on bugs being repeatable, so stepping through a program with a debugger will eventually reveal the source of the problem. When a bug appears on a seemingly random basis, other techniques are required. Sometimes, debugging your program requires writing more code (and sometimes, your debugging code has bugs). It’s not pretty.
No one writes flawless code, least of all beginners, so one of the big hurdles for people learning to program is mastering the technique of finding and correcting bugs. It takes a certain amount of optimism and perseverance to become a programmer of any sort. The best programmers realize they’re the most likely source of bugs and develop habits that help identify when bugs are present. For example, if your program expects a positive number at a certain point, you might add code (called an “assertion”) to check that the variable holding that number is greater than zero. If the assertion fails, you know you’ve made a mistake somewhere before that point in the code.
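Here’s what that might look like in Python, using a made-up discount calculation as the stand-in:

    def discounted_price(price, percent_off):
        result = price * (1 - percent_off / 100)
        # The result should always be positive; if it isn't, the
        # mistake happened somewhere before this line.
        assert result > 0, f"expected a positive price, got {result}"
        return result

    print(discounted_price(20.00, 25))   # prints 15.0
    print(discounted_price(20.00, 150))  # fails the assertion and stops

The second call halts the program with the message “expected a positive price, got -10.0,” pointing you back toward the mistake (here, nothing checked that the discount made sense) instead of letting a bad number flow silently into the rest of the program.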
Another debugging technique is called “wolf and fence.” Instead of a programmer hunting for a bug in a program, imagine you’re a shepherd who has heard the howl of a wolf (a bug) somewhere in your fenced pasture (your program). When you hear the wolf, you build a fence that confines it to one part of the pasture, then repeat the process on smaller and smaller areas until you can see the wolf and dispatch it. As a programmer, you add fences (usually print statements or stopping points in a debugger) to narrow down where a bug can hide until you can identify the section of code that contains it. While no programmer wants to write buggy code, good programmers are accomplished hunters and enjoy learning new ways to trap the wolves.
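Here’s a sketch of fence-building in Python. The little two-stage pipeline is hypothetical, and I’ve planted the wolf in merge() on purpose:

    def clean(records):
        return [r.strip() for r in records]

    def merge(records):
        return records[:-1]  # the planted wolf: drops the last record

    records = [" alpha ", " beta ", " gamma "]
    step1 = clean(records)
    print("fence 1:", len(step1), "records")  # 3 records: wolf isn't in clean()
    step2 = merge(step1)
    print("fence 2:", len(step2), "records")  # 2 records: the wolf is in merge()

The first fence shows all three records survived clean(), so the wolf must be penned between the two print statements, and one look at merge() finishes the hunt.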
Programming is just a part of the overall process of developing software. Next month, I’ll talk about software design and what makes it so hard.
Author
Michael E. Duffy is a 70-year-old senior software engineer for Electronic Arts. He lives in Sonoma County and has been writing about technology and business for NorthBay biz since 2001.