Memory Allocation in C

By Rebecca

When you begin programming in C, someone will probably take you aside and attempt to explain memory allocation. He might wave his arms in the air or draw rectangles in blue ink on a whiteboard. Or you may be lucky enough to know a true bright-eyed, ungroomed geek, who'll share the presentation with everyone else on the bus all the way to your karate class in Ballard.

Your tutor will not be a certified programming instructor. However, after his lifetime total of 6 years of computer use, as well as the 2 semesters of TA experience he accumulated before buying a guitar and amp and dropping out of biology grad school, his head's filled with great aspirations and a growing Venn diagram.

He'll begin with an explanation encompassing data, variables, operators, functions, memory, logic, induction, compilers, elegance, debuggers, and if you're still hearing a word he's saying at this point, it's because you're trying to dodge the spit he sprays every time he shouts "pointer" or "sprintf".

Step back from the scene for a minute. Place him on a windy cliff, give him a better haircut, and imagine him shouting Heathcliff's lines from Wuthering Heights. You find him attractive, don't you? For shame.

At some point, you'll be presented with the "stack" and the "heap"¹, the two main types of memory storage in C. They're almost the same, many people don't remember which is which, and given that neither word is in the index of Kernighan & Ritchie² nor of most other C books (actually, many don't even have a listing for "memory allocation"), you'll probably think the guy made them up. But actually, they're a critical part of the geek lore which has blossomed into a bajillion-dollar global industry. The very same one to which you're now selling out.
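
If the whiteboard never materializes, here's a minimal sketch of the two in action (the variable names are mine, not Kernighan's or Ritchie's): a local variable lives on the stack and disappears when its function returns, while whatever you get from malloc lives on the heap until you explicitly hand it back.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int on_the_stack = 42;      /* stack storage: created and destroyed automatically */

        int *on_the_heap = malloc(sizeof *on_the_heap);  /* heap storage: yours until freed */
        if (on_the_heap == NULL)
            return 1;               /* malloc can refuse; always check */
        *on_the_heap = 42;

        printf("stack: %d, heap: %d\n", on_the_stack, *on_the_heap);

        free(on_the_heap);          /* giving the heap memory back is your job, not the compiler's */
        return 0;
    }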

Most people have a primal fear of computers, thus the success of Wargames, Terminator, and The Matrix. The more tightly our lives (individually and culturally) become entwined with them, and the greater our dependence on technology we don't understand, the deeper and more sinister we find the labyrinth. Wouldn't your grandparents worry to know how little you, a real-live programmer, understand about the whole thing, not to mention the people who pay you far too much to glue you to a keyboard? What I have to keep reminding myself is that the computer, the software, and the underlying logic were and still are being designed entirely by almost normal people. The only difference between them and you is that they don't date, giving them much more time to tinker, as it were.

Let's think about computer memory in general. As you probably know from playing around with a computer, every file and program has a specific size, generally noted in kilobytes. If you've tried downloading much from the internet, you've probably come to a point where you don't have enough room left in your computer's memory to download or install something. To fix this, you need to delete things (freeing the memory). But rather than wait until the last minute, when you've downloaded ¾ of the plugin you need to view all the naughty attachments your officemate Gretchen sends you, it's more efficient and easier in the long run to delete my essay as well as Gretchen's attachments as soon as you're done with them, or the minute you realize how humiliated you will be when someone finds them on your computer. We can use the same theory when it comes to memory within a C program.
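
In a C program, the asking is done with malloc and the deleting with free: request exactly as much memory as you need, and give it back the moment you're done with it. A small, hypothetical sketch (Gretchen and the helper function are my inventions):

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical helper: make a private copy of an attachment, use it,
       and free it immediately rather than letting it pile up. */
    void handle_attachment(const char *message) {
        char *copy = malloc(strlen(message) + 1);   /* ask for just enough room */
        if (copy == NULL)
            return;                                 /* no room left; nothing to clean up */
        strcpy(copy, message);

        /* ... look at copy, blush, etc. ... */

        free(copy);                                 /* done with it, so free it now */
    }

    int main(void) {
        handle_attachment("see attached ;-)");
        return 0;
    }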

The terms "memory", "stack", and "heap", and the way they're discussed in most textbooks, lead you to think of them as simple storage space. However, the way that you'll use memory as a programmer is much more like time or money. Now in one month, you could spend all your time and money with your boyfriend. You could spend it on tight clothes and let 31 different guys take you out. (Oh, you bad thing.) Or you could buy a bigger hard drive and sit at home. You can do all three in three successive months, at least if you're young, or your "hardware is not outdated". But you can't do all three at once. You don't have the time and money to allocate.

One difference between the stack and the heap is that, on the whiteboard, the stack is built with nice rectangles fitted snugly together, while the heap is just a mess to look at. But don't you reach for that eraser yet – let the geek add the bold global variables to the middle of the diagram. You'll like global variables, but don't use too many, or you'll start receiving condescending emails, carbon-copied to the entire software department, from people with titles like Systems Architect, Senior Software Development Engineer, or Consultant.

A global variable is like your mother. She's there from the beginning. She'll always be Mom, even if she grows, shrinks, or otherwise changes. And while you don't call her often enough, sooner or later you'll need her, and no one else will do.
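
In code, the analogy might go something like this (a sketch, names of my own invention): the global is defined once at file scope, lives from the program's first breath to its last, and every function can reach her.

    #include <stdio.h>

    /* A global variable: defined at file scope, alive for the whole program,
       visible to every function that cares to look. */
    int mom = 1;

    void grow(void)   { mom = mom + 1; }   /* she may change... */
    void shrink(void) { mom = mom - 1; }

    int main(void) {
        grow();
        shrink();
        printf("still mom: %d\n", mom);    /* ...but she's always there when you call */
        return 0;
    }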


¹ What, were Kernighan and Ritchie trash collectors? Why not "stuff" and "things"? Or "these" and "those" for that matter? Personally, I like "bookshelf" and "desktop", "lasagne" and "mac & cheese", "Switzerland" and "New Jersey", "MIT" and "Harvard", or "girls" and "boys".

² The little white book on C with the big, pale blue C on the cover. My, if that (and/or the contents) doesn't just say "Software = fun"?


Author's note: The preceding is the introduction to an essay I wrote for myself and some friends while we were learning C programming [in the heady pre-bust times of 1998 or 9 — ed.].


Eddie Kohler