Computer science classes sometimes don't reflect reality
A recollection of my experience from my Systems Programming class.
We had a 50 MiB quota on a Debian oldstable server. There was no gdb,
no Valgrind, and no realistic way to build large tooling from source within the
quota.
In reality, one doesn't work under an arbitrarily tiny storage limit, because it directly fights productivity.
The TAs required us to submit .c code fragments, which they compiled and
linked against their own test runners. We were marked off for function naming
conflicts. Neither the TAs nor the students used static to limit a
function's scope to its compilation unit, and the class never covered static
or internal linkage at all. Given that, this failure mode fell squarely on the
test harness, not the students.
In reality one knows the API/ABI boundaries of their codebase. Random symbol collisions like this are rare.
The TAs also ran our code under Valgrind (or a similar memory leak checker)
and marked us off for leaks. The class never discussed Valgrind, glibc's
built-in heap checking facilities (MALLOC_CHECK_, mtrace), or even basic
allocator instrumentation. We were penalized for failing checks that we had
no tools, or instruction, to reproduce locally.
In reality, one has access to a debugger and memory tracer.
At one point, we were asked to find the tail node of a linked list. The list was guaranteed to be non-empty.
node *n = head; /* Invariant: head != NULL */
for (; n->next; n = n->next) {}
/* n is tail */
I submitted this without thinking much of it. Apparently it was seen as elegant and “advanced.”
In reality, this is just normal, idiomatic C.