This blog post is a quick reply to Let’s Stop Copying C.
To begin with, I agree with most of the things Eevee wrote. However, I think she went a little too far and presented some of her personal opinions as if they were facts. I think that as programmers, we need to be more honest about these kinds of things, so here goes my rebuttal of some of the points she made.
What’s Wrong with Integer Division?
The point the author makes about integer division is that it can confuse beginners. Sure enough, it can! But what can’t confuse beginners? After all, they are… beginners. This is consequently not a great argument against integer division, or any other language feature, for that matter.
However, the behavior of integer division is very useful. I would argue that in most cases, one expects it to behave just as it does. The reason for it is simple: when one is working with integers, one often has a problem related to integers in the first place. And we don’t want to involve nasty floating-point operations when, say, trying to zip a list of indices with their corresponding array elements. Performance is not even the point; floating-point numbers are much more complex (and a lot harder to use correctly even for experienced programmers) than integers, and above all, they have different semantics. I think that if one wants to escape the nice and friendly realm of closed operations, one should indicate it explicitly, via a cast – exactly how it’s done in C. Or maybe have another operator that does floating-point division, I don’t mind that either.
What’s Wrong with Increment/Decrement?
Bashing the increment and decrement operators has become quite popular since Swift took the side of omitting them. After all, if Apple does it, it can only be good, right?
Unfortunately, Eevee seems to fall for the extremely common "++ is equivalent to += 1" fallacy. You can see that the statement is fallacious because, in the end, even the author herself admits that there are things that can't be implemented in terms of "+= 1"; for instance, incrementing non-random-access iterators.
But look, there’s an even more direct argument: the ++ and -- operators are not even “one” operator. There is a prefix and a postfix variation of them that do slightly different things. The usual semantics is that the prefix ones evaluate to the already-modified value, while the postfix ones yield the original value. This can be very convenient: in different contexts, code might require the value of one state or the other, so having both versions can lead to more concise code and fewer off-by-one errors. Therefore, one can’t simply say that “++ is equivalent to += 1“, as it is simply not true: it’s not equivalent to at least one of them. In particular, in C, it’s not equivalent to the postfix increment operator.
What’s Wrong with Braces and Spaces? (And Semicolons Too)
Now, let’s pull out the big guns. The article argues that brace-based blocks and indentation-based statement grouping are equivalent and that having both is redundant. It also claims that terminating semicolons fall into the same category and should be replaced with newlines. This is very much not true, though. Spaces are for the human eye, while braces are for the compiler. In a language where whitespace is significant, automatic re-indentation becomes impossible, which is very annoying. I’m no Python expert, but I’ve run into hard-to-debug errors several times because I cut and pasted some code into a function during refactoring, and it misbehaved because its indentation didn’t happen to match the surrounding indent level.
With regard to semicolons: you can’t just interpret every newline as if it were a semicolon, because newlines become context-sensitive that way. For example, after a function declaration, a newline doesn’t imply the end of a statement. And now lexing has become context-sensitive too, and it’s entangled with parsing, and it’s a pain in the ass to write, let alone to write correctly. It’s very confusing to see the author advocate for such a misfeature just after having argued that type-first syntax makes parsing hard.
Treating newlines as statement terminators is hard for humans to get right, too, however appealing it might be at first glance. For example, what should this piece of code do?
1
+ 2
Should it return unit, or should it return 3? It depends on whether the newline is taken as a statement separator in this case. Blargh.
TL;DR: the designers of C weren’t idiots. Yes, there are many aspects of the language’s design that reflect the 40-year-old need for “speed above everything” and are consequently obsolete nowadays. But it would clearly be a mistake to shove every design decision into this one pigeonhole. The three points I’ve emphasized above are clearly not the results of short-sightedness – on the contrary, they show great insight into real-life problems programmers encounter in their daily work.