Isn't this a terrible failure of the compiler though? Why is it not just telling you that the `if` is a noop?? Damn, using IntelliJ and getting feedback on really difficult logic when a branch becomes unreachable and can be removed makes this sort of thing look like amateur hour.
Should the compiler emit a warning for such code? Compilers don't behave like a human brain; a specific diagnostic could perhaps be added by pattern-matching the AST, but it would never catch every case.
There's a world of difference between code that's dead because of a static define, and code that's dead because of an inference the compiler made.
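To make that distinction concrete, here's a minimal sketch in C (the define and function names are invented). The first branch is dead for a reason any reader can see in the source; the second is dead only because of a language-rule inference, which is exactly the case where a silent deletion surprises people:

```c
#include <limits.h>

#define ENABLE_TRACE 0   /* hypothetical static define */

int next_or_max(int x) {
    /* Dead because of a static define: ENABLE_TRACE is visibly always 0. */
    if (ENABLE_TRACE) {
        /* ...trace output... */
    }

    /* Dead because of an inference: signed overflow is UB, so the
       compiler may assume x + 1 never wraps and delete this check. */
    if (x + 1 < x)
        return INT_MAX;

    return x + 1;
}
```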
A dead code report would be a useful thing, though, especially if it could give the reason for removal. (Something like the list of removed registers in the Quartus analysis report when building for FPGAs.)
> There's a world of difference between code that's dead because of a static define, and code that's dead because of an inference the compiler made.
Not really; that's the problem. After many passes of optimization transforms, it is hard for the compiler to tell why a given piece of code is dead. Compiler writers aren't being malicious, despite what a lot of people seem to think when discussions like this come up.
Yeah, I know the compiler writers aren't being deliberately malicious. But I can understand why people perceive the end result - the compiler itself - as having become "lawful evil" - an adversary rather than an ally.
Fair point; however, your example is a runtime check, so it shouldn't result in dead code.
(And if DEBUG is a static define then it still won't result in dead code since the preprocessor will remove it, and the compiler will never actually see it.)
EDIT: and now I realise I misread the first example all along - I read "#if (DEBUG)" rather than "if (DEBUG)".
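For anyone else who misread it the same way, a minimal sketch of the difference (the DEBUG value and function are assumptions for illustration):

```c
#include <stdio.h>

#define DEBUG 0

/* Returns how many debug lines were emitted. */
int emit_debug(void) {
    int emitted = 0;

#if DEBUG                /* preprocessor: this body is stripped before
                            the compiler ever sees it */
    puts("debug (preprocessor)");
    emitted++;
#endif

    if (DEBUG) {         /* compiler: this branch reaches the optimizer,
                            which then folds it away as dead code */
        puts("debug (compiler)");
        emitted++;
    }

    return emitted;
}
```

Both branches end up removed, but only the second one ever exists from the compiler's point of view.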
I am guessing there would be a LOT of false positives from compilers removing dead code for good reason. For example, if you only use a portion of a library's enum, it seems reasonable to me that the compiler optimizes away all the if-else branches that handle enum values which will never manifest.
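A sketch of what that enum case might look like (the enum and values are invented; assume this translation unit only ever passes the first two values):

```c
/* Hypothetical library enum; suppose every call site in this
   translation unit uses COLOR_RED or COLOR_GREEN only. */
enum color { COLOR_RED, COLOR_GREEN, COLOR_BLUE };

static int brightness(enum color c) {
    switch (c) {
    case COLOR_RED:   return 30;
    case COLOR_GREEN: return 59;
    case COLOR_BLUE:  return 11;  /* with inlining/LTO the compiler may
                                     prove this branch unreachable and
                                     drop it; a removal nobody would
                                     want a warning about */
    }
    return 0;
}
```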
I don't think it's unreasonable to have an option for "warn me about places that might be UB" that would tell you when it removes something it thinks is dead because it assumed UB doesn't happen.
The focus was certainly much more on optimization than on good warnings (although many commercial products do focus on that). I would not blame compiler vendors exclusively; paying customers certainly prioritized this too.
This is shifting though; e.g. GCC now has -fanalyzer. It does not detect this specific coding error, but it does catch issues such as dereferencing a pointer before checking it for null.