Classically, compilers treated "undefined behavior" as simply an excuse not to check for various types of errors: whatever happened, happened. But contemporary compilers have begun using undefined behavior to guide optimizations.
Consider this code:
int table[5];

bool does_table_contain(int v)
{
    for (int i = 0; i <= 5; i++) {
        if (table[i] == v) return true;
    }
    return false;
}
Classical compilers wouldn't notice that your loop limit was written incorrectly and that the last iteration reads off the end of the array. They would just perform the out-of-bounds read anyway and return true if the value one past the end of the array happened to match.
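The loop bound was presumably meant to be i < 5, which stays within the array; a corrected sketch of the presumed intent:

int table[5];

bool does_table_contain(int v)
{
    for (int i = 0; i < 5; i++) {  // i < 5: the last valid index is 4
        if (table[i] == v) return true;
    }
    return false;
}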
A post-classical compiler, on the other hand, might perform the following analysis:

- The first five times through the loop, the function might return true.
- When i = 5, the code performs undefined behavior. Therefore, the case i = 5 can be treated as unreachable.
- The case i = 6 (loop runs to completion) is also unreachable, because in order to get there, you first have to pass through i = 5, which we have already shown is unreachable.
- Therefore, all reachable code paths return true.
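In effect, the optimizer treats the undefined case as if the programmer had marked it unreachable explicitly. A hypothetical rendering using the __builtin_unreachable intrinsic (available in GCC and Clang) makes the assumption visible:

bool does_table_contain(int v)
{
    for (int i = 0; i <= 5; i++) {
        // The out-of-bounds read at i = 5 is undefined behavior,
        // so the compiler may assume this point is never reached.
        if (i == 5) __builtin_unreachable();
        if (table[i] == v) return true;
    }
    return false;
}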
The compiler would then simplify this function to
bool does_table_contain(int v)
{
    return true;
}
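This collapse is observable from ordinary code. In the hypothetical driver below (the main function is an assumption for illustration), table is zero-initialized, so a correct search for 42 would print 0; an optimizer performing the analysis above may nevertheless print 1:

#include <cstdio>

int table[5];

bool does_table_contain(int v)
{
    for (int i = 0; i <= 5; i++) {
        if (table[i] == v) return true;
    }
    return false;
}

int main()
{
    // table holds five zeros, so 42 is certainly not in it,
    // yet the optimized function may return true unconditionally.
    std::printf("%d\n", does_table_contain(42));
    return 0;
}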
Another way of looking at this optimization is that the compiler mentally unrolled the loop:
bool does_table_contain(int v)
{
    if (table[0] == v) return true;
    if (table[1] == v) return true;
    if (table[2] == v) return true;
    if (table[3] == v) return true;
    if (table[4] == v) return true;
    if (table[5] == v) return true;
    return false;
}
And then it realized that the evaluation of table[5] is undefined, so everything past that point is unreachable:
bool does_table_contain(int v)
{
    if (table[0] == v) return true;
    if (table[1] == v) return true;
    if (table[2] == v) return true;
    if (table[3] == v) return true;
    if (table[4] == v) return true;
    /* unreachable due to undefined behavior */
}
and then observed that all reachable code paths return true.
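The same reasoning applies to whole functions. The definition of being_a_bad_boy isn't reproduced here; all that matters is that every code path through it invokes undefined behavior. A minimal hypothetical stand-in (an assumed body, purely for illustration) might be:

struct T;  // placeholder type, as in the surrounding examples

T& being_a_bad_boy()
{
    T* p = nullptr;
    return *p;  // dereferencing a null pointer is undefined behavior
}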
A compiler that uses undefined behavior to guide optimizations would see that every code path through the being_a_bad_boy function invokes undefined behavior, and therefore the function can be reduced to
T& being_a_bad_boy()
{
    /* unreachable due to undefined behavior */
}
This analysis can then back-propagate into all callers of being_a_bad_boy:
void playing_with_fire(bool match_lit, T& t)
{
    kindle(match_lit ? being_a_bad_boy() : t);
}
Since any call to being_a_bad_boy invokes undefined behavior and is therefore unreachable, the compiler can conclude that match_lit must never be true, resulting in
void playing_with_fire(bool match_lit, T& t)
{
    kindle(t);
}
And now everything is catching fire regardless of whether the match is lit.
You may not see this type of undefined-behavior-guided optimization much in current-generation compilers, but like hardware acceleration in Web browsers, it's only a matter of time before it becomes mainstream.