He didn't suggest that the test suite should account for all possible combinations of byte differences. He suggested that it should account for all possible SINGLE-byte differences, i.e., the suite should check two files with only the first byte different, with only the second byte different, and so on. Such a test suite would be linear in size, not exponential, and could easily be run before the heat death of anything.
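A minimal sketch of that linear suite, assuming a hypothetical `files_differ` function standing in for an invocation of FC.EXE: for an n-byte file, it generates exactly n test cases, each a pair of files differing in exactly one byte position.

```python
def files_differ(a: bytes, b: bytes) -> bool:
    """Hypothetical comparator under test (stand-in for FC.EXE)."""
    return a != b

def single_byte_diff_suite(n: int) -> bool:
    """Run n tests: a baseline file vs. a copy with only byte i changed."""
    baseline = bytes(n)  # n zero bytes
    for i in range(n):
        mutated = bytearray(baseline)
        mutated[i] ^= 0xFF  # flip every bit of byte i only
        if not files_differ(baseline, bytes(mutated)):
            return False  # comparator missed a single-byte difference
    return True

print(single_byte_diff_suite(128))  # prints True: n tests for an n-byte file
```

The loop runs once per byte position, so the suite grows linearly with file size.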
(However, to play my own devil's advocate, I'd have to say it's easy in retrospect to say "yes! there is an easy test for this that should have been written!" when in fact often the number of possible tests is astronomically large and it can be hard to pick the right ones. What if the bug was that FC.EXE didn't correctly register a difference when both the 127th and 128th bytes were the only differences? The proposed test suite would not have caught it.)
Your devil's-advocate argument is essentially the point of my post. I was trying to show that, while the parent's test suite runs in linear time, the number of tests in a fully comprehensive suite is exponential, so the runtime of any completely comprehensive test suite is exponential. Choosing which tests are worth spending resources on is a very difficult problem.