
No. Big O gives an asymptotic upper bound on how an algorithm's running time grows as the input size grows. It does not say anything about the performance of a specific program on a specific input.

The author’s point is that if we have two different inputs A and B, and A is sufficiently smaller than B, then processing B will dominate the program's runtime. This holds even if we have to run a very slow algorithm on A.

For example, suppose you have some extreme scenario where you have to 3-color A and then find the minimum value of B. The total runtime is O(2^|A|) + O(|B|). Just plug in numbers: if |B| > 2^|A|, then the pass over B takes longer, even though its algorithm is much faster. If B is always much larger than 2^|A|, then B is your bottleneck.
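A quick sketch of that arithmetic (the sizes below are made-up numbers chosen only to illustrate the crossover, not anything from the article):

```python
# Compare the two abstract cost terms: an exponential algorithm on a
# small input A versus a linear pass over a large input B.
def cost_terms(a_size: int, b_size: int) -> tuple[int, int]:
    """Return (exponential cost 2^|A|, linear cost |B|) in abstract operations."""
    return 2 ** a_size, b_size

exp_cost, lin_cost = cost_terms(a_size=20, b_size=10_000_000)
# 2^20 = 1,048,576 operations for the "slow" exponential step,
# 10,000,000 operations for the "fast" linear step: the linear
# pass over B is the bottleneck because |B| > 2^|A|.
print(exp_cost, lin_cost)
```

Swap in a_size=30 and the exponential term wins again, which is the whole point: Big O terms only tell you the bottleneck once you know the relative sizes of the inputs.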



