
Growth Rates

In working with the big Oh notation, we cavalierly discard the multiplicative constants. The functions f(n) = 0.001n^2 and g(n) = 1000n^2 are treated identically, even though g(n) is a million times larger than f(n) for all values of n. The method behind this madness is illustrated by the figure below, which tabulates the growth rate of several functions arising in algorithm analysis, on problem instances of reasonable size. Specifically, it shows how long it takes for algorithms that use f(n) operations to complete on a fast computer where each operation takes one nanosecond (10^-9 seconds). Study the table for a few minutes and the conclusions become apparent.

 
Figure: Growth rates of common functions measured in nanoseconds
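The table itself does not survive the conversion, so here is a minimal Python sketch that regenerates a table of the same kind under the one-nanosecond-per-operation assumption stated above. The particular functions and problem sizes are illustrative choices, not a transcription of the original figure.

```python
import math

NS_PER_OP = 1e-9  # assumed cost of one elementary operation: one nanosecond

# Representative growth functions from algorithm analysis. The exact set of
# columns and problem sizes below is an illustrative choice.
FUNCS = [
    ("lg n",   lambda n: math.log2(n)),
    ("n",      lambda n: float(n)),
    ("n lg n", lambda n: n * math.log2(n)),
    ("n^2",    lambda n: float(n) ** 2),
    ("2^n",    lambda n: 2.0 ** n),                      # overflows for big n
    ("n!",     lambda n: math.exp(math.lgamma(n + 1))),  # n! via log-gamma
]

def pretty(seconds):
    """Express a running time in the largest unit that keeps it >= 1."""
    for name, size in [("years", 365.0 * 86400), ("days", 86400.0),
                       ("hr", 3600.0), ("min", 60.0), ("sec", 1.0),
                       ("ms", 1e-3), ("us", 1e-6)]:
        if seconds >= size:
            return f"{seconds / size:.3g} {name}"
    return f"{seconds / 1e-9:.3g} ns"

print("n".rjust(10) + "".join(name.rjust(16) for name, _ in FUNCS))
for n in [10, 20, 30, 40, 50, 100, 1000, 10**6, 10**9]:
    row = f"{n:>10}"
    for _, f in FUNCS:
        try:
            row += pretty(f(n) * NS_PER_OP).rjust(16)
        except OverflowError:  # value exceeds float range: effectively forever
            row += "forever".rjust(16)
    print(row)
```

Running it makes the moral of the table concrete: the polynomial columns stay in the nanosecond-to-minute range even for enormous n, while the 2^n and n! columns blow past the age of the universe before n reaches 100.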

The bottom line is that even by ignoring constant factors, we can get an excellent idea of whether a given algorithm will be able to run in a reasonable amount of time on a problem of a given size. An algorithm whose running time is 0.000001 n^2 seconds will beat one whose running time is 1,000,000 n seconds only so long as n < 10^12. Such enormous constant factor differences between algorithms occur in practice far less frequently than such large problems do.
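A quick sketch can confirm that crossover point; the two cost functions below encode the running times quoted above (0.000001 n^2 seconds and 1,000,000 n seconds):

```python
# The two running times from the text: a quadratic algorithm with a tiny
# constant versus a linear algorithm with a huge one.
def quadratic_time(n):   # 0.000001 * n^2 seconds
    return 1e-6 * n * n

def linear_time(n):      # 1,000,000 * n seconds
    return 1e6 * n

# Algebraically: 1e-6 * n^2 < 1e6 * n  <=>  n < 1e12, so the quadratic
# algorithm wins only below a trillion, despite its 10^12-fold advantage
# in constant factor. (At n = 1e12 exactly, the two tie.)
for n in (10**11, 10**12, 10**13):
    faster = "quadratic" if quadratic_time(n) < linear_time(n) else "linear"
    print(f"n = {n:.0e}: {faster} algorithm is faster")
```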

