What is the difference between Θ(n) and O(n)?

Short explanation:

If an algorithm is Θ(g(n)), it means that the running time of the algorithm, as n (the input size) gets larger, is proportional to g(n).

If an algorithm is O(g(n)), it means that the running time of the algorithm, as n gets larger, is at most proportional to g(n).

Normally, even when people talk about O(g(n)), they actually mean Θ(g(n)), but technically there is a difference.


More technically:

O(n) represents an upper bound. Θ(n) means a tight bound. Ω(n) represents a lower bound.
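
In symbols, the standard textbook formulation of these three (as in CLRS, written here in LaTeX notation) is:

    f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 :\ 0 \le f(n) \le c\,g(n)                    \text{ for all } n \ge n_0
    f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 :\ 0 \le c\,g(n) \le f(n)                    \text{ for all } n \ge n_0
    f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 :\ c_1\,g(n) \le f(n) \le c_2\,g(n)   \text{ for all } n \ge n_0

The constants c, c1, c2, and n0 are witnesses you must exhibit to prove each claim.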

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))

Basically, when we say an algorithm is O(n), it's also O(n^2), O(n^1000000), O(2^n), … but a Θ(n) algorithm is not Θ(n^2).
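
To see this with a concrete function (picked just for illustration), take f(n) = 2n + 3:

    % f(n) = 2n + 3 is O(n): take c = 5 and n_0 = 1, since
    2n + 3 \le 5n \quad \text{for all } n \ge 1
    % Hence it is also O(n^2), O(n^{1000000}), O(2^n), ... (weaker upper bounds).
    % But f(n) is not \Theta(n^2): no c_1 > 0 satisfies c_1 n^2 \le 2n + 3 for all
    % large n, because (2n + 3)/n^2 \to 0 as n \to \infty.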

In fact, f(n) = Θ(g(n)) means that for sufficiently large values of n, f(n) can be bounded between c1·g(n) and c2·g(n) for some positive constants c1 and c2, i.e. the growth rate of f is asymptotically equal to that of g: g is both a lower bound and an upper bound of f. This directly implies that f is both a lower bound and an upper bound of g as well. Consequently,

f(x) = Θ(g(x)) iff g(x) = Θ(f(x))
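
For instance, with the concrete pair n^2 + n and n^2 (chosen just for illustration):

    % witnesses: n^2 \le n^2 + n \le 2n^2 for all n \ge 1
    n^2 + n = \Theta(n^2) \quad \text{and, symmetrically,} \quad n^2 = \Theta(n^2 + n)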

Similarly, to show f(n) = Θ(g(n)), it's enough to show that g is an upper bound of f (i.e. f(n) = O(g(n))) and that g is a lower bound of f (i.e. f(n) = Ω(g(n)), which is exactly the same thing as g(n) = O(f(n))). Concisely,

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and g(x) = O(f(x))
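
As a quick sanity check (not a proof; asymptotic claims have to be established symbolically, as above), here is a small Python sketch for the concrete pair f(n) = n^2/2 + 3n and g(n) = n^2. If f = Θ(g), the ratio f(n)/g(n) must eventually stay between two positive constants:

    # Numerical illustration (not a proof) that f(n) = n^2/2 + 3n is Θ(n^2):
    # the ratio f(n)/g(n) settles between c1 = 0.5 and c2 = 1 for n >= 10.
    def f(n):
        return n * n / 2 + 3 * n

    def g(n):
        return n * n

    for n in (10, 100, 1000, 10000, 100000):
        print(n, f(n) / g(n))
    # Prints ratios 0.8, 0.53, 0.503, 0.5003, 0.50003 -> approaching 1/2,
    # so c1 * g(n) <= f(n) <= c2 * g(n) holds with c1 = 0.5, c2 = 1.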


There are also little-oh (o) and little-omega (ω) notations, representing loose upper and loose lower bounds of a function.

To summarize:

f(x) = O(g(x)) (big-oh) means that the growth rate of f(x) is asymptotically less than or equal to the growth rate of g(x).

f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is asymptotically greater than or equal to the growth rate of g(x).

f(x) = o(g(x)) (little-oh) means that the growth rate of f(x) is asymptotically less than the growth rate of g(x).

f(x) = ω(g(x)) (little-omega) means that the growth rate of f(x) is asymptotically greater than the growth rate of g(x).

f(x) = Θ(g(x)) (theta) means that the growth rate of f(x) is asymptotically equal to the growth rate of g(x).
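
When the limit of f(x)/g(x) as x → ∞ exists, these five relationships can be read off from that limit. This is a handy mnemonic rather than the definition, since the limit need not exist in general (the actual definitions use the constants above):

    \lim_{x \to \infty} \frac{f(x)}{g(x)} = 0                                  \implies f(x) = o(g(x))      \text{ (hence also } O(g(x))\text{)}
    \lim_{x \to \infty} \frac{f(x)}{g(x)} = c \text{ with } 0 < c < \infty     \implies f(x) = \Theta(g(x))
    \lim_{x \to \infty} \frac{f(x)}{g(x)} = \infty                             \implies f(x) = \omega(g(x)) \text{ (hence also } \Omega(g(x))\text{)}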

For a more detailed discussion, you can read the definition on Wikipedia or consult a classic textbook like Introduction to Algorithms by Cormen et al.
