This follows from a simple fact in calculus: for a positive decreasing function f, the sum f(1) + f(2) + … + f(n) is sandwiched between the integral of f from 1 to n+1 and f(1) plus the integral of f from 1 to n. Applying this with f(x) = 1/x (whose antiderivative is ln(x)) gives the following inequality:

ln(n+1) ≤ 1 + 1/2 + … + 1/n ≤ 1 + ln(n)
From this we can conclude that S = 1 + 1/2 + … + 1/n is both Ω(log(n)) and O(log(n)), thus it is Θ(log(n)); the bound is tight.
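The sandwich bounds are easy to verify numerically. Below is a minimal sketch in Python (the function name `harmonic` is my own, not from the original answer) that checks ln(n+1) ≤ S ≤ 1 + ln(n) for a few values of n:

```python
import math

def harmonic(n):
    """Compute the partial harmonic sum S = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1_000, 100_000):
    s = harmonic(n)
    lower = math.log(n + 1)    # integral of 1/x from 1 to n+1
    upper = 1 + math.log(n)    # f(1) plus integral of 1/x from 1 to n
    assert lower <= s <= upper
    print(f"n={n}: ln(n+1)={lower:.4f} <= S={s:.4f} <= 1+ln(n)={upper:.4f}")
```

Since both bounds grow like ln(n) and differ by at most 1, the ratio S / ln(n) tends to 1 as n grows, which is exactly the Θ(log(n)) claim.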