Complexity or Order of Growth in Programming




Complexity, or Order of Growth, of a program plays a major role in efficient programming. It gives an idea of how the running time of your code grows as the size of its input grows.
  • Constant Complexity O(1) : constant running time
  • Linear Complexity O(n) : linear running time
  • Logarithmic Complexity O(log n) : logarithmic running time
  • Log-Linear Complexity O(n log n) : log-­linear running time
  • Polynomial Complexity O(n^c) : polynomial running time (c is a constant)
  • Exponential Complexity O(c^n) : exponential running time (c is a constant being raised to a power based on size of input)
These terms alone don't explain much, so let's go through a simple example of each of them.

O(1): Constant Complexity

The code runs in a fixed amount of time, regardless of any input. Consider the following code:

def constant():
    i = 0
    while i < 10:
        print(i)
        i = i + 1

Output:
>>> constant()
0
1
2
3
4
5
6
7
8
9

The while loop always runs exactly 10 times, a fixed (i.e. constant) number of steps that does not depend on any input, hence the term Constant Complexity.
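As another illustration (an example of my own, not from the course), a single list lookup takes the same amount of time whether the list has three elements or a million, so it is also O(1):

def first_element(lst):
    # One index lookup is a single step, no matter how long lst is,
    # so the running time does not grow with the input: O(1)
    return lst[0]

>>> first_element([7, 3, 9])
7
>>> first_element(list(range(10**6)))
0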


O(n) : Linear Complexity

The running time depends linearly on the value passed into the function. Consider a function that returns a^b (a to the power b):

def iterPower(a, b):
    result = 1
    while b > 0:
        result *= a
        b -= 1
    return result

Here the number of times the while loop iterates is exactly the value of b passed in, so the Complexity or Order of Growth of this program is O(n).
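For example (a quick check of my own, not from the course notes), calling the function with b = 10 runs the loop 10 times, and doubling b doubles the number of iterations:

>>> iterPower(2, 10)
1024
>>> iterPower(2, 20)
1048576

The work grows in direct proportion to b, which is exactly what linear growth means.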

O(log n) : Logarithmic Complexity

Consider the following code, which converts a non-negative integer to its string representation, one digit at a time:

def intToStr(i):  # assumes i is a non-negative integer
    digits = '0123456789'
    if i == 0:
        return '0'
    result = ''
    while i > 0:
        result = digits[i % 10] + result
        i = i // 10
    return result

Take a look at the while loop: how many times will it execute when we run this code?
It depends on how many times we can divide i by 10 before it reaches 0, which is roughly log₁₀(i).
e.g.: for i = 100, we can divide by 10 at most twice before reaching 1, and log₁₀(100) = 2;
        for i = 10000, we can divide 4 times, and log₁₀(10000) = 4.

Hence the Complexity or Order of Growth is O(log n).
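A quick check (my own example, not from the course): the loop runs once per decimal digit of i, so a 10-digit number needs only twice as many iterations as a 5-digit one, even though the number itself is 100,000 times larger:

>>> intToStr(12345)
'12345'
>>> intToStr(1234567890)
'1234567890'

The first call loops 5 times and the second only 10 times: growth proportional to log n, not n.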

O(n log n) : Log-Linear Complexity

One example of Log-Linear Order of Growth is the Merge Sort algorithm.

Say we have a list, l = [5, 2, 4, 7, 1, 3, 2, 6].
In the Merge Sort algorithm, we keep dividing the list in half until we reach the simplest form: lists of a single element that can be compared directly.

So we finally end up with a bunch of singleton lists. This repeated halving takes O(log n) levels of splitting.

Now we merge the pieces back together, level by level, working from the singleton lists at the bottom back up to the full sorted list at the top.
At each level we iterate through every element once while merging, which is O(n) work, and there are O(log n) such levels from the splitting step. So the final result is O(n log n) Order of Growth.
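Here is a minimal sketch of Merge Sort in Python (my own illustration, not the exact code from the course), using the list from above:

def merge_sort(lst):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(lst) <= 1:
        return lst
    # Split step: halving the list gives the O(log n) number of levels
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge step: combining two sorted halves touches every element
    # once, which is the O(n) work done at each level
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

>>> merge_sort([5, 2, 4, 7, 1, 3, 2, 6])
[1, 2, 2, 3, 4, 5, 6, 7]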
The remaining complexities will be discussed later as we proceed. Stay in touch meanwhile, or choose another interesting topic in Python.

Note:
This is part of what I learned in an online OpenCourseWare course offered by MIT on edX.
It's for my personal reference and also for those who missed the course.
You too can enroll on edX (if they are still offering the course MITx 6.00x).