# Big O

Ref - Cracking the coding interview

  • Big O is the metric used to describe the efficiency (runtime) of algorithms

  • Example

    • Travel or download data
      • Travel - Use airplane, car, etc
      • Download - using internet, ftp, email, etc
    • Which is faster? Depends on the data size
      • Big (lots of TB) - airplane is faster
      • Small - email is faster
    • In Big O terms
      • Electronic transfer - O(size) - time grows linearly with the size of the file
      • Airplane - O(1) - time is constant (size doesn't matter)

# Time Complexity

  • Common runtimes, fastest to slowest (not an exhaustive list)

    • O(log N)
    • O(N)
    • O(N log N)
    • O(N^2) (N squared)
    • O(2^N) (2 raised to N)
    • O(X)
    • O(XY) (runtime can depend on any number of variables)
  • Best / Worst / Expected case

    • Ex: quicksort (uses a pivot) on an array
      • Best case - O(N) - if all items are equal, some implementations traverse the array just once. {Not very useful in practice.}
      • Worst case - O(N^2) - if the pivot is repeatedly the largest remaining element, each partition removes only one item.
      • Expected case - O(N log N) - the best and worst cases are rare.
      • Worst = Expected - for most algorithms these are the same almost all the time; quicksort is a notable exception.
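A minimal quicksort sketch (using the last element as the pivot; an illustrative version, not the book's exact code) makes the worst case concrete: on an already-sorted array the pivot is always the largest remaining element, so each partition peels off only one item, giving O(N^2).

```javascript
// Quicksort using the last element as the pivot.
// On a sorted array, every pivot is the maximum, so one side of the
// partition is always empty -> N levels of recursion -> O(N^2).
function quicksort(arr) {
  if (arr.length <= 1) return arr;
  const pivot = arr[arr.length - 1];
  const rest = arr.slice(0, -1);
  const smaller = rest.filter(x => x <= pivot); // items on the pivot's left
  const larger = rest.filter(x => x > pivot);   // items on the pivot's right
  return [...quicksort(smaller), pivot, ...quicksort(larger)];
}

quicksort([3, 1, 4, 1, 5]); // [1, 1, 3, 4, 5]
```

Randomizing the pivot choice is what makes the O(N log N) expected case overwhelmingly likely in practice.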

# Space Complexity

  • O(N) - array with N items
  • O(N^2) - 2D array with N x N items

Stack space used by recursive calls counts toward space complexity

  • Example 1 - Sum of 0 to n

    • Time = O(n)

    • Space = O(n) -> (Recursion uses stack)

      function sum(n) {
        if (n <= 0) {
          return 0;
        }
        // Each call adds a frame to the stack: sum(n), sum(n-1), ..., sum(0)
        return n + sum(n - 1);
      }
      
  • Example 2 - Sum of adjacent pairs from 0 to n

    • Time = O(n)

    • Space = O(1) -> each add() call returns before the next one starts, so frames never pile up on the stack

      function sum(n) {
        var sum = 0;
        for (let i = 0; i < n; i++) {
          // add() returns before the next call, so only one extra frame exists at a time
          sum += add(i, i + 1);
        }
        return sum;
      }
      function add(x, y) {
        return x + y;
      }
      

# Drop the Constants

  • For specific inputs, a "slower" Big O can run faster in practice
    • O(N) code can beat O(1) code
    • O(N^2) code can beat O(N) code
  • So Big O only describes the rate of increase as N grows.
  • This is why constant factors are not counted in the runtime.
    • Means O(2N) is O(N) - the 2 is dropped
// 1 loop over arr - O(N)
for (const x of arr) {
  if (x < min) { min = x; }
  if (x > max) { max = x; }
}
// 2 loops over arr - O(2N) = O(N), same Big O
for (const x of arr) {
  if (x < min) { min = x; }
}
for (const x of arr) {
  if (x > max) { max = x; }
}

# Drop the Non-Dominant terms

  • Least -> Most
    • (log N) < (N) < (N log N) < (N^2) < (2^N) < (N!)
  • Examples
    • O(N + N) -> O(2N) -> O(N) (drop the 2)
    • O(N^2 + N) -> O(N^2) (drop N, which is not dominant)
    • O(N + log N) -> O(N) (drop log N)
    • O(2^N + 1000N^100) -> O(2^N) (2^N dominates any polynomial)
    • O(B^2 + A) (nothing dropped - without knowing A and B, dominance can't be determined)
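To see a non-dominant term vanish numerically, here is a small hypothetical step counter for code that does N^2 work followed by N work:

```javascript
// Count the basic steps of an N^2 loop followed by an N loop: N^2 + N total.
function steps(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) count++; // N^2 steps
  }
  for (let i = 0; i < n; i++) count++;   // N more steps
  return count;
}

steps(1000) / (1000 * 1000); // 1.001 - already almost pure N^2
```

As N grows, the ratio of total steps to N^2 approaches 1, which is why the N term is dropped.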

# Multipart algorithms - Add vs Multiply

// Add ---> O(A + B): the loops run one after the other
for (const a of arrA) {
  // A iterations
}
for (const b of arrB) {
  // B iterations
}
// Multiply ----> O(A * B): the inner loop runs fully for every outer iteration
for (const a of arrA) {
  for (const b of arrB) {
    // A * B iterations total
  }
}

# Amortized time

  • An operation that is occasionally expensive but cheap most of the time; the occasional cost is spread (amortized) across many calls.
    • Classic example: a dynamic array (ArrayList) doubles its capacity when full. The doubling insert costs O(N), but N inserts cost O(N) total, so each insert is O(1) amortized.
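The standard illustration of amortized time is a dynamic array that doubles its capacity when full. A sketch (hypothetical class and field names), with a counter showing that the expensive copies stay rare:

```javascript
// Dynamic array that doubles capacity when full.
// A resize copies all current items (O(N)), but resizes happen only at
// sizes 1, 2, 4, 8, ..., so N appends cause about 1 + 2 + 4 + ... ≈ 2N
// copies in total: O(1) amortized per append.
class DynamicArray {
  constructor() {
    this.capacity = 1;
    this.size = 0;
    this.data = new Array(this.capacity);
    this.copies = 0; // counts element copies caused by resizes
  }
  append(value) {
    if (this.size === this.capacity) {
      this.capacity *= 2;
      const bigger = new Array(this.capacity);
      for (let i = 0; i < this.size; i++) {
        bigger[i] = this.data[i];
        this.copies++;
      }
      this.data = bigger;
    }
    this.data[this.size++] = value;
  }
}

const a = new DynamicArray();
for (let i = 0; i < 1000; i++) a.append(i);
// a.copies stays under 2 * 1000, even though a single resize costs O(N)
```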

# O(Log N)

  • The base of the log doesn't matter for Big O (bases differ only by a constant factor).

  • Ex: binary search halves the remaining items on each iteration, so it takes O(log N) time.

    • If N = 16, it takes at most 4 halvings to get down to one item: 16 -> 8 -> 4 -> 2 -> 1
    • 2^4 = 16 --> log2 16 = 4
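The halving described above is exactly what binary search on a sorted array does. A minimal sketch:

```javascript
// Binary search on a sorted array: each step halves the search range,
// so it takes at most about log2(N) steps. For N = 16 that's ~4 checks.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found
    if (sorted[mid] < target) lo = mid + 1; // discard left half
    else hi = mid - 1;                      // discard right half
  }
  return -1; // not found
}

binarySearch([1, 3, 5, 7, 9, 11], 7); // 3
```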

# Recursive

function foo(x){
  .....
  return foo(x-1) * foo(x-1);
}
foo(4);
  • Total function calls = total nodes in the call tree

    • Nodes (N) = branches^depth = 2^x
    • 16 = 2^4 (in the above example)
    • depth = x = log N
  • Time = O(N) = O(branches^depth)

    • Note - often true but not always
    • If depth is unknown, assume depth = log(N)
  • Space = O(depth) - only one branch of calls sits on the stack at a time

  • Memoization

    • In the Fibonacci series, instead of recalculating the same values again & again, we can store each computed value in an array.
    • Use the stored values to calculate the next one - a[x] = a[x-1] + a[x-2]
    • So even in a recursive call, a stored value is returned in constant time.
    • Time = O(N) [reducing ~2^N calls (16 for N = 4) to ~N (4)]
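A memoized Fibonacci sketch (the standard technique, written here from scratch) showing how each value is computed only once:

```javascript
// Fibonacci with memoization: each fib(i) is computed once, cached,
// and returned in O(1) on later calls, so time drops from O(2^N) to O(N).
function fib(n, memo = {}) {
  if (n <= 1) return n;                      // base cases: fib(0)=0, fib(1)=1
  if (memo[n] !== undefined) return memo[n]; // cached - constant-time return
  memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
  return memo[n];
}

fib(10); // 55
```

Without the memo, fib(50) would take billions of calls; with it, the answer is immediate.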


Last Updated: 4/7/2021, 10:29:29 AM