 # Runtime Overview

If we are given hints about a problem's runtime requirements during an interview, we can roughly deduce the intended algorithm. The important thing about runtime complexity is that we usually ignore constant factors: we want the "family" of functions that most closely matches the growth in computational time. For instance, `O(2N)` and `O(5N)` can be loosely said to belong to the family of functions `O(N)`. Therefore, if a solution runs in precisely `2N` operations, the common convention is to simply call it `O(N)` and discard the constant. However, a sufficiently bad constant can still substantially increase real runtime, so in some specific cases it is worth optimizing an algorithm's constant. In most cases, this is unnecessary, and basic runtime analysis is enough to gauge how efficient a solution is.

As a general rule of thumb, a conservative estimate for servers is 10⁸ operations per second.
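That rule of thumb turns constraint sizes into a quick feasibility check: estimate the total operation count and divide by the per-second budget. A minimal sketch (the constant and the example sizes below are just illustrations of the estimate above):

```python
OPS_PER_SECOND = 10**8  # conservative rule-of-thumb estimate for a server

def estimated_seconds(operations):
  """Rough wall-clock estimate for a given operation count."""
  return operations / OPS_PER_SECOND

# An O(N^2) algorithm on N = 10**5 needs ~10**10 operations,
# about 100 seconds -- far too slow for a typical time limit.
```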

## `O(1)`

Constant time complexity. Could be

• Hashmap lookup
• Array access and update
• Pushing and popping elements from a stack
• Finding and applying a math formula
• Typically for `n > 10⁹`

The following code is `O(1)` because it executes a constant number of operations:

```python
a = 5
b = 7
c = 4
d = a + b + c + 153
```
```java
int a = 5;
int b = 7;
int c = 4;
int d = a + b + c + 153;
```
```cpp
int a = 5;
int b = 7;
int c = 4;
int d = a + b + c + 153;
```
```javascript
var a = 5;
var b = 7;
var c = 4;
var d = a + b + c + 153;
```

## `O(log(N))`

`log(N)` grows VERY slowly. `log₂(1,000,000)` is only about `20`. In fact, lookup by primary key in a relational database takes `O(log(N))` time (many mainstream relational databases such as PostgreSQL use B-trees for indexing by default, and searching in a B-tree is `O(log(N))`).
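To see just how slowly it grows, we can evaluate `log₂` directly (a quick sketch using Python's `math.log2`):

```python
import math

# Even for a billion elements, fewer than 30 halving steps are needed.
for n in (10**3, 10**6, 10**9):
  print(n, round(math.log2(n), 1))
```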

In coding interviews, `log(N)` typically means

• Binary search or variant
• Balanced binary search tree lookup
• Processing the digits of a number
• Typically for `n > 10⁸`

Note: Unless specified otherwise, we assume that `log(N)` refers to `log₂(N)`, or "log base 2 of N".

The following code is `O(log(N))` because `N` is halved each iteration, so the amount of work is logarithmic:

```python
N = 100000000
while N > 0:
  # some constant operation
  N //= 2
```
```java
int N = 100000000;
while (N > 0) {
  // some constant operation
  N /= 2;
}
```
```cpp
int N = 100000000;
while (N > 0) {
  // some constant operation
  N /= 2;
}
```
```javascript
var N = 100000000;
while (N > 0) {
  // some constant operation
  N = Math.floor(N / 2); // integer halving; plain N /= 2 leaves a fraction
}
```
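Binary search, the most common `O(log(N))` pattern in interviews, relies on the same idea: each comparison halves the remaining search space. A minimal sketch (the array and target in the test are illustrative):

```python
def binary_search(arr, target):
  """Return the index of target in sorted arr, or -1 if absent."""
  lo, hi = 0, len(arr) - 1
  while lo <= hi:
    mid = (lo + hi) // 2  # halve the search space each iteration
    if arr[mid] == target:
      return mid
    if arr[mid] < target:
      lo = mid + 1
    else:
      hi = mid - 1
  return -1
```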

## `O(N)`

Linear time typically means looping through a linear data structure a constant number of times. Most commonly, this means

• Two pointers
• Some types of greedy
• Tree/graph traversal
• Stack/Queue
• Typically for `n ≤ 10⁸`

The following examples are both `O(N)`:

```python
for i in range(1, N + 1):
  # constant time code

i = 0
while i < N:
  # constant time code
  i += 1
```
```java
for (int i = 1; i <= N; i++) {
  // constant time code
}

int i = 0;
while (i < N) {
  // constant time code
  i++;
}
```
```cpp
for (int i = 1; i <= N; i++) {
  // constant time code
}

int i = 0;
while (i < N) {
  // constant time code
  i++;
}
```
```javascript
for (let i = 1; i <= N; i++) {
  // constant time code
}

var i = 0;
while (i < N) {
  // constant time code
  i++;
}
```

Because we also ignore constant factors and lower-order terms, the following are also `O(N)`:

```python
for i in range(1, 5 * N + 17):
  # constant time code

for i in range(1, N + 538238):
  # constant time code
```
```java
for (int i = 1; i < 5 * N + 17; i++) {
  // constant time code
}

for (int i = 1; i < N + 538238; i++) {
  // constant time code
}
```
```cpp
for (int i = 1; i < 5 * N + 17; i++) {
  // constant time code
}

for (int i = 1; i < N + 538238; i++) {
  // constant time code
}
```
```javascript
for (let i = 1; i < 5 * N + 17; i++) {
  // constant time code
}

for (let i = 1; i < N + 538238; i++) {
  // constant time code
}
```

## `O(K log(N))`

• Heap push/pop `K` times. When you encounter problems that ask for the "top K elements", you can usually solve them by pushing to and popping from a heap `K` times, resulting in `O(K log(N))` runtime. e.g., K closest points, merge K sorted lists.
• Binary search K times.
• Typically for `n ≤ 10⁶`
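For instance, the `K` smallest elements of a list can be taken by heapifying once and then popping `K` times (a minimal sketch using Python's `heapq`; the sample input in the test is made up):

```python
import heapq

def k_smallest(nums, k):
  """Return the k smallest values: O(N) heapify, then K pops of O(log N) each."""
  heap = list(nums)
  heapq.heapify(heap)
  return [heapq.heappop(heap) for _ in range(k)]
```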

## `O(N log(N))`

• Sorting. The default sorting algorithm in all mainstream languages has an expected runtime of `O(N log(N))`. For example, Java uses a variant of merge sort for sorting objects and a variant of quicksort for sorting primitive types.
• Divide and conquer with a linear-time merge operation. The divide step produces `O(log(N))` levels of recursion, and if merging at each level is `O(N)`, the overall runtime is `O(N log(N))`. An example problem is counting smaller numbers to the right.
• Typically for `n ≤ 10⁶`
```python
N = int(input())
ar = []
for i in range(N):
  m = int(input())
  ar.append(m)
ar.sort() # O(N log N)
```
```java
int N = sc.nextInt();
ArrayList<Integer> ar = new ArrayList<>();
for (int i = 0; i < N; i++) {
  int m = sc.nextInt();
  ar.add(m);
}
Collections.sort(ar); // O(N log N)
```
```cpp
int N;
cin >> N;
vector<int> ar;
for (int i = 0; i < N; i++) {
  int m;
  cin >> m;
  ar.emplace_back(m);
}
sort(ar.begin(), ar.end()); // O(N log N)
```
```javascript
var N = parseInt(gets());
let ar = [];
for (let i = 0; i < N; i++) {
  var m = parseInt(gets());
  ar.push(m);
}
ar.sort((a, b) => a - b); // O(N log N); the default comparator sorts as strings
```
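The divide-and-conquer pattern from the second bullet can be sketched as a plain merge sort: `O(log(N))` levels of splitting, each level doing an `O(N)` merge.

```python
def merge_sort(arr):
  """Sort arr in O(N log N): log N levels of splits, O(N) merge per level."""
  if len(arr) <= 1:
    return arr
  mid = len(arr) // 2
  left = merge_sort(arr[:mid])
  right = merge_sort(arr[mid:])
  # linear-time merge of the two sorted halves
  merged, i, j = [], 0, 0
  while i < len(left) and j < len(right):
    if left[i] <= right[j]:
      merged.append(left[i])
      i += 1
    else:
      merged.append(right[j])
      j += 1
  return merged + left[i:] + right[j:]
```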

## `O(N^2)`

• Nested loops, e.g., visiting each matrix entry
• Many brute force solutions
• Typically for `n ≤ 10⁴`

For small `N < 1000`, this is not that bad for modern computers. You can probably pass most Leetcode tests with quadratic time for tiny `N`s. However, in an interview, your solution usually has to do better than quadratic time to impress the interviewer if there is a better solution.

This example is `O(N^2)` because the outer loop runs `O(N)` iterations and the inner loop `O(N)` as well:

```python
for i in range(1, N + 1):
  for j in range(1, N + 1):
    # constant time code
```
```java
for (int i = 1; i <= N; i++) {
  for (int j = 1; j <= N; j++) {
    // constant time code
  }
}
```
```cpp
for (int i = 1; i <= N; i++) {
  for (int j = 1; j <= N; j++) {
    // constant time code
  }
}
```
```javascript
for (let i = 1; i <= N; i++) {
  for (let j = 1; j <= N; j++) {
    // constant time code
  }
}
```

For this example, the outer loop runs `N` iterations, and the inner loop runs anywhere between `1` and `N` iterations. The total number of inner-loop iterations is `N + (N - 1) + ... + 1 = N(N + 1)/2`, which is `O(N^2)` once we drop the constant factor and the lower-order term.

```python
for a in range(1, N + 1):
  for j in range(a, N + 1):
    # constant time code
```
```java
for (int a = 1; a <= N; a++) {
  for (int j = a; j <= N; j++) {
    // constant time code
  }
}
```
```cpp
for (int a = 1; a <= N; a++) {
  for (int j = a; j <= N; j++) {
    // constant time code
  }
}
```
```javascript
for (let a = 1; a <= N; a++) {
  for (let j = a; j <= N; j++) {
    // constant time code
  }
}
```
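We can sanity-check the triangular count empirically by running the same nested loops and counting iterations (a quick sketch):

```python
def count_inner_iterations(N):
  """Count how many times the inner loop body above runs in total."""
  count = 0
  for a in range(1, N + 1):
    for j in range(a, N + 1):
      count += 1
  return count

# matches the closed form N(N + 1)/2
```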

## `O(2^N)`

Grows very rapidly. Often requires memoization to avoid repeated computations and reduce complexity.

• Combinatorial problems, backtracking, e.g. subsets
• Often involves recursion, making the time complexity harder to analyze at first sight
• Further detailed code examples can be found in the backtracking section
• Typically for `n ≤ 25`

A recursive Fibonacci algorithm without memoization is `O(2^N)`, because every `Fib(i)` with `i > 1` makes two recursive calls, `Fib(i - 1)` and `Fib(i - 2)`. (The true growth rate is roughly `1.618^N`, but `O(2^N)` is the conventional upper bound.)

```python
def Fib(n):
  if n == 0 or n == 1:
    return 1
  return Fib(n - 1) + Fib(n - 2)
```
```java
public static int Fib(int n) {
  if (n == 0 || n == 1) {
    return 1;
  }
  return Fib(n - 1) + Fib(n - 2);
}
```
```cpp
int Fib(int n) {
  if (n == 0 || n == 1) {
    return 1;
  }
  return Fib(n - 1) + Fib(n - 2);
}
```
```javascript
function Fib(n) {
  if (n === 0 || n === 1) {
    return 1;
  }
  return Fib(n - 1) + Fib(n - 2);
}
```
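Memoization removes the repeated work: caching each result means every `Fib(i)` is computed only once, dropping the runtime from exponential to `O(N)`. A sketch using Python's `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
  # each distinct n is computed once and cached, so total work is O(N)
  if n == 0 or n == 1:
    return 1
  return fib(n - 1) + fib(n - 2)
```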

## `O(N!)`

Grows insanely rapidly. Only solvable by computers for small `N`. Often requires memoization to avoid repeated computations and reduce complexity.

• Combinatorial problems, backtracking, e.g. permutations
• Often involves recursion, making the time complexity harder to analyze at first sight
• Detailed code examples can be found in the backtracking section
• Typically for `n ≤ 12`
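The factorial blow-up is easy to verify: generating all permutations of `N` items yields exactly `N!` of them (a quick check with `itertools.permutations`):

```python
from itertools import permutations
from math import factorial

for n in range(1, 8):
  assert len(list(permutations(range(n)))) == factorial(n)
# n = 12 already gives 12! = 479,001,600 permutations
```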

## Amortized Time Complexity

The idea of amortized time is that a very costly operation done only rarely gets "diluted" away by the cheap operations around it. For example, if a sequence of `N` operations consists mostly of `O(1)` steps with only an occasional `O(N)` step, the total cost is still `O(N)` rather than the `O(N^2)` we would get by pessimistically charging every operation `O(N)`.

The key takeaway is that amortized time is the average time taken per operation over the whole sequence of operations.

The following is an implementation of an amortized `O(1)` append function for a dynamically sized array:

```python
size = 0           # number of elements in the array
capacity = 1       # maximum number of elements the array can store
arr = [None] * capacity
...
def append(x):
  global arr, size, capacity
  if size == capacity:
    # costly O(N) resize, but it happens rarely enough
    # that appends are O(1) amortized
    new_arr = [None] * (2 * capacity)
    capacity *= 2
    for i in range(size):
      new_arr[i] = arr[i]
    arr = new_arr
  arr[size] = x
  size += 1
```
```java
int size; // number of elements in array
int capacity; // maximum number of elements array can store
int[] arr;
...
public static void append(int x) {
  if (size == capacity) {
    int[] newArr = new int[2 * capacity];
    capacity = capacity * 2;
    for (int i = 0; i < size; i++) {
      newArr[i] = arr[i];
    }
    arr = newArr;
  }
  arr[size] = x;
  size++;
}
```
```cpp
int size;     // number of elements in array
int capacity; // maximum number of elements array can store
int* arr;     // heap-allocated so it can be replaced when full
...
void append(int x) {
  if (size == capacity) {
    int* new_arr = new int[2 * capacity];
    capacity = capacity * 2;
    for (int i = 0; i < size; i++) {
      new_arr[i] = arr[i];
    }
    delete[] arr;
    arr = new_arr;
  }
  arr[size] = x;
  size++;
}
```
```javascript
var size;     // number of elements in array
var capacity; // maximum number of elements array can store
let arr = []; // `let`, not `const`, since we replace it on resize
...
function append(x) {
  if (size === capacity) {
    const newArr = new Array(2 * capacity);
    capacity = capacity * 2;
    for (let i = 0; i < size; i++) {
      newArr[i] = arr[i];
    }
    arr = newArr;
  }
  arr[size] = x;
  size++;
}
```
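We can confirm the amortized bound by counting element copies: appending `N` items with capacity doubling triggers fewer than `2N` copies in total, so each append averages `O(1)` (a self-contained sketch of the same doubling strategy):

```python
def append_n_with_doubling(n):
  """Append n items into a doubling array; return the total element copies."""
  capacity, size, copies = 1, 0, 0
  arr = [None] * capacity
  for x in range(n):
    if size == capacity:
      capacity *= 2
      new_arr = [None] * capacity
      for i in range(size):
        new_arr[i] = arr[i]  # each copy belongs to a rare O(N) resize
        copies += 1
      arr = new_arr
    arr[size] = x
    size += 1
  return copies
```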

## Big O Notation Practice

1. What is the asymptotic time bound of functions with these time complexities?
• `3N + 2N + N`
• `2N^3 + 5N^2`
• `N + log(N)`
• `N^2log(N)`
• `2^N + N^2`
• `10`
2. What is the time complexity of this code?
```python
N = int(input())
ar = [0] * N
for i in range(N):
  ar[i] = int(input())
max_val = ar[0]
for i in range(1, N):
  max_val = max(max_val, ar[i])
print(max_val)
```
```java
int N = sc.nextInt();
int[] ar = new int[N];
for (int i = 0; i < N; i++) {
  ar[i] = sc.nextInt();
}
int maxVal = ar[0];
for (int i = 1; i < N; i++) {
  maxVal = Math.max(maxVal, ar[i]);
}
System.out.println(maxVal);
```
```cpp
int N;
cin >> N;
int ar[N];
for (int i = 0; i < N; i++) {
  cin >> ar[i];
}
int max_val = ar[0];
for (int i = 1; i < N; i++) {
  max_val = max(max_val, ar[i]);
}
cout << max_val << endl;
```
```javascript
var N = parseInt(gets());
const ar = new Array(N);
for (let i = 0; i < N; i++) {
  ar[i] = parseInt(gets());
}
let maxVal = ar[0];
for (let i = 1; i < N; i++) {
  maxVal = Math.max(maxVal, ar[i]);
}
console.log(maxVal);
```
3. What is the time complexity of this code?
```python
N = int(input())
ar = [[0] * N for _ in range(N)]
for i in range(N):
  for j in range(N):
    ar[i][j] = int(input())

for i in range(N):
  ar[i].sort()
```
```java
int N = sc.nextInt();
int[][] ar = new int[N][N];
for (int i = 0; i < N; i++) {
  for (int j = 0; j < N; j++) {
    ar[i][j] = sc.nextInt();
  }
}
for (int i = 0; i < N; i++) {
  Arrays.sort(ar[i]);
}
```
```cpp
int N;
cin >> N;
int ar[N][N];
for (int i = 0; i < N; i++) {
  for (int j = 0; j < N; j++) {
    cin >> ar[i][j];
  }
}
for (int i = 0; i < N; i++) {
  sort(ar[i], ar[i] + N);
}
```
```javascript
var N = parseInt(gets());
const ar = Array.from({ length: N }, () => new Array(N));
for (let i = 0; i < N; i++) {
  for (let j = 0; j < N; j++) {
    ar[i][j] = parseInt(gets());
  }
}
for (let i = 0; i < N; i++) {
  ar[i].sort((a, b) => a - b);
}
```
4. What is the time complexity of this code?
```python
# assume an O(1) swap(a, b) function that swaps the values a and b
N = int(input())
ar = [[0] * N for _ in range(N)]
for i in range(N):
  for j in range(N // 2):
    swap(ar[i][j], ar[i][N - 1 - j])
```
```java
// assume an O(1) swap(a, b) function that swaps the values a and b
int N = sc.nextInt();
int[][] ar = new int[N][N];
for (int i = 0; i < N; i++) {
  for (int j = 0; j < N / 2; j++) {
    swap(ar[i][j], ar[i][N - 1 - j]);
  }
}
```
```cpp
// assume an O(1) swap(a, b) function that swaps the values a and b
int N;
cin >> N;
int ar[N][N];
for (int i = 0; i < N; i++) {
  for (int j = 0; j < N / 2; j++) {
    swap(ar[i][j], ar[i][N - 1 - j]);
  }
}
```
```javascript
// assume an O(1) swap(a, b) function that swaps the values a and b
var N = parseInt(gets());
const ar = Array.from({ length: N }, () => new Array(N));
for (let i = 0; i < N; i++) {
  for (let j = 0; j < N / 2; j++) {
    swap(ar[i][j], ar[i][N - 1 - j]);
  }
}
```
5. What is the time complexity of this code?
```python
# assume the log2(N) function takes O(1) time
# assume string concatenation takes O(1) time
def func(str, idx, len):
  if idx == len:
    print(str)
  else:
    func(str + "a", idx + 1, len)
    func(str + "b", idx + 1, len)
...
N = int(input())
func("", 0, int(log2(N)))
```
```java
// assume the log2(N) function takes O(1) time
// assume string concatenation takes O(1) time
public static void func(String str, int idx, int len) {
  if (idx == len) {
    System.out.println(str);
  } else {
    func(str + "a", idx + 1, len);
    func(str + "b", idx + 1, len);
  }
}
...
int N = sc.nextInt();
func("", 0, (int)log2(N));
```
```cpp
// assume the log2(N) function takes O(1) time
// assume string concatenation takes O(1) time
void func(string str, int idx, int len) {
  if (idx == len) {
    cout << str << endl;
  } else {
    func(str + "a", idx + 1, len);
    func(str + "b", idx + 1, len);
  }
}
...
int N;
cin >> N;
func("", 0, (int)log2(N));
```
```javascript
// assume the log2(N) function takes O(1) time
// assume string concatenation takes O(1) time
function func(str, idx, len) {
  if (idx === len) {
    console.log(str);
  } else {
    func(str + "a", idx + 1, len);
    func(str + "b", idx + 1, len);
  }
}
...
var N = parseInt(gets());
func("", 0, parseInt(log2(N)));
```
6. What is the time complexity of this code?
```python
N = int(input())
bin = []
while N != 0:
  bin.append(N)
  N //= 2
```
```java
int N = sc.nextInt();
ArrayList<Integer> bin = new ArrayList<>();
while (N != 0) {
  bin.add(N);
  N /= 2;
}
```
```cpp
int N;
cin >> N;
vector<int> bin;
while (N) {
  bin.emplace_back(N);
  N /= 2;
}
```
```javascript
var N = parseInt(gets());
let bin = [];
while (N > 0) {
  bin.push(N);
  N = Math.floor(N / 2); // integer halving so the loop runs O(log N) times
}
```