Constant vs Linear Time

Now that you can count operations, let's categorize them. Some operations always take the same time, while others grow with input size. Understanding this distinction is fundamental to analyzing algorithm performance.

One Step or Many?

The key question when analyzing code is: Does the number of operations depend on the input size?

  • Constant time: Operations stay the same regardless of input size
  • Linear time: Operations grow proportionally with input size

This difference becomes dramatic as data grows larger.

Constant Time: O(1)

Constant time operations take the same number of steps regardless of input size. We write this as O(1) (pronounced "big-O of one" or "order one").

The most common O(1) operation is array access by index:

Constant Time: Array Access

Whether the array has 3 elements or 1 million elements, accessing arr[index] takes the same small, fixed number of steps. The computer uses a simple formula to calculate the memory address:

address = base_address + (index × element_size)

This formula is a few arithmetic operations—always the same, no matter how large the array is.
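To make this concrete, here is a minimal sketch of the address arithmetic. The base address and element size are made-up values, purely for illustration:

# Illustrative address calculation (values are made up)
base_address = 1000   # pretend the array starts at address 1000
element_size = 4      # e.g., 4 bytes per element

def address_of(index):
    # One multiply and one add, no matter how large the array is
    return base_address + index * element_size

print(address_of(5))        # 1020
print(address_of(999_999))  # 4000996, computed with the same two operations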

Examples of O(1) Operations

# All of these are O(1) - constant time

# 1. Variable assignment
x = 10

# 2. Array access by index
value = arr[5]

# 3. Array update by index
arr[3] = 20

# 4. Simple arithmetic
result = x + y

# 5. Comparison
is_greater = x > y

# 6. Getting array length (stored as metadata)
length = len(arr)

# 7. Append to end of dynamic array (amortized)
arr.append(42)

Linear Time: O(n)

Linear time operations grow proportionally with input size. We write this as O(n) (pronounced "big-O of n" or "order n").

The most common O(n) operation is searching an unsorted array:

Linear Time: Linear Search

To find a value, we must check elements one by one. In the worst case (target not found or at the end), we check all n elements.

  • Small array (n=3): up to 3 checks
  • Large array (n=10): up to 10 checks
  • Huge array (n=1,000,000): up to 1,000,000 checks

The operation count scales directly with n.
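One way to see this scaling is to count the comparisons directly. Here is a small sketch that instruments a linear search with a counter (the counter exists only for illustration):

# Count how many comparisons a linear search makes
def linear_search_counted(arr, target):
    comparisons = 0
    for value in arr:
        comparisons += 1
        if value == target:
            return comparisons
    return comparisons

for n in (3, 10, 1_000_000):
    arr = list(range(n))
    # Search for a value that isn't there, forcing the worst case
    print(n, linear_search_counted(arr, -1))
# Output: 3 3, 10 10, 1000000 1000000 -- the count tracks n exactly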

Examples of O(n) Operations

# All of these are O(n) - linear time

# 1. Linear search
def find_value(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1

# 2. Sum all elements
def sum_array(arr):
    total = 0
    for num in arr:
        total += num
    return total

# 3. Print all elements
def print_all(arr):
    for num in arr:
        print(num)

# 4. Find maximum
def find_max(arr):
    max_val = arr[0]
    for num in arr:
        if num > max_val:
            max_val = num
    return max_val

# 5. Count occurrences
def count_target(arr, target):
    count = 0
    for num in arr:
        if num == target:
            count += 1
    return count

Comparing the Two

Let's visualize how dramatically these two categories differ:

O(1) vs O(n) Graph

Notice how the O(1) line stays flat while the O(n) line grows diagonally. As input size increases, the gap widens dramatically.

Here's a concrete comparison table:

Operation Count Comparison Table

Key observations:

  • O(1): Always 1 operation, even for 1 million elements
  • O(n): Operations grow proportionally—1 million elements means up to 1 million operations
  • The gap: At n=1,000,000, O(n) can require up to 1 million times as many operations as O(1)
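If you want to generate a comparison like this yourself, a small sketch such as the one below prints the worst-case operation counts side by side (the input sizes are arbitrary):

# Worst-case operation counts for O(1) vs O(n) at a few input sizes
for n in (10, 1_000, 1_000_000):
    constant_ops = 1  # array access by index: one step regardless of n
    linear_ops = n    # linear search, worst case: check every element
    print(f"n = {n:>9,}:  O(1) -> {constant_ops} op,  O(n) -> {linear_ops:,} ops")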

Recognizing Constant Operations

How to spot O(1):

  • No loops
  • No recursion
  • Direct access (array indexing, hash table lookup)
  • Simple arithmetic or comparisons
  • Variable assignments

Example:

def is_constant(arr):
    # All O(1) operations:
    first = arr[0]          # O(1): direct access
    last = arr[-1]          # O(1): direct access
    result = first + last   # O(1): arithmetic
    return result           # O(1): return

# Total: O(1) because no operations depend on array size

Recognizing Linear Operations

How to spot O(n):

  • Single loop through array/list
  • Visiting every element once
  • Searching unsorted data
  • Summing, counting, or processing all elements

Example:

def is_linear(arr):
    total = 0             # O(1)
    for num in arr:       # Loop runs n times
        total += num      # O(1) per iteration
    return total          # O(1)

# Total: O(1) + O(n) + O(1) = O(n)
# The loop dominates, so complexity is O(n)

Constants Don't Change Category

An important insight: constant factors don't change the category.

Consider these two examples:

# Example 1: Single loop - O(n)
def process_once(arr):
    for num in arr:
        print(num)  # 1 operation per element
# Total: n operations → O(n)

# Example 2: Two separate loops - still O(n)!
def process_twice(arr):
    for num in arr:
        print(num)  # n operations

    for num in arr:
        print(num * 2)  # another n operations

# Total: n + n = 2n operations → O(n)
# We drop the constant 2, so it's still O(n)

Why we drop constants:

In Big-O notation, we care about growth rate, not exact counts. As n gets large:

  • 2n and n both grow linearly
  • 100n and n both grow linearly
  • The constant factor becomes less important than the shape of growth

However, n and n² are fundamentally different: one is linear, one is quadratic. We'll explore this in the next lesson.
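A quick numeric check makes the distinction visible: dividing by n, a constant factor stays flat while an extra factor of n keeps growing. This sketch simply prints those ratios:

# Constant factors keep the same shape; an extra factor of n does not
for n in (10, 1_000, 100_000):
    print(f"n = {n:>7,}:  2n/n = {2 * n // n},  n^2/n = {n * n // n:,}")
# 2n/n is 2 for every n, but n^2/n keeps growing with n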

Why This Matters

Understanding O(1) vs O(n) helps you:

  1. Choose the right data structure: Arrays give O(1) access by index
  2. Avoid unnecessary loops: If you can access directly, don't search
  3. Recognize performance cliffs: Going from O(1) to O(n) can slow code dramatically
  4. Design better algorithms: Aim for O(1) operations when possible

Real-world impact:

  • Looking up a user by ID in a hash table: O(1)
  • Searching for a user by name in an unsorted list: O(n)
  • If you have 1 million users, that's the difference between 1 operation and up to 1 million operations!
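As an illustration of that gap, here is a small sketch with made-up user data. The dictionary lookup is O(1) on average, while finding a name in the list scans element by element:

# Hypothetical user data: one million users
users_by_id = {i: f"user_{i}" for i in range(1_000_000)}   # hash table keyed by ID
users_list = [f"user_{i}" for i in range(1_000_000)]       # plain list of names

# O(1) on average: the dict jumps straight to the entry for this ID
name = users_by_id[999_999]

# O(n): the list is scanned until a match is found (worst case: the last element)
position = users_list.index("user_999999")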

Key Takeaways

  • O(1) Constant: Operations stay the same regardless of input size

    • Array access by index: arr[5]
    • Variable assignment, arithmetic, comparisons
    • Getting length (stored as metadata)
  • O(n) Linear: Operations grow proportionally with input size

    • Single loop through array
    • Linear search
    • Sum, count, print all elements
  • Comparison: O(1) is dramatically faster than O(n) as data grows

  • Constants don't matter: O(2n) = O(n), both are linear

  • Growth rate matters: The shape of the curve matters more than exact operation counts

In the next lesson, we'll explore O(n²) quadratic time from nested loops, which grows even faster than linear!
