2630. Memoize II

Problem Description

You need to create a higher-order function called memoize that takes any function fn as input and returns a memoized version of that function.

A memoized function is an optimized version that caches its results. When called with the same inputs multiple times, it returns the cached result instead of recalculating it. This optimization technique trades memory for speed by storing previously computed results.
Key Requirements:
- The input function fn can be any function with any number of parameters
- The function can accept any type of values as parameters
- Two inputs are considered identical if they are strictly equal (=== comparison)
- If the memoized function is called with identical inputs, it should:
  - Return the cached result from the first call
  - NOT execute the original function again
Example Behavior:
If you have a function that adds two numbers:
- First call: memoizedFn(2, 3) → executes the original function, caches the result 5, and returns 5
- Second call: memoizedFn(2, 3) → retrieves the cached result and returns 5 without executing the original function
- The original function is only called once despite multiple identical calls
The solution uses two Maps:
- idxMap: assigns a unique index to each unique parameter value (handles reference types correctly)
- cache: stores the computed results with keys generated from parameter indices
The caching key is created by mapping each parameter to its unique index and joining the indices with commas, ensuring that identical parameter combinations produce the same cache key.
Intuition

The core challenge in memoization is creating a reliable way to identify when we've seen the same inputs before. We need to map function arguments to their computed results.

The first thought might be to use a simple object or Map with the arguments directly as keys. However, this runs into issues:
- JavaScript objects only support string keys
- Converting arguments to strings (e.g., with JSON.stringify) doesn't work for all types (functions, undefined, symbols)
- We need to handle reference types correctly: two different objects with the same content should be treated as different inputs per the === requirement

The key insight is that we need a way to uniquely identify each distinct parameter value we encounter. Since === is our equality criterion, we can assign each unique value a numerical ID when we first see it. This works because:
- Primitive values that are === equal will map to the same ID
- Reference types (objects, arrays) will each get their own ID, since they are only === to themselves
With this ID system in place via idxMap, we can create cache keys by:
- Converting each parameter to its unique ID: params.map(getIdx)
- Joining these IDs into a string: .join(',')

This gives us a string key like "0,1,2" that uniquely represents a specific combination of parameters.

The cache Map then stores these string keys mapped to their computed results. When the memoized function is called:
- Generate the cache key from the parameters
- If the key exists in the cache, return the stored result
- If not, compute fn(...params), store it, and return it

This two-Map approach elegantly handles all data types while respecting JavaScript's === equality semantics, ensuring that the memoized function behaves exactly like the original except that it never computes the same inputs twice.
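The identity-vs-content distinction above can be demonstrated with a minimal Python sketch (make_id_assigner and get_id are our own illustrative names, not part of the article's solution; Python's id() stands in for the reference identity that a JavaScript Map key provides):

```python
def make_id_assigner():
    """Return a function that gives each distinct object a stable small ID."""
    ids = {}  # maps id(obj) -> assigned index

    def get_id(obj):
        # Note: id() values can be reused after garbage collection,
        # so this sketch assumes the arguments stay alive.
        key = id(obj)
        if key not in ids:
            ids[key] = len(ids)
        return ids[key]

    return get_id

get_id = make_id_assigner()

a = [1, 2, 3]
b = [1, 2, 3]  # equal content, but a different object

# Serialization-based keys would conflate the two distinct objects:
assert str(a) == str(b)

# Identity-based IDs keep them apart, matching === semantics:
assert get_id(a) != get_id(b)
# ...and the same object always maps back to the same ID:
assert get_id(a) == get_id(a)
```

This is exactly why idxMap is keyed by the value itself rather than by any serialized form of it.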
Solution Approach

The solution implements memoization using a closure pattern with two Map data structures for efficient caching.

Data Structures Used:
- idxMap (Map<any, number>): maps each unique parameter value to a unique index number
- cache (Map<string, any>): stores the computed results with composite keys

Implementation Walkthrough:

1. Index Assignment Function (getIdx):

```typescript
const getIdx = (obj: any): number => {
    if (!idxMap.has(obj)) {
        idxMap.set(obj, idxMap.size);
    }
    return idxMap.get(obj)!;
};
```

   - Checks whether the parameter value already has an assigned index
   - If not, assigns a new index equal to the current size of idxMap
   - Returns the index for this parameter value
   - This ensures each unique value (by === comparison) gets a consistent index

2. Memoized Function Creation:

```typescript
return function (...params: any) {
    const key = params.map(getIdx).join(',');
    if (!cache.has(key)) {
        cache.set(key, fn(...params));
    }
    return cache.get(key)!;
};
```

   - Takes any number of parameters using rest syntax ...params
   - Generates a cache key by mapping each parameter to its index (params.map(getIdx)) and joining the indices with commas (.join(',')); for example, [2, 3] might become "0,1"
   - Checks if this key exists in the cache
   - If not cached, executes fn(...params) and stores the result
   - Returns the cached result

Why This Works:
- The closure preserves access to idxMap and cache across function calls
- Using a Map instead of a plain object allows any type as keys (including objects and functions)
- The index-based key generation ensures:
  - Same parameters → same key → cached result returned
  - Different parameters → different key → function executed
- The === equality is naturally handled since Map keys use SameValueZero equality (identical to === except that NaN is treated as equal to itself)

Time & Space Complexity:
- Time: O(n) per call, where n is the number of parameters (for key generation)
- Space: O(m × n), where m is the number of unique parameter combinations cached and n is the average number of parameters
Example Walkthrough
Let's walk through a concrete example with a simple sum function:

```javascript
function sum(a, b) {
    console.log("Computing sum...");
    return a + b;
}

const memoizedSum = memoize(sum);
```

Initial State:
- idxMap: empty Map
- cache: empty Map
First Call: memoizedSum(2, 3)
- Parameters: [2, 3]
- Generate cache key:
  - Process parameter 2: not in idxMap, so add it ({2 → 0}) and return index 0
  - Process parameter 3: not in idxMap, so add it ({2 → 0, 3 → 1}) and return index 1
  - Join indices: "0,1"
- Check cache with key "0,1":
  - Not found in cache
  - Execute sum(2, 3) → prints "Computing sum...", returns 5
  - Store in cache: {"0,1" → 5}
- Return: 5
Second Call: memoizedSum(2, 3)
- Parameters: [2, 3]
- Generate cache key:
  - Process parameter 2: found in idxMap at index 0
  - Process parameter 3: found in idxMap at index 1
  - Join indices: "0,1"
- Check cache with key "0,1":
  - Found! Value is 5
  - No function execution (no console output)
- Return: 5
Third Call: memoizedSum(3, 2) (different order)
- Parameters: [3, 2]
- Generate cache key:
  - Parameter 3 → index 1 (already in idxMap)
  - Parameter 2 → index 0 (already in idxMap)
  - Join indices: "1,0" (different from "0,1")
- Check cache with key "1,0":
  - Not found in cache
  - Execute sum(3, 2) → prints "Computing sum...", returns 5
  - Store in cache: {"0,1" → 5, "1,0" → 5}
- Return: 5

Final State:
- idxMap: {2 → 0, 3 → 1}
- cache: {"0,1" → 5, "1,0" → 5}
This example demonstrates how:
- Each unique parameter value gets a consistent index
- The same parameters in the same order produce the same cache key
- Different parameter orders create different cache keys (as required by === comparison)
- The original function only executes when there's a cache miss
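The whole walkthrough can be reproduced with a condensed Python port of the same two-map approach (call_count is our own instrumentation; Python dict keys use hashing rather than reference identity, so this mirrors === only for primitives like the numbers used here):

```python
def memoize(fn):
    """Memoize fn, treating arguments as identical only when they map
    to the same index (the article's idxMap/cache approach)."""
    idx_map = {}  # argument -> unique index
    cache = {}    # "0,1"-style key -> cached result

    def get_idx(arg):
        if arg not in idx_map:
            idx_map[arg] = len(idx_map)
        return idx_map[arg]

    def memoized(*params):
        key = ','.join(str(get_idx(p)) for p in params)
        if key not in cache:
            cache[key] = fn(*params)
        return cache[key]

    return memoized

call_count = 0

def add(a, b):
    global call_count
    call_count += 1
    return a + b

memoized_sum = memoize(add)

assert memoized_sum(2, 3) == 5  # cache miss: executes add, key "0,1"
assert memoized_sum(2, 3) == 5  # cache hit: add is not executed again
assert call_count == 1
assert memoized_sum(3, 2) == 5  # key "1,0" differs from "0,1": executes again
assert call_count == 2
```

Note how reversing the argument order triggers a second execution, exactly as in the third call above.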
Solution Implementation
```python
from typing import Any, Callable, Dict


def memoize(fn: Callable) -> Callable:
    """
    Creates a memoized version of the provided function.
    The memoized function caches results based on the arguments passed.

    Args:
        fn: The function to be memoized

    Returns:
        A memoized version of the input function
    """
    # Dictionary to store unique identifiers for each unique object/primitive argument
    argument_index_map: Dict[Any, int] = {}

    # Dictionary to store cached results with serialized argument keys
    result_cache: Dict[str, Any] = {}

    def get_argument_index(argument: Any) -> int:
        """
        Gets or creates a unique index for an argument value.
        This allows handling of object references as distinct cache keys.

        Args:
            argument: The argument to get an index for

        Returns:
            The unique index for this argument
        """
        # Use the value itself as the key when hashable, otherwise fall back to id()
        try:
            key = argument
            # Try to hash the argument to check if it's hashable
            hash(argument)
        except TypeError:
            # For unhashable types (like lists, dicts), use the object id
            key = id(argument)

        # If this argument hasn't been seen before, assign it a new index
        if key not in argument_index_map:
            argument_index_map[key] = len(argument_index_map)

        # Return the index for this argument
        return argument_index_map[key]

    def memoized_function(*params: Any) -> Any:
        """
        The memoized wrapper function that caches results.

        Args:
            *params: Variable arguments passed to the original function

        Returns:
            The result of the function call (either cached or newly computed)
        """
        # Create a unique cache key by mapping each parameter to its index and joining
        cache_key: str = ','.join(str(get_argument_index(param)) for param in params)

        # If the result is not cached, compute and store it
        if cache_key not in result_cache:
            result_cache[cache_key] = fn(*params)

        # Return the cached result
        return result_cache[cache_key]

    # Return the memoized function
    return memoized_function


# Example usage:
# call_count = 0
# def add(a, b):
#     global call_count
#     call_count += 1
#     return a + b
#
# memoized_fn = memoize(add)
# print(memoized_fn(2, 3))  # 5
# print(memoized_fn(2, 3))  # 5 (retrieved from cache, function not called again)
# print(call_count)         # 1 (function was only called once)
```
```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

/**
 * Memoization utility class that provides caching functionality for functions.
 */
public class Memoizer {

    /**
     * Creates a memoized version of the provided function.
     * The memoized function caches results based on the arguments passed.
     *
     * @param fn - The function to be memoized
     * @return A memoized version of the input function
     */
    public static Function<Object[], Object> memoize(Function<Object[], Object> fn) {
        // Map to store unique identifiers for each unique object/primitive argument
        final Map<Object, Integer> argumentIndexMap = new HashMap<>();

        // Map to store cached results with serialized argument keys
        final Map<String, Object> resultCache = new HashMap<>();

        // Return the memoized function
        return new Function<Object[], Object>() {
            @Override
            public Object apply(Object[] params) {
                // Create a unique cache key by mapping each parameter to its index and joining
                String cacheKey = createCacheKey(params);

                // If the result is not cached, compute and store it
                if (!resultCache.containsKey(cacheKey)) {
                    resultCache.put(cacheKey, fn.apply(params));
                }

                // Return the cached result
                return resultCache.get(cacheKey);
            }

            /**
             * Creates a unique cache key for the given arguments.
             *
             * @param params - The function arguments
             * @return A string representation of the cache key
             */
            private String createCacheKey(Object[] params) {
                StringBuilder keyBuilder = new StringBuilder();

                for (int i = 0; i < params.length; i++) {
                    if (i > 0) {
                        keyBuilder.append(',');
                    }
                    keyBuilder.append(getArgumentIndex(params[i]));
                }

                return keyBuilder.toString();
            }

            /**
             * Gets or creates a unique index for an argument value.
             * This allows handling of object references as distinct cache keys.
             *
             * @param argument - The argument to get an index for
             * @return The unique index for this argument
             */
            private int getArgumentIndex(Object argument) {
                // If this argument hasn't been seen before, assign it a new index
                if (!argumentIndexMap.containsKey(argument)) {
                    argumentIndexMap.put(argument, argumentIndexMap.size());
                }
                // Return the index for this argument
                return argumentIndexMap.get(argument);
            }
        };
    }

    /**
     * Example usage:
     *
     * public static void main(String[] args) {
     *     final int[] callCount = {0};
     *
     *     Function<Object[], Object> memoizedFn = memoize(params -> {
     *         callCount[0]++;
     *         return (Integer) params[0] + (Integer) params[1];
     *     });
     *
     *     System.out.println(memoizedFn.apply(new Object[]{2, 3})); // 5
     *     System.out.println(memoizedFn.apply(new Object[]{2, 3})); // 5 (retrieved from cache)
     *     System.out.println("Call count: " + callCount[0]); // 1 (function was only called once)
     * }
     */
}
```
```cpp
#include <unordered_map>
#include <vector>
#include <functional>
#include <string>
#include <any>
#include <sstream>
#include <memory>

// Type definition for a function with any parameters and any return type
using Fn = std::function<std::any(const std::vector<std::any>&)>;

/**
 * Creates a memoized version of the provided function.
 * The memoized function caches results based on the arguments passed.
 *
 * Note: std::any values cannot be compared generically, so this version keys
 * arguments by the address of the std::any object. Two calls therefore hit
 * the same cache entry only when the same argument vector is reused, as in
 * the example below.
 *
 * @param fn - The function to be memoized
 * @returns A memoized version of the input function
 */
Fn memoize(Fn fn) {
    struct MemoState {
        // Map to store unique identifiers for each unique argument address
        std::unordered_map<const std::any*, int> argumentIndexMap;

        // Map to store cached results with serialized argument keys
        std::unordered_map<std::string, std::any> resultCache;

        // Counter for assigning unique indices
        int nextIndex = 0;
    };

    // Shared pointer to maintain state across lambda copies
    auto state = std::make_shared<MemoState>();

    // Gets or creates a unique index for an argument address.
    auto getArgumentIndex = [state](const std::any* argument) -> int {
        // If this argument hasn't been seen before, assign it a new index
        auto it = state->argumentIndexMap.find(argument);
        if (it == state->argumentIndexMap.end()) {
            state->argumentIndexMap[argument] = state->nextIndex++;
            return state->argumentIndexMap[argument];
        }
        // Return the index for this argument
        return it->second;
    };

    // Return the memoized function
    return [fn, state, getArgumentIndex](const std::vector<std::any>& params) -> std::any {
        // Create a unique cache key by mapping each parameter to its index and joining
        std::stringstream keyStream;
        for (size_t i = 0; i < params.size(); ++i) {
            if (i > 0) {
                keyStream << ",";
            }
            keyStream << getArgumentIndex(&params[i]);
        }
        std::string cacheKey = keyStream.str();

        // If the result is not cached, compute and store it
        auto it = state->resultCache.find(cacheKey);
        if (it == state->resultCache.end()) {
            state->resultCache[cacheKey] = fn(params);
            return state->resultCache[cacheKey];
        }

        // Return the cached result
        return it->second;
    };
}

/**
 * Example usage:
 * int callCount = 0;
 * auto memoizedFn = memoize([&callCount](const std::vector<std::any>& params) -> std::any {
 *     callCount += 1;
 *     int a = std::any_cast<int>(params[0]);
 *     int b = std::any_cast<int>(params[1]);
 *     return a + b;
 * });
 *
 * std::vector<std::any> args = {2, 3};
 * auto result1 = memoizedFn(args);  // 5
 * auto result2 = memoizedFn(args);  // 5 (retrieved from cache, function not called again)
 * std::cout << callCount << std::endl;  // 1 (function was only called once)
 */
```
```typescript
// Type definition for a function with any parameters and any return type
type Fn = (...params: any) => any;

/**
 * Creates a memoized version of the provided function.
 * The memoized function caches results based on the arguments passed.
 *
 * @param fn - The function to be memoized
 * @returns A memoized version of the input function
 */
function memoize(fn: Fn): Fn {
    // Map to store unique identifiers for each unique object/primitive argument
    const argumentIndexMap: Map<any, number> = new Map();

    // Map to store cached results with serialized argument keys
    const resultCache: Map<string, any> = new Map();

    /**
     * Gets or creates a unique index for an argument value.
     * This allows handling of object references as distinct cache keys.
     *
     * @param argument - The argument to get an index for
     * @returns The unique index for this argument
     */
    const getArgumentIndex = (argument: any): number => {
        // If this argument hasn't been seen before, assign it a new index
        if (!argumentIndexMap.has(argument)) {
            argumentIndexMap.set(argument, argumentIndexMap.size);
        }
        // Return the index for this argument
        return argumentIndexMap.get(argument)!;
    };

    // Return the memoized function
    return function (...params: any): any {
        // Create a unique cache key by mapping each parameter to its index and joining
        const cacheKey: string = params.map(getArgumentIndex).join(',');

        // If the result is not cached, compute and store it
        if (!resultCache.has(cacheKey)) {
            resultCache.set(cacheKey, fn(...params));
        }

        // Return the cached result
        return resultCache.get(cacheKey)!;
    };
}

/**
 * Example usage:
 * let callCount = 0;
 * const memoizedFn = memoize(function (a, b) {
 *     callCount += 1;
 *     return a + b;
 * })
 * memoizedFn(2, 3) // 5
 * memoizedFn(2, 3) // 5 (retrieved from cache, function not called again)
 * console.log(callCount) // 1 (function was only called once)
 */
```
Time and Space Complexity

Time Complexity:
- For the getIdx function: O(1) average case for Map operations (has, set, and get)
- For the memoized function call:
  - First call with specific arguments: O(n), where n is the number of parameters
    - Creating the key requires mapping over n parameters: O(n)
    - Each getIdx call is O(1) average
    - Joining the array into a string: O(n)
    - Checking the cache and setting the value: O(1) average for Map operations
    - Plus the time complexity of the original function fn
  - Subsequent calls with the same arguments: O(n) for key generation, but the original function is not executed
- Overall: O(n + T) for the first call, where T is the time complexity of the original function, and O(n) for cached calls

Space Complexity:
- idxMap: stores up to U unique parameter values across all function calls, where U is the total number of unique objects/primitives passed as arguments: O(U)
- cache: stores up to C entries, where C is the number of unique argument combinations: O(C × R), where R is the average space for storing return values
- Key strings: each key string takes O(n × d) space, where n is the number of parameters and d is the average number of digits in the indices
- Overall: O(U + C × (n × d + R)), where:
  - U = number of unique parameter values
  - C = number of unique argument combinations
  - n = number of parameters per function call
  - d = average digits in indices
  - R = average space for return values
Common Pitfalls

1. Handling Unhashable Arguments

The Python implementation attempts to handle unhashable types (like lists or dictionaries) by using id(), but this creates a critical flaw: two lists with identical content will be treated as different arguments because they have different memory addresses.

Problem Example:

```python
def sum_list(lst):
    return sum(lst)

memoized_sum = memoize(sum_list)
print(memoized_sum([1, 2, 3]))  # Computes: 6
print(memoized_sum([1, 2, 3]))  # Computes again: 6 (not cached!)
```

Solution: Convert unhashable types to hashable representations:

```python
def get_argument_index(argument: Any) -> int:
    try:
        # For lists, convert to a tuple (hashable)
        if isinstance(argument, list):
            key = tuple(argument)
        elif isinstance(argument, dict):
            # Convert the dict to a sorted tuple of items
            key = tuple(sorted(argument.items()))
        elif isinstance(argument, set):
            key = frozenset(argument)
        else:
            key = argument
        hash(key)  # Test if hashable
    except TypeError:
        # Fall back to id() only for truly unhashable objects
        key = id(argument)

    if key not in argument_index_map:
        argument_index_map[key] = len(argument_index_map)
    return argument_index_map[key]
```
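Folding that conversion into a complete memoizer shows the difference in behavior (value_memoize, to_key, and the call counter are our own names for this self-contained sketch, not part of the article's solution):

```python
def value_memoize(fn):
    """Memoize by *value* for common containers, per the fix above."""
    index_map = {}
    cache = {}

    def to_key(argument):
        # Convert common unhashable containers to hashable equivalents
        if isinstance(argument, list):
            return tuple(argument)
        if isinstance(argument, dict):
            return tuple(sorted(argument.items()))
        if isinstance(argument, set):
            return frozenset(argument)
        return argument

    def get_index(argument):
        try:
            key = to_key(argument)
            hash(key)
        except TypeError:
            key = id(argument)  # last resort for truly unhashable objects
        if key not in index_map:
            index_map[key] = len(index_map)
        return index_map[key]

    def memoized(*params):
        cache_key = ','.join(str(get_index(p)) for p in params)
        if cache_key not in cache:
            cache[cache_key] = fn(*params)
        return cache[cache_key]

    return memoized

calls = 0

def sum_list(lst):
    global calls
    calls += 1
    return sum(lst)

memoized_sum = value_memoize(sum_list)
assert memoized_sum([1, 2, 3]) == 6
assert memoized_sum([1, 2, 3]) == 6  # equal content now hits the cache
assert calls == 1
```

Note that this deliberately trades away the strict === semantics of the original problem: two equal-content lists are now treated as the same input, which is usually what callers want in Python.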
2. Memory Leaks with Unbounded Cache Growth

The cache has no size limit and will grow indefinitely, potentially causing memory issues in long-running applications.

Problem Example:

```python
# This will create a million cache entries
for i in range(1000000):
    memoized_fn(i, i + 1)  # Each unique pair creates a new cache entry
```

Solution: Implement an LRU (Least Recently Used) cache with a size limit:

```python
from collections import OrderedDict

def memoize(fn: Callable, max_size: int = 128) -> Callable:
    cache = OrderedDict()

    def memoized_function(*params: Any) -> Any:
        cache_key = create_cache_key(params)
        if cache_key in cache:
            # Move to end (most recently used)
            cache.move_to_end(cache_key)
            return cache[cache_key]
        result = fn(*params)
        cache[cache_key] = result
        # Remove the oldest entry if the cache exceeds max size
        if len(cache) > max_size:
            cache.popitem(last=False)
        return result

    return memoized_function
```
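For hashable arguments, Python's standard library already provides this behavior as functools.lru_cache, which is usually preferable to a hand-rolled LRU. A quick demonstration of its eviction order:

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=2)  # keep only the 2 most recently used results
def add(a, b):
    global calls
    calls += 1
    return a + b

assert add(1, 1) == 2  # miss
assert add(2, 2) == 4  # miss
assert add(1, 1) == 2  # hit: still cached, and now most recently used
assert calls == 2
assert add(3, 3) == 6  # miss: evicts (2, 2), the least recently used entry
assert add(2, 2) == 4  # miss again after eviction
assert calls == 4
```

Note that lru_cache keys on argument equality (==), not identity, so like the pitfall-1 fix it departs from strict === semantics.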
3. Nested Mutable Arguments

The solution doesn't handle nested mutable structures properly. Changes to nested objects after caching can lead to incorrect results.

Problem Example:

```python
def process_data(config):
    return config['value'] * 2

memoized_process = memoize(process_data)
config = {'value': 5}
print(memoized_process(config))  # 10
config['value'] = 10             # Modifying the object
print(memoized_process(config))  # Still returns 10 (incorrect!)
```

Solution: Create deep copies of mutable arguments or use immutable data structures:

```python
import copy

def get_argument_index(argument: Any) -> int:
    # Create an immutable representation for mutable types
    if isinstance(argument, (list, dict, set)):
        # Create a deep copy and convert it to an immutable form
        key = str(copy.deepcopy(argument))
    else:
        key = argument
    # Rest of the implementation...
```
4. Function Arguments with Default Values

Memoization doesn't account for default parameter values, treating explicit and default arguments differently.

Problem Example:

```python
def multiply(a, b=1):
    return a * b

memoized_multiply = memoize(multiply)
print(memoized_multiply(5))     # Calls the function: 5
print(memoized_multiply(5, 1))  # Calls the function again: 5 (should be cached!)
```

Solution: Normalize arguments by including default values:

```python
import inspect

def memoize(fn: Callable) -> Callable:
    sig = inspect.signature(fn)

    def memoized_function(*args, **kwargs):
        # Bind arguments to get the normalized form with defaults applied
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        # Create a cache key from all parameters (including defaults)
        cache_key = str(bound.arguments)
        # Rest of caching logic...
```
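A complete runnable version of this normalization (memoize_with_defaults is our own name, and the caching logic elided above is filled in here as a plain dict) shows the two call styles collapsing to one cache entry:

```python
import inspect

def memoize_with_defaults(fn):
    """Memoize fn, applying default parameter values before keying the cache."""
    sig = inspect.signature(fn)
    cache = {}

    def memoized(*args, **kwargs):
        # Normalize: explicit, default, and keyword arguments all
        # produce the same binding and therefore the same key
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        cache_key = str(bound.arguments)
        if cache_key not in cache:
            cache[cache_key] = fn(*bound.args, **bound.kwargs)
        return cache[cache_key]

    return memoized

calls = 0

def multiply(a, b=1):
    global calls
    calls += 1
    return a * b

memoized_multiply = memoize_with_defaults(multiply)
assert memoized_multiply(5) == 5       # computes multiply(5, 1)
assert memoized_multiply(5, 1) == 5    # same normalized key: cache hit
assert memoized_multiply(5, b=1) == 5  # keyword form normalizes too
assert calls == 1
```

Using str(bound.arguments) as the key is a simplification that works for arguments with stable repr; a production version would combine this with the index-based keying from the main solution.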