10+ Must-know JavaScript Coding Interview Questions

10+ essential JavaScript coding interview questions with solutions, organized by interview level (junior, mid, senior). Curated by senior engineers and former interviewers from leading tech companies.
Author: GreatFrontEnd Team · 19 min read · Mar 28, 2025

These 18 questions cover what front-end interviewers ask in JavaScript coding rounds, from junior screens to senior onsites. Every question includes a full working implementation you can study and adapt, plus a link to a practice page on GreatFrontEnd where you can solve it in our IDE with test cases.

What gets asked at which level

Different interview rounds reach for different questions. Use this table to plan what to focus on first based on the role you're targeting.

| Level | What you'll be asked | Questions in this guide |
| --- | --- | --- |
| Junior / new grad | Foundational JS, array methods, basic timing utilities | 1-6: Debounce, Throttle, Array.map, Array.filter, Array.reduce, Flatten |
| Mid-level | Object manipulation, real production patterns, common utilities | 7-13: Deep Equal, Deep Clone, Data Merging, classnames, Once, Memoize, Get |
| Senior / staff | Async patterns, OOP design, functional composition, DOM internals | 14-18: Event Emitter, Promise.all, Promise.allSettled, Curry, getElementsByClassName |

Junior questions still come up in senior rounds. The bar shifts to whether you can discuss edge cases (leading vs trailing edge, memory leaks, async ordering) rather than just produce working code.


Junior level

1. Debounce

Debouncing delays a function's execution until a certain amount of time has passed since the last call. Every new call resets the timer. The function ultimately runs once, after the burst of calls has stopped.

The classic use case is a search input: you only want to fire a request once the user pauses typing, not on every keystroke.

function debounce(fn, wait) {
  let timeoutId;
  return function (...args) {
    clearTimeout(timeoutId);
    timeoutId = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Usage
const handleSearch = debounce((query) => {
  console.log('Search:', query);
}, 300);

document.getElementById('search').addEventListener('input', (e) => {
  handleSearch(e.target.value);
});

The key details to mention if asked: forwarding this and args correctly so the wrapped function's call site behaves normally; exposing a cancel() method for React unmount cleanup; and the difference between leading-edge and trailing-edge debounce (Lodash defaults to trailing only).

Practice implementing Debounce on GreatFrontEnd →

2. Throttle

Throttling caps how often a function can run — at most once per time interval, regardless of how many times it's called. Calls during the cooldown are dropped (or in better implementations, coalesced into a trailing call).

Throttle is the right tool for scroll, mousemove, resize — events where you want regular updates during a continuous interaction.

function throttle(fn, limit) {
  let inThrottle = false;
  let lastArgs = null;
  return function (...args) {
    if (!inThrottle) {
      fn.apply(this, args);
      inThrottle = true;
      setTimeout(() => {
        inThrottle = false;
        if (lastArgs) {
          fn.apply(this, lastArgs);
          lastArgs = null;
        }
      }, limit);
    } else {
      lastArgs = args;
    }
  };
}

// Usage
const handleScroll = throttle(() => {
  console.log('Scroll position:', window.scrollY);
}, 100);

window.addEventListener('scroll', handleScroll);

For anything that paints to the screen, mention requestAnimationFrame throttling instead. rAF aligns with the browser's paint cycle and stops firing entirely in background tabs. setTimeout doesn't align with paint, and while modern browsers throttle it in background tabs (Chrome and Firefox cap it at roughly 1Hz), it keeps firing.
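A sketch of what rAF-based throttling can look like (rafThrottle is an illustrative name; it assumes a browser environment where requestAnimationFrame and cancelAnimationFrame exist, and onScroll is a hypothetical handler):

```javascript
function rafThrottle(fn) {
  let rafId = null;
  let lastArgs = null;
  function throttled(...args) {
    lastArgs = args; // Always keep the most recent arguments.
    if (rafId === null) {
      rafId = requestAnimationFrame(() => {
        rafId = null;
        fn.apply(this, lastArgs); // Fires at most once per frame.
      });
    }
  }
  throttled.cancel = () => {
    if (rafId !== null) {
      cancelAnimationFrame(rafId);
      rafId = null;
    }
  };
  return throttled;
}

// Usage (browser): window.addEventListener('scroll', rafThrottle(onScroll));
```

Because only the latest arguments are kept, the handler always paints from the freshest event when the frame arrives.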

Practice implementing Throttle on GreatFrontEnd →

3. Array.prototype.map

map returns a new array of the same length, with each element transformed by the callback. It's the most-used array method in modern JavaScript and a common ask for "implement a polyfill for X" interview rounds.

Array.prototype.myMap = function (callback, thisArg) {
  const result = new Array(this.length);
  for (let i = 0; i < this.length; i++) {
    if (i in this) {
      result[i] = callback.call(thisArg, this[i], i, this);
    }
  }
  return result;
};

// Usage
[1, 2, 3].myMap((x) => x * 2); // [2, 4, 6]

Two details that separate junior from senior answers: handling sparse arrays correctly (the if (i in this) check skips holes, matching native behavior), and accepting the optional thisArg second parameter that controls the callback's this.
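The native map shows the hole-skipping behavior the i in this check is mirroring, and it's easy to verify in a console:

```javascript
const sparse = [1, , 3]; // hole at index 1
const doubled = sparse.map((x) => x * 2);

console.log(doubled.length); // 3, length is preserved
console.log(1 in doubled); // false, the hole survives the mapping
```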

Practice implementing Array.prototype.map on GreatFrontEnd →

4. Array.prototype.filter

filter returns a new array containing only the elements for which the callback returns truthy. Like map, it preserves order and skips sparse-array holes.

Array.prototype.myFilter = function (callback, thisArg) {
  const result = [];
  for (let i = 0; i < this.length; i++) {
    if (i in this && callback.call(thisArg, this[i], i, this)) {
      result.push(this[i]);
    }
  }
  return result;
};

// Usage
[1, 2, 3, 4].myFilter((x) => x % 2 === 0); // [2, 4]

A classic interview follow-up: "implement filter in terms of reduce." Worth knowing — it tests whether you understand how the array methods relate.

function filterViaReduce(arr, fn) {
  return arr.reduce((acc, x, i) => (fn(x, i, arr) ? [...acc, x] : acc), []);
}

Practice implementing Array.prototype.filter on GreatFrontEnd →

5. Array.prototype.reduce

reduce applies a function against an accumulator and each element to reduce the array to a single value. It's the most general of the array methods — map and filter can both be expressed in terms of reduce.

Array.prototype.myReduce = function (callback, initialValue) {
  let accumulator;
  let startIndex;
  if (arguments.length >= 2) {
    accumulator = initialValue;
    startIndex = 0;
  } else {
    if (this.length === 0) {
      throw new TypeError('Reduce of empty array with no initial value');
    }
    accumulator = this[0];
    startIndex = 1;
  }
  for (let i = startIndex; i < this.length; i++) {
    if (i in this) {
      accumulator = callback(accumulator, this[i], i, this);
    }
  }
  return accumulator;
};

// Usage
[1, 2, 3, 4].myReduce((sum, x) => sum + x, 0); // 10

Common interview gotchas to mention: the difference between providing and omitting the initial value (without it, the first element becomes the initial accumulator and iteration starts from index 1); the four parameters the callback receives (accumulator, current value, index, array); and the TypeError thrown when reducing an empty array with no initial value.

Practice implementing Array.prototype.reduce on GreatFrontEnd →

6. Flatten

Flatten converts a nested array into a single-level array. Modern JavaScript provides Array.prototype.flat(depth) for this, but implementing it from scratch is a recurring interview ask.

function flatten(arr) {
  return arr.reduce(
    (acc, val) => acc.concat(Array.isArray(val) ? flatten(val) : val),
    [],
  );
}

// Usage
flatten([1, [2, [3, [4, [5]]]]]); // [1, 2, 3, 4, 5]

For production code, use arr.flat(Infinity) — it's more readable and handles edge cases. For the interview, the recursive reduce version above is the clean answer. A senior follow-up to be ready for: "what if the nesting is so deep that recursion overflows the stack?" Answer: use an iterative stack-based approach (push a copy of the array onto a stack, then pop and expand sub-arrays in place). Avoid arr.shift() for this — it's O(n) per call, which makes the whole traversal O(n²).
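A sketch of that stack-based approach (flattenIterative is an illustrative name):

```javascript
function flattenIterative(arr) {
  const stack = [...arr]; // Copy so the input isn't mutated.
  const result = [];
  while (stack.length > 0) {
    const item = stack.pop();
    if (Array.isArray(item)) {
      stack.push(...item); // Expand one level; deeper levels wait on the stack.
    } else {
      result.push(item);
    }
  }
  // pop() consumes right to left, so restore the original order.
  return result.reverse();
}

flattenIterative([1, [2, [3, [4, [5]]]]]); // [1, 2, 3, 4, 5]
```

One caveat worth voicing in the interview: stack.push(...item) still spreads onto the call stack, so for pathologically wide sub-arrays you'd push elements in a loop instead.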

Practice implementing Flatten on GreatFrontEnd →


Mid-level

7. Deep Equal

Deep equal recursively compares two values for structural equality. Unlike ===, which compares object references, deep equal walks both structures and compares values at every depth.

function deepEqual(a, b) {
  if (a === b) return true;
  if (a == null || b == null) return false;
  if (typeof a !== 'object' || typeof b !== 'object') return false;
  if (Array.isArray(a) !== Array.isArray(b)) return false;
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  for (const key of keysA) {
    if (!keysB.includes(key) || !deepEqual(a[key], b[key])) return false;
  }
  return true;
}

// Usage
deepEqual({ a: 1, b: { c: 2 } }, { a: 1, b: { c: 2 } }); // true
deepEqual([1, [2, 3]], [1, [2, 3]]); // true

The basic version above handles the common case. Senior follow-ups: handling Date, RegExp, Map, Set (each needs a type-specific comparison), NaN (should be considered equal to itself, even though NaN === NaN is false), and circular references (track visited pairs in a WeakMap to avoid infinite recursion).
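A sketch of how those follow-ups slot into the basic version (deepEqualPlus is an illustrative name; Map and Set handling is omitted for brevity):

```javascript
function deepEqualPlus(a, b, seen = new WeakMap()) {
  if (Object.is(a, b)) return true; // Object.is(NaN, NaN) is true.
  if (a === null || b === null) return false;
  if (typeof a !== 'object' || typeof b !== 'object') return false;
  if (a instanceof Date || b instanceof Date) {
    return a instanceof Date && b instanceof Date && a.getTime() === b.getTime();
  }
  if (Array.isArray(a) !== Array.isArray(b)) return false;
  // Circular references: if this pair is already being compared, treat it as equal.
  if (seen.get(a) === b) return true;
  seen.set(a, b);
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every(
    (key) =>
      Object.prototype.hasOwnProperty.call(b, key) &&
      deepEqualPlus(a[key], b[key], seen),
  );
}
```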

Practice implementing Deep Equal on GreatFrontEnd →

8. Deep Clone

Deep clone creates a fully independent copy of a value, where modifying the clone never affects the original at any depth.

function deepClone(value) {
  if (value === null || typeof value !== 'object') return value;
  if (Array.isArray(value)) {
    return value.map(deepClone);
  }
  const result = {};
  for (const key of Object.keys(value)) {
    result[key] = deepClone(value[key]);
  }
  return result;
}

// Usage
const original = { a: 1, nested: { b: 2 } };
const copy = deepClone(original);
copy.nested.b = 99;
console.log(original.nested.b); // 2 (unaffected)

For production code in modern environments, prefer the built-in structuredClone(value) — it handles Date, RegExp, Map, Set, ArrayBuffer, typed arrays, and circular references correctly. It throws a DataCloneError on functions, Symbols, and most DOM nodes. It also silently strips custom prototypes, so a class instance comes back as a plain object with the own properties copied.

For the interview, the recursive version above is the standard answer. Be ready to discuss why JSON.parse(JSON.stringify(...)) is a common but buggy alternative — it drops undefined and functions, turns Date into a string, turns NaN and Infinity into null, and throws on circular references.
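Those JSON round-trip pitfalls are easy to demonstrate in a few lines:

```javascript
const source = {
  when: new Date('2025-01-01'),
  count: NaN,
  greet() {
    return 'hi';
  },
  missing: undefined,
};
const jsonCopy = JSON.parse(JSON.stringify(source));

console.log(typeof jsonCopy.when); // 'string': the Date became an ISO string
console.log(jsonCopy.count); // null: NaN is not representable in JSON
console.log('greet' in jsonCopy); // false: functions are dropped
console.log('missing' in jsonCopy); // false: undefined properties are dropped
```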

Practice implementing Deep Clone on GreatFrontEnd →

9. Data Merging

Merging combines multiple objects (or arrays) into one. JavaScript provides several built-in approaches; the right choice depends on whether you need shallow or deep merging.

// Shallow merge with spread (modern, idiomatic)
const merged = { ...obj1, ...obj2 }; // obj2's keys overwrite obj1's

// Shallow merge with Object.assign (older API, same result)
const merged2 = Object.assign({}, obj1, obj2);

// Deep merge — recursive
function deepMerge(target, source) {
  // typeof null is 'object', so reject null explicitly before recursing.
  if (
    target === null ||
    source === null ||
    typeof target !== 'object' ||
    typeof source !== 'object'
  ) {
    return source;
  }
  const result = { ...target };
  for (const key of Object.keys(source)) {
    if (
      result[key] &&
      typeof result[key] === 'object' &&
      !Array.isArray(result[key]) &&
      typeof source[key] === 'object' &&
      !Array.isArray(source[key])
    ) {
      result[key] = deepMerge(result[key], source[key]);
    } else {
      result[key] = source[key];
    }
  }
  return result;
}

// Usage
deepMerge({ a: 1, b: { x: 10, y: 20 } }, { b: { y: 30, z: 40 }, c: 3 });
// { a: 1, b: { x: 10, y: 30, z: 40 }, c: 3 }

The interview gotcha is array merging — should [1, 2] merged with [3, 4] produce [1, 2, 3, 4] (concatenation) or [3, 4] (replacement)? There's no universally correct answer; ask the interviewer for the contract before coding.

Practice implementing Data Merging on GreatFrontEnd →

10. classnames

Every React project (and most projects in other frameworks) needs a utility for conditionally combining CSS class names. The classnames package on npm is the de facto standard, and "implement classnames" is a frequent interview question.

function classnames(...args) {
  const classes = [];
  for (const arg of args) {
    if (!arg) continue;
    if (typeof arg === 'string' || typeof arg === 'number') {
      classes.push(arg);
    } else if (Array.isArray(arg)) {
      const inner = classnames(...arg);
      if (inner) classes.push(inner);
    } else if (typeof arg === 'object') {
      for (const key of Object.keys(arg)) {
        if (arg[key]) classes.push(key);
      }
    }
  }
  return classes.join(' ');
}

// Usage
classnames('btn', 'btn-primary'); // 'btn btn-primary'
classnames('btn', { 'btn-disabled': isDisabled }); // 'btn' or 'btn btn-disabled'
classnames('btn', ['btn-large', { active: isActive }]); // 'btn btn-large active'
classnames(null, undefined, false, '', 0, 'btn'); // 'btn'

All falsy values (null, undefined, false, '', 0, NaN) are skipped, which matches the published classnames package — its top-of-loop if (!arg) continue filters them out uniformly. If the interviewer wants 0 treated as a valid class, swap the falsy guard for an explicit nullish/empty check.

Practice implementing classnames on GreatFrontEnd →

11. Once

once wraps a function so it only ever runs the first time it's called. Subsequent calls return the cached first-call result without invoking the wrapped function.

function once(fn) {
  let called = false;
  let result;
  return function (...args) {
    if (!called) {
      called = true;
      result = fn.apply(this, args);
    }
    return result;
  };
}

// Usage
const init = once(() => {
  console.log('Initializing...');
  return { timestamp: Date.now() };
});
init(); // Logs "Initializing..." and returns { timestamp: ... }
init(); // Returns the same object, doesn't log

A small question that covers a lot: closures (the called and result variables persist between calls), this/args forwarding, and why caching the result matters (so the second caller gets the same value, not undefined). Common follow-up: "make it so the wrapped function can be reset" — add a .reset() method that clears called and result.
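One way to sketch that reset variant (onceWithReset is an illustrative name):

```javascript
function onceWithReset(fn) {
  let called = false;
  let result;
  function wrapped(...args) {
    if (!called) {
      called = true;
      result = fn.apply(this, args);
    }
    return result;
  }
  wrapped.reset = () => {
    called = false;
    result = undefined; // Drop the cached value so it can be recomputed.
  };
  return wrapped;
}
```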

Practice implementing Once on GreatFrontEnd →

12. Memoize

Memoize wraps an expensive function and caches its results by argument, so repeated calls with the same input return the cached value instead of recomputing.

function memoize(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      return cache.get(key);
    }
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

// Usage
const slowSquare = (n) => {
  console.log('Computing', n);
  return n * n;
};
const fastSquare = memoize(slowSquare);
fastSquare(5); // Logs "Computing 5", returns 25
fastSquare(5); // Returns 25 from cache (no log)
fastSquare(6); // Logs "Computing 6", returns 36

Two interview gotchas worth raising before the interviewer does:

  1. JSON.stringify is a brittle key strategy — functions in arguments are dropped (replaced with null), circular references throw, and {a: 1, b: 2} and {b: 2, a: 1} produce different keys despite being structurally equivalent (modern engines preserve insertion order in stringification). For real production code, accept a custom resolver function (Lodash's API) or use a WeakMap keyed by the first argument when arguments are objects.
  2. Memoize leaks memory — the cache grows unbounded. Add an LRU eviction policy or a Map with a max size for any function called with many distinct inputs.
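A sketch of the resolver approach, loosely modeled on Lodash's memoize signature (memoizeWith is an illustrative name):

```javascript
function memoizeWith(fn, resolver = (...args) => args[0]) {
  const cache = new Map();
  function memoized(...args) {
    const key = resolver.apply(this, args);
    if (cache.has(key)) return cache.get(key);
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  }
  memoized.cache = cache; // Exposed so callers can inspect or evict entries.
  return memoized;
}

// A resolver that builds a stable string key from both arguments:
const area = memoizeWith((w, h) => w * h, (w, h) => `${w}x${h}`);
```

Exposing the cache also gives you a hook for the memory-leak discussion: the caller can clear it, or you can swap the Map for an LRU structure.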

Practice implementing Memoize on GreatFrontEnd →

13. Get (safely access nested properties)

get reads a property at a nested path on an object, returning a fallback if any part of the path is missing. Before optional chaining landed in JavaScript, this was one of Lodash's most reached-for utilities.

function get(obj, path, defaultValue) {
  const keys = Array.isArray(path) ? path : path.split('.');
  let result = obj;
  for (const key of keys) {
    if (result == null) return defaultValue;
    result = result[key];
  }
  return result === undefined ? defaultValue : result;
}

// Usage
const user = { profile: { name: 'Alice', address: { city: 'NYC' } } };
get(user, 'profile.name'); // 'Alice'
get(user, 'profile.address.zip'); // undefined
get(user, 'profile.address.zip', 'unknown'); // 'unknown'

Modern JavaScript usually doesn't need this — optional chaining (user?.profile?.address?.city ?? 'unknown') is more readable and built into the language. The interview value is showing you understand both the historical reason for the function and the modern equivalent. A common follow-up: "what if the path includes array indices like users[0].name?" — the split('.') approach above doesn't handle bracket notation; you'd need a regex split or a tokenizer for full Lodash compatibility.
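A sketch of the bracket-notation extension using a regex normalization pass (getWithBrackets is an illustrative name):

```javascript
function getWithBrackets(obj, path, defaultValue) {
  // Normalize 'users[0].name' into ['users', '0', 'name'].
  const keys = Array.isArray(path)
    ? path
    : path
        .replace(/\[(\d+)\]/g, '.$1')
        .split('.')
        .filter(Boolean);
  let result = obj;
  for (const key of keys) {
    if (result == null) return defaultValue;
    result = result[key];
  }
  return result === undefined ? defaultValue : result;
}

getWithBrackets({ users: [{ name: 'Alice' }] }, 'users[0].name'); // 'Alice'
```

This handles numeric indices only; Lodash's real path parser also supports quoted keys such as obj['a.b'], which needs a proper tokenizer.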

Practice implementing Get on GreatFrontEnd →


Senior level

14. Event Emitter

An EventEmitter is a publish-subscribe primitive: listeners subscribe to named events, and the emitter triggers all of an event's listeners when emit is called. Node.js has a built-in EventEmitter; browsers have EventTarget. Implementing one tests your grasp of OOP, closures, and array manipulation.

class EventEmitter {
  constructor() {
    this.events = new Map();
  }

  on(event, listener) {
    if (!this.events.has(event)) this.events.set(event, []);
    this.events.get(event).push(listener);
    return this;
  }

  off(event, listener) {
    const listeners = this.events.get(event);
    if (!listeners) return this;
    this.events.set(
      event,
      listeners.filter((l) => l !== listener),
    );
    return this;
  }

  emit(event, ...args) {
    const listeners = this.events.get(event);
    if (!listeners || listeners.length === 0) return false;
    // Snapshot so listeners that unsubscribe themselves don't break iteration.
    [...listeners].forEach((listener) => listener.apply(this, args));
    return true;
  }
}

// Usage
const emitter = new EventEmitter();
const handler = (data) => console.log('Got:', data);
emitter.on('message', handler);
emitter.emit('message', { text: 'hello' }); // Logs "Got: { text: 'hello' }"
emitter.off('message', handler);
emitter.emit('message', { text: 'world' }); // No log

The bug interviewers love to test is the listener-self-unsubscribe case — if a listener calls emitter.off(...) for itself inside emit, the array gets modified mid-iteration and the next listener is skipped. The [...listeners] snapshot above prevents it. Mention this without being asked.

Practice implementing Event Emitter on GreatFrontEnd →

15. Promise.all

Promise.all takes an iterable of promises and returns a promise that fulfills with all results in input order, or rejects on the first rejection.

function promiseAll(promises) {
  return new Promise((resolve, reject) => {
    const results = new Array(promises.length);
    let remaining = promises.length;
    if (remaining === 0) {
      resolve(results);
      return;
    }
    promises.forEach((promise, index) => {
      Promise.resolve(promise).then(
        (value) => {
          results[index] = value;
          remaining--;
          if (remaining === 0) resolve(results);
        },
        (reason) => reject(reason),
      );
    });
  });
}

// Usage
const results = await promiseAll([
  Promise.resolve(1),
  42,
  new Promise((r) => setTimeout(() => r('foo'), 100)),
]);
console.log(results); // [1, 42, 'foo']

A few specifics to mention even if not asked: Promise.all([]) resolves with [] (NOT rejects); non-promise values in the input array are wrapped via Promise.resolve and included in order; on first rejection, the other promises keep running but their results are discarded. JS promises themselves aren't cancellable — if you need to cancel the underlying work (a fetch, for example), pass an AbortController signal to it. For the full comparison with Promise.allSettled, see the next question.

Practice implementing Promise.all on GreatFrontEnd →

16. Promise.allSettled

Promise.allSettled resolves once all input promises have settled, with an array of { status, value } or { status, reason } objects. Unlike Promise.all, it never rejects — even when individual inputs reject.

function promiseAllSettled(promises) {
  return new Promise((resolve) => {
    const results = new Array(promises.length);
    let remaining = promises.length;
    if (remaining === 0) {
      resolve(results);
      return;
    }
    promises.forEach((promise, index) => {
      Promise.resolve(promise).then(
        (value) => {
          results[index] = { status: 'fulfilled', value };
          remaining--;
          if (remaining === 0) resolve(results);
        },
        (reason) => {
          results[index] = { status: 'rejected', reason };
          remaining--;
          if (remaining === 0) resolve(results);
        },
      );
    });
  });
}

// Usage
const results = await promiseAllSettled([
  Promise.resolve(1),
  Promise.reject(new Error('boom')),
  Promise.resolve(3),
]);
// [
//   { status: 'fulfilled', value: 1 },
//   { status: 'rejected', reason: Error('boom') },
//   { status: 'fulfilled', value: 3 }
// ]

The when-to-use distinction matters: reach for Promise.all when partial success is invalid (transactional sequences, prefetches that gate a render). Reach for Promise.allSettled when partial results are still useful (dashboard widgets where one slow endpoint shouldn't blank the screen, bulk operations with per-row UX). The common mistake is wrapping Promise.all in try/catch to "handle" individual failures — that catches the first rejection but loses the rest of the results. Use allSettled instead.

Practice implementing Promise.allSettled on GreatFrontEnd →

17. Curry

Currying transforms a function of N arguments into a sequence of functions that each accept a portion of those arguments. The classic example is curry(add3), which lets you call the curried function as add3(1)(2)(3), add3(1, 2)(3), or add3(1)(2, 3) interchangeably.

function curry(fn) {
  return function curried(...args) {
    if (args.length >= fn.length) {
      return fn.apply(this, args);
    }
    return function (...nextArgs) {
      return curried.apply(this, [...args, ...nextArgs]);
    };
  };
}

// Usage
const add = (a, b, c) => a + b + c;
const curriedAdd = curry(add);
curriedAdd(1)(2)(3); // 6
curriedAdd(1, 2)(3); // 6
curriedAdd(1)(2, 3); // 6
curriedAdd(1, 2, 3); // 6

The trick is using fn.length to know how many arguments the original function expects. The recursion in curried collects arguments across calls until the count is reached. Common follow-up: "implement curry that supports a placeholder for skipping arguments" — that's a separate variant (Curry II in our practice catalog) and a good test of whether you can extend a clean recursion to handle more state.
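One way to sketch the placeholder variant with a sentinel symbol (the `_` symbol and curryWithPlaceholder are illustrative names; GreatFrontEnd's Curry II may specify a different contract):

```javascript
const _ = Symbol('placeholder');

function curryWithPlaceholder(fn) {
  return function curried(...args) {
    // Complete only when we have enough arguments and none of them is a placeholder.
    const ready =
      args.length >= fn.length &&
      args.slice(0, fn.length).every((arg) => arg !== _);
    if (ready) return fn.apply(this, args);
    return function (...next) {
      // Fill placeholders left to right with the new arguments.
      const merged = args.map((arg) =>
        arg === _ && next.length > 0 ? next.shift() : arg,
      );
      return curried.apply(this, [...merged, ...next]);
    };
  };
}

const add3 = (a, b, c) => a + b + c;
curryWithPlaceholder(add3)(_, 2)(1)(3); // 6
```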

Practice implementing Curry on GreatFrontEnd →

18. getElementsByClassName

getElementsByClassName returns a live HTMLCollection of elements that match one or more class names. The interview challenge is implementing it from scratch using DOM traversal — a real test of recursion and attribute matching.

function getElementsByClassName(root, classNames) {
  const targetClasses = classNames.trim().split(/\s+/);
  const results = [];

  function traverse(node) {
    if (node.nodeType !== 1) return; // Only Element nodes.
    const nodeClasses = (node.getAttribute('class') || '').trim().split(/\s+/);
    if (targetClasses.every((cls) => nodeClasses.includes(cls))) {
      results.push(node);
    }
    for (const child of node.children) {
      traverse(child);
    }
  }

  for (const child of root.children) {
    traverse(child);
  }
  return results;
}

// Usage
// Given <div><span class="a"><i class="a b"/></span><i class="a"/></div>
getElementsByClassName(document.body, 'a b'); // [<i class="a b">]

The key gotchas: the input class string can contain multiple space-separated class names, and the element must have ALL of them (not any); the comparison is on the parsed class list, so class="ab" is not a match for "a b"; the implementation should NOT include the root element itself, only its descendants. The native getElementsByClassName returns a LIVE collection that updates as the DOM changes — most interview implementations return a static array, which is acceptable as long as you mention the difference.

Practice implementing getElementsByClassName on GreatFrontEnd →


How to use this guide

If you have a week before an interview, work through questions 1-6 (the junior set) twice — once writing them from a blank file, once reviewing the senior follow-ups. If you have a month, do all 18 in order. The questions are arranged so the patterns build on each other: closures appear in Debounce, Throttle, Once, and Memoize; recursion appears in Flatten, Deep Equal, Deep Clone, and getElementsByClassName; promise composition appears in Promise.all, Promise.allSettled, and is a common follow-up to Event Emitter (replacing callbacks with promises).

Practice each one on the linked GreatFrontEnd page; the IDE runs the same kinds of test cases real interviewers use, so edge cases surface before the interview does.

If you'd like to go beyond these 18 questions, check out our open-source GitHub repo featuring 190+ JavaScript interview questions with detailed answers and test cases.