Conceptly

Map·Filter·Reduce

Collections
The three core operations for transforming and summarizing collections

map, filter, and reduce are higher-order functions built into most functional and multi-paradigm languages. They express the three most common things you do with a list: transform each item, keep only some items, or fold all items into one value.


Why do you need it?

Loops are flexible but they blur intent. A for-loop that builds a new array, one that discards entries, and one that sums a field all look structurally the same -- a mutable variable, an index, a body. The reader has to trace the whole body to understand what is happening. When collections are transformed in several steps, the code tends to grow into a mix of indices, intermediate variables, and mutation that makes the transformation hard to follow and easy to break. map, filter, and reduce each name a single clear intention, so the transformation reads as a declaration rather than an instruction.

Why did this approach emerge?

These operations come from the mathematical notion of functors and folds in functional programming. Lisp introduced map in the 1950s; reduce (also called fold) has roots in the same tradition. When languages like Python, Ruby, and JavaScript adopted them as first-class collection methods, they became the standard vocabulary for writing concise, composable data transformations without relying on mutable loops.

How does it work inside?

Each of the three functions takes a collection and a function, and produces a result without changing the original. map applies a transformation to every element and returns a new array of equal length. filter evaluates a predicate on every element and returns a new array containing only the elements for which the predicate is true. reduce walks the array from left to right, passing the current accumulated value and the current element to the combining function, and returns whatever the final accumulated value is. Because all three accept and return values rather than mutating state, they compose naturally: the result of filter can be passed directly into map, which can be passed directly into reduce.
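The mechanics above can be sketched as plain loops. These are illustrative functions with hypothetical names (myMap, myFilter, myReduce), not the engine's actual implementations; they just show the shape of each operation and how the results feed into one another.

```javascript
// Transform every element; the result has the same length as the input.
function myMap(arr, fn) {
  const out = [];
  for (const x of arr) out.push(fn(x));
  return out;
}

// Keep only the elements for which the predicate returns true.
function myFilter(arr, pred) {
  const out = [];
  for (const x of arr) if (pred(x)) out.push(x);
  return out;
}

// Fold left to right, threading the accumulator through the combining function.
function myReduce(arr, combine, initial) {
  let acc = initial;
  for (const x of arr) acc = combine(acc, x);
  return acc;
}

// Because each returns a value instead of mutating, they compose directly:
const pipelineResult = myReduce(
  myMap(
    myFilter([1, 2, 3, 4], (n) => n % 2 === 0), // [2, 4]
    (n) => n * 10                               // [20, 40]
  ),
  (a, b) => a + b,
  0
);
// pipelineResult is 60
```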

In Code

Transforming and filtering a list

const prices = [10, 25, 5, 40, 15];

const discounted = prices
  .filter((p) => p >= 10)
  .map((p) => p * 0.9);

// [9, 22.5, 36, 13.5]

filter keeps only prices at or above the threshold, then map applies a 10% discount to each. The original array is untouched and each step reads as an independent transformation.

Aggregating with reduce

const orders = [
  { item: "coffee", total: 4.5 },
  { item: "muffin", total: 3.0 },
  { item: "juice",  total: 5.5 },
];

const grandTotal = orders.reduce(
  (acc, order) => acc + order.total,
  0
);

// 13

reduce folds the array into a single number by accumulating each order's total. The second argument (0) is the initial accumulator value.

Boundaries & Distinctions

map, filter, and reduce operate on existing collections and produce new values. They are distinct from functions that construct data structures from scratch. They are also distinct from side-effecting iteration like forEach, which walks a collection for its side effects and returns nothing. When you want to transform, select, or fold, these three are the right tools. When you want to iterate for logging, mutation, or IO, a plain loop or forEach is typically more honest about the intent.
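A small contrast makes the forEach distinction concrete. The example below is illustrative: map's return value is the point, while forEach always returns undefined, so using it to build a result forces mutation of outer state.

```javascript
const names = ["ada", "grace"];

// map expresses a transformation: the new array IS the result.
const upper = names.map((n) => n.toUpperCase()); // ["ADA", "GRACE"]

// forEach expresses iteration for side effects: it returns undefined,
// so "building" a result with it requires mutating a variable outside.
const collected = [];
names.forEach((n) => collected.push(n.toUpperCase()));
```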

Trade-off

Chaining map, filter, and reduce creates a clear declarative pipeline, but each step allocates a new intermediate array. For most application code this cost is insignificant, but in performance-sensitive paths over large datasets, materializing multiple intermediate arrays may become a bottleneck. Languages and libraries that support lazy evaluation or transducers let you express the same pipeline while walking the collection only once.
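To make the trade-off concrete, here is a sketch (reusing the prices data from above) of fusing the pipeline into a single reduce by hand, which is roughly what lazy evaluation or transducers do for you:

```javascript
const prices = [10, 25, 5, 40, 15];

// Chained version: allocates two intermediate arrays before summing.
const chainedSum = prices
  .filter((p) => p >= 10)
  .map((p) => p * 0.9)
  .reduce((acc, p) => acc + p, 0);

// Fused version: one reduce, one pass, no intermediate arrays.
const fusedSum = prices.reduce(
  (acc, p) => (p >= 10 ? acc + p * 0.9 : acc),
  0
);
// Both compute 81 for this data, but fusing trades readability for one pass.
```

The fused form is faster on large inputs but buries the filter and the map inside one combining function, which is exactly the readability cost the chained form avoids.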

When should you use it?

In practice, the clearest way to use these functions is to chain them directly and keep each function argument small -- ideally a named function or a short arrow. If the chain grows long or any step becomes conditional about what kind of output it produces, that is usually a sign to extract a named function or break the pipeline into labeled intermediate results. The other common failure mode is reaching for reduce when filter or map would be simpler. Reduce is the most general of the three but also the hardest to read at a glance; prefer the more specific operations whenever they are sufficient.
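The reduce-when-map-would-do failure mode looks like this in practice. A toy example:

```javascript
const nums = [1, 2, 3];

// reduce can emulate map, but the intent is buried in accumulator plumbing:
const doubledViaReduce = nums.reduce((acc, n) => {
  acc.push(n * 2);
  return acc;
}, []);

// map states the same intent directly:
const doubled = nums.map((n) => n * 2); // [2, 4, 6]
```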

Data transformation -- converting raw API responses into the shape a UI component needs
Filtering -- narrowing a list to only the items that match a condition
Aggregation -- folding a collection into a total, average, or grouped structure
Pipeline construction -- chaining map, filter, and reduce to express multi-step data processing as a single readable flow
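These use cases often appear together in one pipeline. A hypothetical example (invented field names, standing in for a raw API response): select the in-stock records, reshape them, and fold them into a single summary value.

```javascript
// Hypothetical raw records, as they might arrive from an API.
const responses = [
  { name: "widget", priceCents: 250,  inStock: true },
  { name: "gadget", priceCents: 1000, inStock: false },
  { name: "gizmo",  priceCents: 400,  inStock: true },
];

const inStockTotal = responses
  .filter((r) => r.inStock)                 // selection
  .map((r) => r.priceCents / 100)           // transformation to dollars
  .reduce((sum, price) => sum + price, 0);  // aggregation
// 2.5 + 4 = 6.5
```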