JavaScript `reduce` performance


I've recently spent some time playing around with transducers (a functional-programming tool meant to improve performance without sacrificing the readability or flexibility of the code), and when I got around to testing their actual speed, the results were quite disappointing. Consider:

const inc = x => x + 1;

const isEven = x => x % 2 === 0;

// simplest, shortest way I would be comfortable with if performance wasn't an issue

const mapFilter = xs => xs.filter(isEven).map(inc);

// transducers way

// function composition
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);

const map = f => step => (a, c) => step(a, f(c));
const filter = p => step => (a, c) => (p(c) ? step(a, c) : a);

// basic reducer for building array
const build = (acc, x) => {
  acc.push(x);
  return acc;
};

// transducer: creates no intermediate arrays, hence should theoretically be faster
const transducers = xs =>
  xs.reduce(compose(filter(isEven), map(inc))(build), []);

// native loop for comparison
const nativeLoop = data => {
  const result = [];
  const l = data.length;
  for (let i = 0; i < l; i++) {
    const x = data[i];
    if (isEven(x)) result.push(inc(x));
  }
  return result;
};

const data = Array(1000).fill(1);

const base = ["simplest, chained map and filter", () => mapFilter(data)];
const alternative = ["composed transducers", () => transducers(data)];
const alternative2 = ["native loop", () => nativeLoop(data)];

/* console.log(Benchmark) */
console.log("Running benchmarks....");

const suite = new Benchmark.Suite();
suite
  .add(...base)
  .add(...alternative)
  .add(...alternative2)
  .on("cycle", function(event) {
  console.log(String(event.target));
})
  .on("complete", function() {
  console.log("Fastest is " + this.filter("fastest").map("name").join(", "));
})
// run async
  .run({ async: true });
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.15/lodash.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/benchmark/2.1.4/benchmark.min.js"></script>

I would expect the performance ranking to be

native loop > transducers > chained map/filter

Instead, aside from the native loop, which is far faster than everything else, it turned out, to my great astonishment, that the reduce/transducer approach is much slower than chaining map/filter and creating intermediate arrays (about an order of magnitude slower in Chrome). Can someone explain where this result comes from?


There are 2 answers below.

Best answer, by Aadit M Shah:

Your benchmark is wrong because you're building a new transducer chain on each run.
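The only change needed is to hoist the composed reducer out of the benchmarked function, so the composition happens once instead of on every call. Isolating just that change (the full runnable snippet follows):

// before: the chain is rebuilt on every call to transducers
const transducers = xs =>
  xs.reduce(compose(filter(isEven), map(inc))(build), []);

// after: the reducer is built once and reused across calls
const reducer = compose(filter(isEven), map(inc))(build);
const transducers = xs => xs.reduce(reducer, []);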

const inc = x => x + 1;

const isEven = x => x % 2 === 0;

// simplest, shortest way I would be comfortable with if performance wasn't an issue

const mapFilter = xs => xs.filter(isEven).map(inc);

// transducers way

// function composition
const compose = (...fns) => x => fns.reduceRight((y, f) => f(y), x);

const map = f => step => (a, c) => step(a, f(c));
const filter = p => step => (a, c) => (p(c) ? step(a, c) : a);

// basic reducer for building array
const build = (acc, x) => {
  acc.push(x);
  return acc;
};

// transducer: creates no intermediate arrays, hence should theoretically be faster
const reducer = compose(filter(isEven), map(inc))(build);
const transducers = xs => xs.reduce(reducer, []);

// native loop for comparison
const nativeLoop = data => {
  const result = [];
  const l = data.length;
  for (let i = 0; i < l; i++) {
    const x = data[i];
    if (isEven(x)) result.push(inc(x));
  }
  return result;
};

const data = Array(1000).fill(1);

const base = ["simplest, chained map and filter", () => mapFilter(data)];
const alternative = ["composed transducers", () => transducers(data)];
const alternative2 = ["native loop", () => nativeLoop(data)];

/* console.log(Benchmark) */
console.log("Running benchmarks....");

const suite = new Benchmark.Suite();
suite
  .add(...base)
  .add(...alternative)
  .add(...alternative2)
  .on("cycle", function(event) {
  console.log(String(event.target));
})
  .on("complete", function() {
  console.log("Fastest is " + this.filter("fastest").map("name").join(", "));
})
// run async
  .run({ async: true });
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.15/lodash.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/benchmark/2.1.4/benchmark.min.js"></script>

As you can see, transducers are indeed faster than chained map and filter methods.
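A plausible reason for the size of the gap (my reading, not verified against engine internals): compose, map and filter all allocate fresh closures, so rebuilding the whole chain inside the timed function adds per-call allocation and indirection that has nothing to do with the reduce itself. A rough, informal way to see the difference, reusing the definitions from the snippet above:

// quick console.time check (not part of the original answer);
// assumes compose, map, filter, inc, isEven, build, reducer and data from above
console.time("rebuild chain per call");
for (let i = 0; i < 10000; i++) {
  data.reduce(compose(filter(isEven), map(inc))(build), []);
}
console.timeEnd("rebuild chain per call");

console.time("reuse hoisted reducer");
for (let i = 0; i < 10000; i++) {
  data.reduce(reducer, []);
}
console.timeEnd("reuse hoisted reducer");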

Answer by James Laux:

The benchmark is flawed: the reducer never has to do any real work.

  • You create a uniform array filled with the odd number 1.
  • You then run the isEven predicate over every element.
  • So every variant is always going to return an empty array.

We are benchmarking the performance of returning an empty array.

If we prefill the array with realistic data, the native method still wins. Aadit is correct, though: his transducer is the faster of the two transducer implementations.

const data = [];

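// random digits 0-9: on average half are even, so the filter and the map both do real work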
for (let i = 0; i < 1000; i++) {
    data.push(Math.floor(Math.random() * 10));
}
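
As a quick sanity check (my addition, assuming the functions from the snippets above are in scope), you can confirm the reducers now do real work: roughly half of the random values survive isEven, so every variant returns a non-empty result of the same length.

// both should print ≈ 500 on average
console.log(nativeLoop(data).length);
console.log(mapFilter(data).length);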