Hacker News

bjourne today at 3:36 PM · 3 replies

Reductions are painful because they specify a sequence of ordered operations. Runtime is O(N), where N is the sequence length, regardless of the amount of hardware. So you want to work at a higher level where you can exploit the commutativity and independence of some (or even most) operations.
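A minimal sketch of the point being made: a left fold threads a single accumulator through every element in order, so each step depends on the previous one and the dependency chain is as long as the input, no matter how many cores you have.

```python
from functools import reduce

# A left fold threads an accumulator through every element in order.
# Each step depends on the previous result, so the dependency chain
# has length N and, as written, cannot be shortened by parallel hardware.
xs = list(range(8))
total = reduce(lambda acc, x: acc + x, xs, 0)
# Equivalent unrolled chain: (((((((0+0)+1)+2)+3)+4)+5)+6)+7
print(total)  # 28
```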


Replies

toxik today at 6:44 PM

You can reduce in parallel. That was the whole point of MapReduce. For example, the sum a+b+c+d+e+f+g+h can be found by first computing a+b, c+d, e+f, g+h; then (a+b)+(c+d) and (e+f)+(g+h); then the final result (a+b+c+d)+(e+f+g+h). That's just three steps to compute seven sums.
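The pairwise scheme described above can be sketched as a tree reduction: combine adjacent pairs, then pairs of pairs, and so on. Each level's combines are independent of one another, so with enough hardware a level runs in one parallel step; 8 elements need only 3 levels even though 7 pairwise sums are computed. (`tree_reduce` is an illustrative name, not a library function; correctness requires the operator to be associative.)

```python
import operator

def tree_reduce(op, xs):
    """Reduce xs with a binary op by combining adjacent pairs level by level.

    Requires op to be associative. Within each level the pairwise
    combines share no data, so they could run in parallel.
    """
    xs = list(xs)
    while len(xs) > 1:
        nxt = [op(xs[i], xs[i + 1]) for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:          # an odd element carries over to the next level
            nxt.append(xs[-1])
        xs = nxt
    return xs[0]

print(tree_reduce(operator.add, range(1, 9)))  # 36, in 3 levels
```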

ux266478 today at 6:08 PM

You're right that it's primarily a runtime + compiler + language issue. I really don't understand why people tried to force functional programming into environments without decent algebraic reasoning mechanisms.

Modern graph reducers have inherent confluence and aren't reliant on explicit commutation. They can do everything in parallel and out of order (until they have to talk to something extrinsic, like reading input or emitting output), including arbitrary side-effectful mutation. We really live in the future.
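An illustrative sketch of confluence in miniature (this is not a real graph reducer): the subexpressions of (a+b) * (c+d) share no data, so a scheduler is free to evaluate them concurrently and in either order, and the final result is the same either way.

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

# The two additions are independent nodes of the expression graph
# (a+b) * (c+d); a scheduler may evaluate them in any order or
# concurrently, and the result is identical -- confluence in miniature.
with ThreadPoolExecutor() as pool:
    left = pool.submit(add, 1, 2)    # (a+b)
    right = pool.submit(add, 3, 4)   # (c+d)
    result = left.result() * right.result()

print(result)  # 21
```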

heavenlyblue today at 7:08 PM

Reduce is massively parallel for associative (and especially commutative) operations