So the point of distributed compute is to reduce the compute needed? I’ve generally found that distributed computing requires more total compute than vertical scaling, while also getting clobbered by network bandwidth and latency.
In theory it needs 2 to 10x the compute; in practice it’s more like 100 to 500x.
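As a rough sanity check on why the gap can get that large, here is a back-of-envelope sketch comparing the cost of moving a working set over the network versus through local memory. All of the bandwidth and latency figures are assumed, illustrative numbers, not measurements of any particular system:

```python
# Back-of-envelope: moving data over a network vs through local memory.
# Every constant below is an assumed, round-number illustration.

LOCAL_MEM_BW = 50e9    # bytes/s, single-node memory bandwidth (assumed)
NETWORK_BW   = 1.25e9  # bytes/s, roughly a 10 Gb/s link (assumed)
NETWORK_RTT  = 200e-6  # seconds per round trip in a datacenter (assumed)

def local_pass_time(nbytes):
    """Time to stream nbytes through local memory once."""
    return nbytes / LOCAL_MEM_BW

def network_shuffle_time(nbytes, round_trips=1):
    """Time to move nbytes over the network, plus per-round-trip latency."""
    return nbytes / NETWORK_BW + round_trips * NETWORK_RTT

nbytes = 1e9  # a 1 GB working set
overhead = network_shuffle_time(nbytes) / local_pass_time(nbytes)
print(f"network/local cost ratio: ~{overhead:.0f}x")  # ~40x
```

And that ratio only covers a single one-way transfer; real distributed jobs shuffle data repeatedly, wait on stragglers, and pay serialization and coordination costs on top, which is how the practical overhead can climb well past the raw bandwidth ratio.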
The point of distributed computing is to do work that a vertically scaled system can't, or to increase availability.
If you're doing it for other reasons, it's usually a mistake.