For linear algebraic transformations applied to several rows at once, I wholeheartedly agree.
Not so convinced about decision trees though (which process one row at a time).
Yeah, unless you actually have to deal with arbitrarily large integer features, Guile integers would come with a big efficiency hit.
What do you mean by processing one row at a time?
I think one could parallelize across rows, at the very least when classifying with an already-learned model, since each row's prediction is independent. Probably also while learning the model.
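To illustrate the classification case: a minimal Python sketch (the tree, data, and `classify` function are all hypothetical stand-ins) showing that applying a learned decision tree to rows is embarrassingly parallel, since each row's prediction depends only on that row.

```python
# Classifying rows with an already-learned decision tree is embarrassingly
# parallel: each row is handled independently, so rows can be split across
# workers. Threads are used here for simplicity; a process pool would avoid
# the GIL for CPU-bound trees.
from concurrent.futures import ThreadPoolExecutor

def classify(row):
    # Stand-in for a learned tree: a couple of threshold tests.
    if row[0] < 0.5:
        return 0 if row[1] < 2.0 else 1
    return 1

rows = [(0.1, 1.0), (0.2, 3.0), (0.9, 0.0)]

with ThreadPoolExecutor() as pool:
    labels = list(pool.map(classify, rows))

print(labels)  # [0, 1, 1]
```

Parallelizing the *learning* phase is harder, since tree construction is sequential in depth, but candidate splits at a given node can still be evaluated over rows in parallel.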