Because of its random nature, noise is inherently less compressible than a predictable signal.
So, counterintuitively, noise reduction improves compression ratios. In fact, much of what video codecs do is determine which portion of the video IS noise that can be discarded, and which bits are visually important...
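The compressibility gap is easy to demonstrate with a general-purpose compressor standing in for a real codec; a quick sketch using Python's zlib (the specific sizes here are illustrative, not from any particular codec):

```python
import os
import zlib

# Random "noise" is essentially incompressible, while a predictable
# signal of the same size compresses to a small fraction of itself.
noise = os.urandom(100_000)        # ~100 KB of random bytes
signal = bytes(range(256)) * 391   # ~100 KB repeating pattern

noise_ratio = len(zlib.compress(noise, 9)) / len(noise)
signal_ratio = len(zlib.compress(signal, 9)) / len(signal)

print(f"noise ratio:  {noise_ratio:.3f}")   # close to 1.0 (almost no savings)
print(f"signal ratio: {signal_ratio:.3f}")  # tiny fraction of the original
```

The same logic drives denoise-before-encode: every bit not spent reproducing random grain is a bit the encoder can spend on detail that actually matters perceptually.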
That doesn't make it just a compression algorithm, to me at least.
Or to put it another way: to me it would be similarly disingenuous to describe, say, dead code elimination or vector path simplification as "just a compression algorithm" because the resulting output is smaller than it would otherwise be. Part of what has my hackles raised is that it claims to improve video clarity, not to optimise for size. Compression algorithms do not and should not make such claims; if an algorithm aims (even secondarily) to affect subjective quality, then it has a transformative aspect that IMO requires both disclosure and consent.