Exactly. Running games at a lower resolution isn't new. I remember shrinking the viewport in the original DOOM (1993) to get it to run faster. Making a lower resolution look better without actually rendering at a higher one is the exact same problem anti-aliasing has been tackling forever. DLSS is just another form of AA, now so advanced that you can start from an even lower resolution and still get a good-looking image.
So even when I'm running a game at native resolution, I still want anti-aliasing, and DLSS is a great choice then.
But we're not talking about resolution here. We're talking about interpolating entire frames, and multiple of them at that.
It's one thing to rely on a technique like AA to improve visual quality with negligible drawbacks. DLSS is entirely different though, since upscaling introduces all kinds of graphical issues, and frame generation[1] even more so, while adding considerable input latency. NVIDIA will claim that this is offset by its Reflex feature, but that has its own set of issues.
So, sure, we can say that all of this is ultimately software trickery, but when the trickery is dialed up to 11 and the marketing revolves entirely around it, while raw performance is only slightly improved over previous generations, it's a clear sign that consumers are being duped.
[1]: I'm also opposed to frame generation on philosophical grounds. I want my experience to be as close as possible to what the game's creator intended. That is, I want every frame to be generated by the game engine, every object to look as it should within the world, and so on. I don't want my graphics card to fabricate an experience that merely approximates what the creator intended.
This is akin to reading a book on an e-reader that replaces every other word with one chosen by an algorithm. I want none of that.