I'm not a fan of nodiscard because it gets applied way too freely, even when the return value isn't relevant. E.g. WebGPU/WGSL initially made the atomic built-ins nodiscard simply because they return a value, but half the algorithms that use atomics only care about the atomic write and never need the returned value. Because of nodiscard, though, you had to assign the result to a variable you never read.
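To make the friction concrete, here's a rough sketch of the same pattern in Swift rather than WGSL (the Counter/fetchAdd names are made up for illustration): when the result can't be discarded, a call site that only wants the side effect has to bind it to a throwaway name anyway.

    import Foundation

    // Made-up stand-in for an atomic counter; only the shape matters here.
    final class Counter {
        private var value = 0
        private let lock = NSLock()

        // Returns the previous value, the way atomicAdd does. Without an
        // explicit "discarding is fine" annotation, every caller that
        // ignores the result gets warned.
        func fetchAdd(_ delta: Int) -> Int {
            lock.lock()
            defer { lock.unlock() }
            let old = value
            value += delta
            return old
        }
    }

    let hits = Counter()

    // Only the side effect is wanted, but the result still has to be
    // acknowledged with a pointless throwaway binding:
    _ = hits.fetchAdd(1)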
Disagree. This is the default in Objective-C & Swift, and it’s great. You have to explicitly annotate the functions whose results are allowed to be discarded. It’s probably my all-time favorite compiler warning in terms of accidental-bugs-caught-early, because being forced to ask why a function returns useless noise in the first place invites real introspection.
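For anyone who hasn't used it: in Swift any ignored non-Void result warns by default, and the API author opts out per function with @discardableResult. A minimal sketch (Logger/log/flush are just illustrative names):

    struct Logger {
        // Returns whether the message was written, but most callers don't
        // care. @discardableResult marks discarding as intentional, so call
        // sites that ignore the result don't warn.
        @discardableResult
        func log(_ message: String) -> Bool {
            print(message)
            return true
        }

        // No annotation: ignoring this result warns at the call site, which
        // is exactly the "why does this return something?" nudge.
        func flush() -> Bool {
            return true
        }
    }

    let logger = Logger()
    logger.log("hello")          // fine: explicitly allowed to discard
    let flushed = logger.flush() // the result has to be acknowledged
    print(flushed)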