Games/graphics is one of those domains with a lot of jargon, for sure. If you don't want to become a wizard, you can just mess with it and see what happens. I like how Dolphin approaches this, with extensive tooltips in the settings, but there's always going to be some implicit knowledge.
On a meta level, I feel like I've seen a lot of anti-acronym sentiment recently, even though it's never been easier to look these things up. There are definitely levels of acronym use that are anti-learning, or a kind of protectionism, but to my mind there are appropriate levels too: you have to label concepts at a useful granularity to accomplish things, and the graphics settings of a game are definitely on the reasonable side.
The PS4 Pro introduced the gaming world to the simplification of settings from the dozens of acronyms familiar to PC gamers down to “Performance” and “Quality”.
I wouldn’t be surprised if there’s now a market demand for that to spread back to PC land.
Well, I suppose I'm not the assumed target audience. Back when I was younger I had the time to tweak everything on my computer: my Linux distro, my games (dual-booting Windows just for that), and I'd happily look things up along the way. I could also play long gaming sessions.
Nowadays I'm a dad with practically no spare time between raising a toddler and work. Suddenly the question "does the game overstay its welcome?" has become a thing; I almost exclusively play games that can be played in short bursts, deliver a great experience as a whole, and can be completed in a relatively short playtime. I got a Steam Deck a few years ago for the specific purpose of separating my work computer from my gaming platform, and of being able to pick up a game and pause it without problems.
Even with the Steam Deck's built-in performance overlay (which is very nice), it takes time to assess the quality-vs-performance results of every possible combination of settings; often more time than I would spend playing the game itself.
I suspect that people like me either already are or soon will be the bigger segment of the (paying) customers though, so that is something to consider for developers.
And some games do give short explanations of what each type of technique does and how they compare, along with statements like "this usually has a small impact on performance" or "this has a large impact on performance" to guide me, which is already a great help.
> just mess with it and see what happens
And even if you know every detail, that's still the best course of action, I think. Which kind of antialiasing you prefer, and how it trades off against performance and resolution, is highly subjective, and the answer can be "none".
There are three properties in play when rendering/rescaling pixels: aliasing, sharpness, and locality. Aliasing is, well, aliasing; sharpness is the opposite of blurriness; and locality is about those "ringing" artefacts you often see in highly compressed images and videos. You can't be perfect on all three. Disabling antialiasing gives you the sharpest image with no ringing artefacts, but you get ugly staircase effects. Typical antialiasing trades those for blurriness; in fact, FXAA is literally a (selective) blur, which is why some people don't like it. More advanced algorithms can give you both antialiasing and sharpness, but then you get ringing artefacts. The best option, of course, is to increase the resolution until none of these effects is noticeable, but you need the hardware for that.
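To make the "FXAA is literally a selective blur" point concrete, here's a toy sketch of the idea in NumPy: blur only the pixels whose local contrast is high (i.e. near staircase edges) and leave flat regions sharp. This is purely illustrative, operating on a grayscale luminance array; the `threshold` knob and the crude 3x3 box blur are my own simplifications, not how the real FXAA algorithm (which does sub-pixel edge estimation) works.

```python
import numpy as np

def selective_blur(img, threshold=0.1):
    """Toy FXAA-like pass: blend toward a box blur only where the
    3x3 neighbourhood shows high luminance contrast.
    img: 2D float array of luminance in [0, 1].
    threshold: illustrative tuning knob, not from any real API."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # All nine shifted copies of the image (the 3x3 neighbourhood)
    neigh = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    blurred = neigh.mean(axis=0)                      # 3x3 box blur
    contrast = neigh.max(axis=0) - neigh.min(axis=0)  # local range
    # High contrast = likely a hard edge: smooth it. Flat = keep sharp.
    return np.where(contrast > threshold, blurred, img)

# A hard vertical edge: left half black, right half white
img = np.zeros((4, 8))
img[:, 4:] = 1.0
out = selective_blur(img)
```

After the pass, the columns right at the edge get intermediate values (the staircase is softened), while pixels far from the edge are untouched, which is exactly the aliasing-for-blurriness trade described above.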
The best algorithms attempt to find a good-looking balance between all these factors and performance, but "good-looking" is subjective, which is why your best bet is to try for yourself. Or just keep the defaults, as they are likely set to what the majority of players prefer.