...The Mac menu bar is what it is for a very good reason. Being at the top of the screen makes it an infinitely tall target.
All you have to do to get to it is move your mouse up until you can't move it up any more.
This remains a very valuable aspect of it, no matter which fashions in UI design have come and gone since.
The fact that you think you've "minimized the application" when you minimized a window just shows that you are operating under a different (not better, not worse, just different) philosophy of how applications work than the macOS designers are.
Ah yes, this old argument. Except nobody slams their cursor against the top of the screen in real life on the assumption that the menu bar is "infinitely tall." Watch real users interact with a Mac's menu, and you simply won't see this behavior. Not to mention that it doesn't work if you're using a laptop with a second monitor positioned behind and above it.
And we're talking about a GUI here, so when I minimize an application's GUI, then yes, I expect that I've minimized the application. And again, I think you'll find that the vast majority of users work under this M.O.
But your observation raises another usability issue caused by the single menu bar: Instead of an "infinite" desktop, the Mac reduces the entire screen to a single application's client area... so, historically, Mac applications treated it that way... littering it with an armada of floating windows that you had to herd around.
The problem is that turning the whole screen into one application's client area doesn't actually work: you can see all the other crap on your desktop, and every other open application's GUI, THROUGH the UI of the app you're trying to use. It's stupid.
So, to users' relief, the floating-window nonsense has been almost entirely abandoned over the last couple of decades, and single-window applications have become the norm on Mac, as they have been on Windows forever. Oh wait, hold on... here comes Apple regressing to "transparent" UI with "Liquid Glass," a failed idea from 20+ years ago.
Full circle, sadly.
There are videos out there where CHM interviewed Bill Atkinson. One part has him go over old Polaroids of Lisa interface drafts. There, he justifies the menu bar at the top of the screen differently: they couldn't figure out what to do when the menu was too wide for the window after the user made it narrow.
This argument never made much sense to me, even though I do subscribe to Fitts' Law. At the desktop monitor sizes we've had for 20+ years, the distance you have to travel, together with the visual disconnect between the application and the menu bar, negates the easier targetability. And on smaller screens you would generally maximize the application window anyway, which gives you the same targetability.
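For reference, the commonly cited Shannon formulation of Fitts' Law (D and W here are the usual textbook symbols, not anything from this thread) is:

MT = a + b * log2(D/W + 1)

where MT is the predicted movement time, D is the distance to the target, W is the target's width along the axis of motion, and a and b are empirically fitted constants. The "infinitely tall target" argument amounts to saying that pinning the menu bar to the screen edge makes W effectively unbounded along the direction of travel, which drives the log term toward zero.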
The actual historical rationale for the top menu bar was different, as explained by Bill Atkinson in this video: https://news.ycombinator.com/item?id=44338182. The problem was that, due to the small screen size, non-maximized windows often weren't wide enough to show all the menus, and there often wasn't enough space vertically below a window's menu bar to show all the menu items. That's why they moved the menus to the top of the screen, so that there was always enough space, despite the drawback, as Atkinson notes, of having to move the mouse all the way to the top. That drawback was significant enough that it made them implement mouse pointer acceleration to compensate.
So targetability wasn't the motivation at all; that is a retconned explanation. And the actual motivation no longer applies on today's large, high-resolution screens.