Hacker News

miohtama 10/10/2024

Why doesn't Apple just implement it? They have more resources than anyone else in the world. Patents?


Replies

ribit 10/11/2024

Are you talking about Vulkan or about geometry shaders? The latter is simple: geometry shaders are a badly designed feature that sucks on modern GPUs. Apple designed Metal to only support things that are actually fast. Their solution for geometry generation is mesh shaders, a modern and scalable feature that actually works.

If you are talking about Vulkan, that is much more complicated. My guess is that they want to maintain their independence as a hardware and software innovator. It's hard to do that if you are locked into a design-by-committee API. Apple has had some bad experiences with these things in the past (e.g. they donated OpenCL to Khronos only to see it sabotaged by Nvidia). Also, Apple wanted a lean, easy-to-learn GPU API for their platform, and Vulkan is neither.

While their stance can be annoying to both developers and users, I think it can be understood at some level. My feelings about Vulkan are mixed at best. I don't think it is a very good API, and I think it makes too many unnecessary compromises. Compare, for example, VK_EXT_descriptor_buffer and Apple's argument buffers. Vulkan's approach is extremely convoluted: you are required to query descriptor sizes at runtime and perform manual offset computation. Apple's implementation is just 64-bit handles/pointers and memcpy, extremely lean and immediately understandable to anyone with basic C experience. I understand that Vulkan needs to support different types of hardware where these details can differ. However, I do not understand why they have to penalize developer experience in order to support some crazy hardware with 256-byte data descriptors.

johnnyanmac 10/11/2024

In all fairness, geometry shaders have almost always sucked. I'm surprised a game newer than 2015 bothered with them. It's been pretty common knowledge that geometry shaders only ever ran well on Intel hardware (and I'm not sure how long that lasted).

Tessellation falling short is just classic Apple, though. Shows how much they prioritize games in their decision making, despite every other year deciding they need a AAA game to showcase their hardware.

(Apologies for the crude answer. I would genuinely be interested in a technical perspective defending the decision. My only conclusion is that the kind of software their customers need, like art or editing tools, does not need that much tessellation.)

dagmx 10/11/2024

Geometry shaders have long been disfavored by all hardware vendors, not just Apple. It's just that most still ship a software path for them.

If you’re using geometry shaders, you’re almost always going to get better performance with compute shaders and indirect draws or mesh shaders.

A lot of hardware vendors will handle them in software, which tanks performance. Metal decided to do away with them rather than carry the baggage of something that all vendors agree is bad.

It takes up valuable die space for very little benefit.

Wowfunhappy 10/10/2024

In hardware? I would assume because it takes up space on the die, right? It's not free.

kelnos 10/10/2024

Because they don't care. They've decided that Metal is The One True Way to write 3D-accelerated apps on macOS, so they only implement the things in hardware that Metal requires.

fl0id 10/10/2024

Because they like to be Different™.