Hacker News

ac29 01/20/2025

AMD supports only a single Radeon GPU in Linux (RX 7900 in three variants)?

Windows support is also bad, but it covers significantly more than one GPU.


Replies

llm_trw 01/20/2025

Imagine Nvidia supported only the 4090, 4080, and 4070 for CUDA at the consumer level, with the 3090 dropped once the 40xx series came out. That is what AMD is defending here.

Delk 01/20/2025

I honestly can't figure out which Radeon GPUs are supposed to be supported.

The GitHub discussion page in the title lists RX 6800 (and a bunch of RX 7xxx GPUs) as supported, and some lower-end RX 6xxx ones as supported for runtime. The same comment also links to a page on the AMD website for a "compatibility matrix" [1].

That page only shows RX 7900 variants as supported on the consumer Radeon tab. On the workstation side, the Radeon Pro W6800 and some W7xxx cards are listed as supported. It also points to the "Use ROCm on Radeon GPU documentation" page [2] for anyone using ROCm on Radeon or Radeon Pro cards.

That link leads to a page for "compatibility matrices" -- again. If you click the link for Linux compatibility, you get a page on "Linux support matrices by ROCm version" [3].

That "by ROCm version" page literally only has a subsection for ROCm 6.2.3. It only lists RX 7900 and Pro W7xxx cards as supported. No mention of W6800.

(The page does have an unintuitively placed "Version List" link through which you can find docs for ROCm 5.7 [4]. Those older docs are no more useful than the 6.2.3 ones.)

Is RX 6800 supported? Or W6800? Even the amd.com pages seem to contradict each other on the latter.

Maybe the pages on the AMD site only list official production support or something. In any case it's confusing as hell.

Nothing against the GitHub page author, who at least seems to try to be clear, but the official documentation leaves a lot to be desired.

[1] https://rocm.docs.amd.com/projects/install-on-linux/en/lates...

[2] https://rocm.docs.amd.com/projects/radeon/en/latest/docs/com...

[3] https://rocm.docs.amd.com/projects/radeon/en/latest/docs/com...

[4] https://rocm.docs.amd.com/projects/radeon/en/docs-5.7.0/docs...

cokecan 01/21/2025

Super annoying. I have an RX 6600 XT and can't get ROCm to work on Linux. Vulkan ML, however, worked perfectly out of the box, so at least I got something.

Just weird the official thing doesn't work.
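A workaround commonly reported in the community (and not officially supported by AMD) for RDNA2 cards that sit outside the support matrix, such as the RX 6600 XT (gfx1032), is the `HSA_OVERRIDE_GFX_VERSION` environment variable: it makes the ROCm runtime treat the GPU as gfx1030, the ISA of the RX 6800/6900 class for which ROCm ships prebuilt kernels. It has to be set before any HIP-backed library initializes, e.g.:

```python
import os

# Unofficial ROCm runtime override (community workaround, not an
# AMD-supported configuration): report the GPU as gfx1030 so ROCm's
# prebuilt gfx1030 kernels are used on unlisted RDNA2 cards.
# Must be set before torch or any other HIP-backed library is imported.
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

print(os.environ["HSA_OVERRIDE_GFX_VERSION"])
```

Whether this actually works depends on the card and the ROCm version; it is a best-effort hack, not a supported path.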

curt15 01/21/2025

I found that striking as well. Does AMD expect everyone wanting to try out PyTorch or LLMs on Linux to splurge on Instinct servers?
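For anyone in that situation, a quick sanity check is possible from PyTorch itself: on ROCm (HIP) builds, `torch.version.hip` is a version string, and the GPU (if the stack is working) shows up through the CUDA-named device API. A minimal probe that degrades gracefully when PyTorch isn't installed:

```python
# Probe whether the installed PyTorch is a ROCm (HIP) build and, if so,
# whether it can actually see a GPU. ROCm builds expose devices through
# the cuda-named API, so torch.cuda.is_available() is the right check.
try:
    import torch
    is_rocm = getattr(torch.version, "hip", None) is not None
    print("ROCm (HIP) build:", is_rocm)
    if is_rocm:
        print("GPU visible:", torch.cuda.is_available())
except ImportError:
    is_rocm = None
    print("PyTorch is not installed")
```

If the build reports ROCm but no GPU is visible, the card is likely one the runtime refuses to support out of the box.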
