Hacker News

npteljes · last Saturday at 8:44 AM

What I experienced is that AI is a nightmare on AMD in Linux. There are myriad custom tweaks one needs to make, and even then things just break after a while. It happened so often on my current setup (6600 XT) that I don't bother with local AI anymore; the time investment just isn't worth it.

It's not that I can't live like this; I still have the same card. But if I were looking to do anything AI-related locally with a new card, it certainly wouldn't be an AMD one.
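The kind of custom tweaking described above often comes down to environment overrides. A commonly reported example (a sketch, not an official recipe) is spoofing the GPU ISA on RDNA2 cards like the RX 6600 XT, which ROCm does not officially support, so that ROCm-built libraries will load; the exact values here are assumptions to verify against your own hardware:

```python
import os

# Commonly reported workaround for unsupported RDNA2 cards (e.g. the
# RX 6600 XT, gfx1032): report the officially supported gfx1030 ISA
# so ROCm kernels compiled for gfx1030 will run. Unsupported and may
# break between ROCm releases -- which matches the experience above.
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

# Pin which GPU the HIP runtime sees (index 0 is an assumption; adjust
# if you have an iGPU or multiple cards enumerated first).
os.environ["HIP_VISIBLE_DEVICES"] = "0"

print(os.environ["HSA_OVERRIDE_GFX_VERSION"])
```

These must be set before the ROCm-using library (e.g. PyTorch) initializes the GPU, which is why they are usually exported in a shell profile rather than set mid-script.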


Replies

eden-u4 · last Saturday at 9:28 AM

I don't have much experience with ROCm for large training runs, but NVIDIA is still a mess with matching the driver, the CUDA version, and everything else. The only simplification is that Ubuntu and other distros do the heavy lifting by installing all the required components without much configuration.
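The driver/CUDA matching problem mentioned here is that each CUDA toolkit requires a minimum NVIDIA driver version. A minimal sketch of that check, with a few pairs taken from NVIDIA's release notes (treat the exact numbers as assumptions to verify for your toolkit):

```python
# Illustrative minimum-driver table for Linux (not exhaustive, and not an
# official API): CUDA toolkit version -> minimum driver (major, minor).
MIN_DRIVER = {
    "11.8": (520, 61),
    "12.1": (530, 30),
    "12.4": (550, 54),
}

def driver_ok(cuda_version: str, driver: tuple) -> bool:
    """Return True if the installed driver meets the toolkit's minimum.

    `driver` is the (major, minor) pair reported by `nvidia-smi`, e.g.
    driver 535.104 -> (535, 104). Tuple comparison does the right thing.
    """
    return driver >= MIN_DRIVER[cuda_version]

print(driver_ok("12.1", (535, 104)))  # True: driver is new enough
print(driver_ok("12.4", (535, 104)))  # False: CUDA 12.4 needs >= 550.54
```

This is the class of mismatch that distro packaging hides: Ubuntu's packaged driver and CUDA stacks are pinned to compatible pairs, which is the "heavy lifting" the comment refers to.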

FredPret · last Saturday at 3:17 PM

I set up a deep learning station probably 5-10 years ago and ran into the exact same issue. After a week of pulling out my hair, I just bought an Nvidia card.

phronimos · last Saturday at 3:13 PM

Are you referring to AI training, prediction/inference, or both? Could you give some examples of what had to be done and why? Thanks in advance.
