
keyle, today at 1:36 AM

Is this because Blender is in fact using CUDA?


Replies

Joel_Mckay, today at 2:36 AM

The key feature on Intel platforms is the hardware de-noise acceleration (NVIDIA OptiX also works well). Note that AMD OpenCL works quite well for some renders, but Blender's Flamenco render manager prefers consistent cluster hardware.
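For reference, switching Cycles to a hardware denoiser is a couple of lines in Blender's scripting console. A minimal sketch (a config fragment for Blender's bundled interpreter, where `bpy` is available; the denoiser choice shown is an example, not a recommendation):

```python
import bpy

scene = bpy.context.scene
scene.cycles.device = 'GPU'            # render on the GPU
scene.cycles.use_denoising = True      # enable denoising for final renders
# 'OPTIX' on NVIDIA cards; 'OPENIMAGEDENOISE' runs on CPU/Intel hardware
scene.cycles.denoiser = 'OPTIX'
```

The same properties can be set per-scene in a headless Flamenco worker script, which helps keep cluster nodes configured identically.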

For 8K HDR10 media or 3+ screens, the RTX 5090 32GB model is going to be the minimum card people should buy. Just because you see 4 DP ports doesn't mean the card can push the bit-rates needed to drive an HDR10 display at >60Hz.
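A quick back-of-envelope check of that bandwidth claim, assuming standard 8K UHD resolution, 10-bit RGB, and the published DisplayPort payload rates (figures below are well-known spec numbers, not from the comment):

```python
# Raw (uncompressed) bit-rate for one 8K HDR10 display at 60 Hz.
WIDTH, HEIGHT = 7680, 4320          # 8K UHD
BITS_PER_PIXEL = 3 * 10             # RGB, 10 bits per channel (HDR10)
FPS = 60

raw_gbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9
print(f"raw 8K60 HDR10 stream: {raw_gbps:.1f} Gbit/s")   # ~59.7 Gbit/s

DP_1_4_PAYLOAD = 25.92   # HBR3 x4 lanes, after 8b/10b encoding overhead
DP_2_1_UHBR20  = 77.37   # UHBR20 x4 lanes, after 128b/132b overhead

print("fits DP 1.4 without DSC:", raw_gbps <= DP_1_4_PAYLOAD)  # False
print("fits DP 2.1 UHBR20:     ", raw_gbps <= DP_2_1_UHBR20)   # True
```

So an uncompressed 8K60 HDR10 stream (~60 Gbit/s) overruns a DP 1.4 link outright; it needs DSC or a DP 2.x link, regardless of how many ports are on the bracket.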

The Mac Studio with >512GB of unified RAM/VRAM is a better LLM lab solution (Apple recently nerfed it to 256GB). Who cares if a task completes a bit slower? It doesn't matter given the lower error rates... and it doesn't cost $14k like an RTX 6000. =3

Great tutorial on getting Blender to behave on mid-grade PCs, laptops, etc.:

https://www.youtube.com/watch?v=a0GW8Na5CIE