VFX heavy feature for a Disney subsidiary. Each frame is rendered independently of the others - it's not like video encoding, where each frame depends on the previous one; every frame has its own scene assembly that can be sent to a server, so rendering parallelizes across frames. With enough compute, the entire film can be rendered in a few days. (It's a little more complicated than that, but it works to a first-order approximation.)
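To make the "each frame is independent" point concrete, here's a minimal sketch of the idea, not any studio's actual pipeline: `render_frame`, the scene/output paths, and the frame count are all hypothetical. Since every frame reads only its own scene assembly, farming the work out is essentially a map over frame numbers.

```python
# Toy illustration of per-frame rendering being embarrassingly parallel.
# All names here (render_frame, SCENE_DIR, OUTPUT_DIR) are hypothetical.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

SCENE_DIR = Path("scenes")    # hypothetical: one self-contained scene assembly per frame
OUTPUT_DIR = Path("frames")   # hypothetical: rendered frames land here

def render_frame(frame_number: int) -> Path:
    """Render one frame from its own scene assembly.

    No frame depends on any other, so each call could run on a different
    core or a different machine, with no coordination beyond collecting
    the finished images.
    """
    scene_file = SCENE_DIR / f"frame_{frame_number:06d}.scene"
    output_file = OUTPUT_DIR / f"frame_{frame_number:06d}.exr"
    # ... invoke the renderer on scene_file here (placeholder) ...
    return output_file

if __name__ == "__main__":
    total_frames = 24 * 60 * 100  # ~100 minutes at 24 fps, purely illustrative
    # Locally this is a process pool; on a real farm the same loop would just
    # submit each frame as an independent job to the scheduler.
    with ProcessPoolExecutor() as pool:
        for rendered in pool.map(render_frame, range(total_frames)):
            pass  # collect/verify outputs as they finish
```

Because the only shared state is the output directory, throwing more machines at it shortens wall-clock time almost linearly, which is why "a few days with enough compute" is plausible to a first order.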
I don’t remember exactly how long the final rendering took, but it was nearly two months, and the final compute budget was 7 or 8 figures. I think we had close to 100k cores running at peak across three different render farms during crunch time, but don’t take my word for it; I wasn’t producing the picture.
Are they still using CPUs and not GPUs for rendering?
Haven't the rendering algos been ported to CUDA yet?