I think he's saying that if you set the temperature to 0, answers become deterministic, and it will appear that the model is just memorising and reciting. The randomness is a hack that 'forces' the model to generalise by deliberately knocking it off the track of the most probable next token.
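As a rough sketch of what temperature does at sampling time (illustrative only, not any particular model's actual implementation):

```python
import numpy as np

def sample_next_token(logits, temperature):
    """Pick a token index from raw logits, scaled by temperature."""
    if temperature == 0:
        # Temperature 0: distribution collapses onto the single most
        # probable token, so decoding is deterministic (greedy).
        return int(np.argmax(logits))
    scaled = np.array(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs /= probs.sum()
    # Higher temperature flattens probs, so less likely tokens get
    # sampled more often -- the "knocked off track" effect.
    return int(np.random.choice(len(probs), p=probs))

logits = [2.0, 1.0, 0.1]
print(sample_next_token(logits, 0))  # always picks index 0
```

At temperature 1 the same call samples from the softmax of the logits directly, so the runner-up tokens get chosen a meaningful fraction of the time.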