It means that the API consistently generated a stop token immediately when the same API call was made many times. The API call sets the temperature to 0 (the OpenAI documentation is not clear if gpt 5.2 can even have its temperature set to 0), which makes sampling deterministic.
> to 0 (the OpenAI documentation is not clear if gpt 5.2 can even have its temperature set to 0)
I think for the models where any value but 1.0 for temp isn't supported, they hard-error at request time if you try to set it to anything else.
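A minimal sketch of the rejection behavior described above, with the server-side check mimicked locally. The model name, restricted set, and `ValueError` (standing in for the API's 400 response) are all illustrative assumptions, not OpenAI's actual implementation:

```python
# Hypothetical sketch: some models reject any temperature other than
# the default 1.0 outright, rather than clamping or ignoring it.
RESTRICTED_MODELS = {"gpt-5.2"}  # assumption: models limited to temperature=1.0


def validate_request(model: str, temperature: float = 1.0) -> None:
    """Raise ValueError (a stand-in for the API's hard error at
    request time) if the model rejects the requested temperature."""
    if model in RESTRICTED_MODELS and temperature != 1.0:
        raise ValueError(
            f"model {model!r} only supports the default temperature of 1.0"
        )


validate_request("gpt-5.2")           # ok: default temperature
try:
    validate_request("gpt-5.2", 0.0)  # hard-errors, per the behavior described
except ValueError as e:
    print("rejected:", e)
```

Under this behavior, a deterministic run at temperature 0 simply wouldn't be possible for such a model; the request never reaches sampling.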