DeepSeek is an open source model. You can download it and run it locally on your laptop already.
So any OpenAI user (or even a competitor) could take it and run a hosted model. You can even tweak the weights if you want to.
Why pay for OpenAI access when you can just run your own and save money?
LM Studio version is here: https://lmstudio.ai/model/deepseek-r1-llama-8b
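For anyone wondering what "run your own" looks like in practice, here's a minimal sketch of querying a locally hosted model through an OpenAI-compatible endpoint. It assumes LM Studio's local server is running at its default address (http://localhost:1234/v1) and that the model is exposed under the name "deepseek-r1-llama-8b"; adjust both to match your setup.

```python
# Minimal sketch: talk to a locally hosted DeepSeek model via an
# OpenAI-compatible endpoint (assumes LM Studio's local server is running
# on its default http://localhost:1234/v1 -- change if yours differs).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # local server instead of api.openai.com
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="deepseek-r1-llama-8b",  # hypothetical name; use whatever your server lists
    messages=[{"role": "user", "content": "Summarize why open-weight models matter."}],
)
print(response.choices[0].message.content)
```

The point being: any code written against the OpenAI client only needs a different base URL to point at a self-hosted model instead.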
The one your laptop can run doesn't rival what OpenAI offers for money. Still, the issue isn't whether a third party can run it; it's that OpenAI doesn't seem to be positioning the API as its main product.