The "bubble popping" mostly means that investment will drastically fall, investors will start demanding profit, and costs will soar. This will cause a lot of tools currently built on top of LLMs to become too expensive. Free tools will likely become rare.
There's a significant number of users who will not pay for AI at all, and likely also a significant number who will not accept higher subscription costs, no matter how much they use AI tools today.
When this happens, the market will go back to "normal". Yes, there will still be higher demand for computer parts than before ChatGPT was released, but demand will drop drastically from current levels, so only a moderate increase in production capacity will be needed.
AI is already easily profitable without further optimization. If at any point investors decide that this is the best the models are going to get because further investment isn't worth it, then we will simply run inference on the existing hardware, forever. What will not happen:
- The models going away. There is no future where people will start doing more coding without AI.
- Everyone running all AI on their existing notebook or phone. We have absolutely no indication that the best models are getting smaller and cheaper to run. In fact, GPUs are getting bigger.
This might hurt OpenAI, depending on how good the best available open models are at that point, but it will in no way diminish the continued increase in demand for hardware.
> When this happens
I think all of this is highly unlikely, and would put an "if" there instead. But we will see!