Well, isn't the point that humans wouldn't need to do basically anything?
It would be 'desirable' because the value is in the product of the labour, not the labour itself. (Of course, the resulting dystopian hellscape might be considered undesirable.)
As I keep pointing out, if the model ever stops needing you to complete ambitious goals, then what does the model actually need you for?
People somehow imagine an agent that can crush the competition with minimal human oversight, and then they somehow think that they'll be the ones in charge, and not Sam Altman, a government, or possibly the model itself.
If the model's that good, nobody's going to sell it to you.