You can argue that you will have skill atrophy by not using LLMs.
We have gone multi-cloud with disaster recovery on our infrastructure. Something I would not have done yet, had we not had LLMs.
I am learning at an incredible rate with LLMs.
Yes, you certainly can argue that, but you'd be wrong. The primary selling point of LLMs is that they solve the problem of needing skill to get things done.
You're learning at your standard rate; you're just feeding yourself over-confidence about how much you're absorbing versus what the LLM is facilitating you rolling out.
> We have gone multi-cloud with disaster recovery on our infrastructure. Something I would not have done yet, had we not had LLMs.
That’s product atrophy, not skill atrophy.
> I am learning at an incredible rate with LLMs
Could you do it again without the help of an LLM?
If no, then can you really claim to have learned anything?
Also, AI could help you pick those skills up again faster, although you wouldn't ever need to pick those skills up again unless AI ceased to exist.
What an interesting paradox-like situation.
>I am learning at an incredible rate with LLMs.
I don't believe it. Having something else do the work for you is not learning, no matter how much you tell yourself it is.
Using LLMs as a learning tool isn’t what causes skill atrophy. It’s using them to solve entire problems without understanding what they’ve done.
And not even just understanding, but verifying that they’ve implemented the optimal solution.
I kind of feel the same. I'm learning things and doing things in areas I would have just skipped due to lack of time or fear.
But I'm so much more detached from the code; I don't feel that 'deep neural connection' that comes from actually spending days locked in a refactor or debugging a really complex issue.
I don't know how I feel about it.