Heads up: forbes.com/sites/xyz pages are written by people and groups who pay for the domain but aren't edited or promoted by Forbes itself. They're almost always conservative interest groups posing as journalists.
What exactly is an “AI working competency”? How to have a conversation with a chatbot? How to ask a chatbot a question that you confirm with a Google search and then confirm that with a trusted online reference and confirm that with a book you check out of the library three weeks later?
Perhaps the world is going the direction of relying on an AI to do half the things we use our own brains for today. But to me that sounds like a sad and worse future.
I’m just rambling here. But at the moment I fail to see how current LLMs help people truly learn things.
How to speedrun devaluing the credentials your institution exists to award.
Realistically, universities will have to ban the use of computers for exams and homework.
For the same reason that elementary schools don't allow calculators in math exams.
You first need to understand how to do the thing yourself.
The announcement is here:
https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...
Where the actual news is:
> To this end, the trustees have delegated authority to the provost, working with deans of all academic colleges, to develop and to review and update continuously, discipline-specific criteria and proficiency standards for a new campuswide “artificial intelligence working competency” graduation requirement for all Purdue main campus students, starting with new beginners in fall 2026.
So the Purdue trustees have "delegated authority" to people at the University to make a new graduation requirement for 2026.
Who knows what will be in the final.
From my long-ago uni courses, current-day AI could have helped with the non-major courses: English and History, doing the first draft or even the final draft of papers, etc. As a science major, I'm not sure what the point of relying on an AI would be, since it would leave you empty when considering further education or the tests it requires. And as far as a foreign language goes, one needs to at least be able to read the material without relying on Google Translate (assuming schools still have such a requirement).
But I like to think that actually learning the history was important and it certainly was a diversion from math/chemistry/physics. I liked Shakespeare, so reading the plays was also worthwhile and discussing them in class was fun. Yeah, I was bored to tears in medieval history, so AI could have helped there.
This is going to be like when all the schools were pushing big data because that was going to be the next big thing.
After more than a trillion dollars spent, LLMs can replace: (a) a new secretary with one week of experience (b) a junior programmer who just learned that they can install programs on a desktop computer, and (c) James Patterson.
That's the bright future that Purdue is preparing its students for.
Yes, AIs will be a huge thing...eventually...but LLMs are not AI, and they never will be.
I was on the academic board of engineering mechanics for Purdue almost a decade ago.
Purdue, not necessarily uniquely but specifically because of its charter, does a really good job of focusing its engineering programs on workforce development. It is highly focused on staffing and training and less so on the science and research side, though that exists as well.
This tracks with what I would expect and is in line with what I think best practice should be.
If they were to set down what the curriculum needed to meet such a requirement would be today, by the time the students who matriculate in August graduate, it will be so out of date to be effectively worthless.
This is not remotely the kind of thing that a school should be making a requirement at this time. The technology is changing way too fast to even be sure that basic fundamental skills related to it will remain relevant for as many as 4-5 years.
Seems mostly like a knee-jerk reaction more than anything. I'm sure this will be used to justify hiring even more administrators.
https://www.purdue.edu/newsroom/2025/Q4/purdue-unveils-compr...
"all as informed by evolving workforce and employer needs"
“At the same time, it’s absolutely imperative that a requirement like this is well informed by continual input from industry partners and employers more broadly."
Purdue is engaging in the oldest profession in the world. And the students pay for this BS.
Upfront computer literacy requirements may never have been convincing enough on their own; AI could be the ubiquitous, timely lever that opens the way to general machine thinking.
I don’t really get the dismissive comments here. Universities have had gen ed requirements for years, one of which is usually something to do with computers. AI seems to be a technology that will be increasingly relevant…so a basic gen ed requirement seems logical.
Full disclosure: I'm a Purdue graduate, though I disagree with certain things the school has done (Purdue Global).
Part of this is very reasonable; AI is upending how students learn (or cheat), so adding a requirement that teaches students how to use it in a way that improves learning rather than just enabling cheating makes sense. The problem with the broad, top-down approach is that it looks like what happens in Corporate America when there's a CEO edict that "we need a ____ strategy," and every department pivots its projects to include that, whether or not it makes sense.