> “Over the years I’ve found that when students read on paper they're more likely to read carefully, and less likely in a pinch to read on their phones or rely on chatbot summaries,” Shirkhani wrote to the News. “This improves the quality of class time by orders of magnitude.”
This is the key part. I'm doing a part-time graduate degree at a major university right now, and it's fascinating to watch the week-to-week pressure AI is putting on the education establishment. When your job as a student is to read case studies and think about them, but Google Drive says "here's an automatic summary of the key points" before you even open the file, it takes a very determined student to ignore that and actually read the material. And if no one reads the original material, the class discussion is a complete waste of time, with everyone bringing up the same trite points, and the whole exercise becomes a facade.
Schools are struggling to figure out how to let students use AI tools to be more productive while still learning how to think. Students (especially undergrads) are incredibly good at doing as little work as possible. And until you get to the end-of-PhD level, there's basically nothing in your learning journey that ChatGPT can't summarize and analyze in a second, removing the requirement for you to do anything at all.
This isn't even about AI being "good" or "bad". We still teach children how to add numbers before we give them calculators because it's a useful skill. But now these AI thinking-calculators are injecting themselves into every text box and screen, making them impossible to avoid. If the answer pops up in the sidebar before you even ask the question, what kind of masochist is going to bother learning how to read and think?
Last weekend I was arguing with a friend that physical guitar pedals are better for creativity and exploration of the musical space than modelers. Even though modelers offer way more resources for a fraction of the cost, the physical aspect of knobs and cables and everything else leads to something far more interactive and prone to "happy mistakes" than any digital interface can offer.
In my first year of college, my calculus teacher said something that stuck with me: "you learn calculus getting cramps on your wrists." Yes, AI can help you remember things and accelerate learning, but if you don't put in the work to understand things, you'll always be behind people who have at least a bird's-eye view of what's happening.
> And if no one reads the original material, the class discussion is a complete waste of time, with everyone bringing up the same trite points, and the whole exercise becomes a facade.
If reading an AI summary of readings is all it takes to make an exercise a facade, then the exercise was bad to begin with.
AI is certainly putting pressure on professors to develop better curricula and evaluations, and they don’t get enough support for this, imho.
That said, good instruction and evaluation techniques are not some dark art — they can be developed, implemented, and maintained with a modest amount of effort.