> To the extent that learning to punch your own punch cards was useful, it was because you needed to understand the kinds of failures that would occur if the punch cards weren't punched properly. However, this was never really a big part of programming, and often it was off-loaded to people other than the programmers.
I thought all computer scientists heard about Dijkstra making this claim at some point in their careers. I guess I was wrong? Here is the context:
> A famous computer scientist, Edsger Dijkstra, did complain about interactive terminals, essentially favoring the disciplined approach required by punch cards and batch processing.
> While many programmers embraced the interactivity and immediate feedback of terminals, Dijkstra argued that the "trial and error" approach fostered by interactive systems led to sloppy thinking and poor program design. He believed that the batch processing environment, which necessitated careful, error-free coding before submission, instilled the discipline necessary for writing robust, well-thought-out code.
> "On the Cruelty of Really Teaching Computing Science" (EWD 1036) (1988 lecture/essay)
Seriously, the laments I hear now have been the same throughout my entire career as a computer scientist. Let's just look forward to 2035, when someone on HN will complain that some old way of doing things is better than the new way because it's harder, and wearing hair shirts is good for building character.
Dijkstra did not make that claim in EWD1036. The general philosophy you're alluding to is described in EWD249, which – as it happens – does mention punchcards:
> The naive approach to this situation is that we must be able to modify an existing program […] The task is then viewed as one of text manipulation; as an aside we may recall that the need to do so has been used as an argument in favour of punched cards as against paper tape as an input medium for program texts. The actual modification of a program text, however, is a clerical matter, which can be dealt with in many different ways; my point is […]
He then goes on to describe what today we'd call "forking" or "conditional compilation" (in those days, there was little difference). "Using AI to get answers", indeed. At least you had the decency to use blockquote syntax, but it's tremendously impolite to copy-paste AI slop at people. If you're going to ingest it, do so in private, not in front of a public discussion forum.
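(Aside, for anyone who hasn't met the latter term: here is a minimal present-day sketch of conditional compilation, using the C preprocessor. It's only an illustration – the mechanism Dijkstra describes in EWD249 works on the program text itself rather than through a preprocessor, and the flag name below is invented.)

```c
#include <stdio.h>

/* One program text, two variants chosen at compile time.
   Build the alternative variant with:  cc -DUSE_NEW_VARIANT prog.c */
#ifdef USE_NEW_VARIANT
static const char *variant = "new variant";
#else
static const char *variant = "old variant";
#endif

int main(void) {
    printf("compiled as: %s\n", variant);
    return 0;
}
```

The point, in either era, is that a single program text yields several programs, so "modifying an existing program" need not be the clerical text-manipulation task Dijkstra is dismissing.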
The position you've attributed to Dijkstra is defensible – but it's not the same thing at all as punching the cards yourself. The modern-day equivalent would be running the full test suite only in CI, after you've opened a pull request: you're motivated to program in a fashion that ensures you won't break the tests, as opposed to just iterating until the tests are green (and woe betide you if there's a gap in the coverage), because it will be clear to your colleagues if you've made changes willy-nilly and broken some unrelated part of the program, and that's a little bit embarrassing.
I would recommend reading EWD1035 and EWD1036: actually reading them, not just getting the AI to summarise them. While you'll certainly disagree with parts, the fundamental point that E. W. Dijkstra was making in those essays is correct. You may also find EWD514 relevant – but if I linked every one of Dijkstra's essays that I find useful, we'd be here all day.
I'll leave you with a passage from EWD480, which broadly refutes your mischaracterisation of Dijkstra's opinion (and serves as a criticism of your general approach):
> This disastrous blending deserves a special warning, and it does not suffice to point out that there exists a point of view of programming in which punched cards are as irrelevant as the question whether you do your mathematics with a pencil or with a ballpoint. It deserves a special warning because, besides being disastrous, it is so respectable! […] And when someone has the temerity of pointing out to you that most of the knowledge you broadcast is at best of moderate relevance and rather volatile, and probably even confusing, you can shrug out your shoulders and say "It is the best there is, isn't it?" As if there were an excuse for acting like teaching a discipline, that, upon closer scrutiny, is discovered not to be there.... Yet I am afraid, that this form of teaching computing science is very common. How else can we explain the often voiced opinion that the half-life of a computing scientist is about five years? What else is this than saying that he has been taught trash and tripe?
The full text of much of the EWD series can be found at https://www.cs.utexas.edu/~EWD/.