Lesser known but possibly more relevant to most HN readers are Feynman's lectures on computation - https://theswissbay.ch/pdf/Gentoomen%20Library/Extra/Richard... . There are some really great explanations in there of computability, information theory, entropy, thermodynamics, and more. Very little of it is now outdated.
Interestingly, he also talks about quantum computing (a first?). p. 191: "We now go on to consider how such a computer can also be built using the laws of quantum mechanics. We are going to write a Hamiltonian, for a system of interacting parts, which will behave in the same way as a large system in serving as a universal computer."
p. 196: "In general, in quantum mechanics, the outgoing state at time t is eⁱᴴᵗ Ψᵢₙ where Ψᵢₙ is the input state, for a system with Hamiltonian H. To try to find, for a given special time t, the Hamiltonian which will produce M = eⁱᴴᵗ when M is such a product of non-commuting matrices, from some simple property of the matrices themselves, appears to be very difficult.
We realize, however, that at any particular time, if we expand eⁱᴴᵗ out (as 1 + iHt − H²t²⁄2 + …) we'll find the operator H operating an arbitrary number of times, once, twice, three times, and so forth, and the total state is generated by a superposition of these possibilities. This suggests that we can solve this problem of the composition of these A's in the following way..."
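To make the "find H such that M = eⁱᴴᵗ" problem concrete, here's a minimal numerical sketch for the simplest possible case, where the target operation M is a single NOT gate (this is my own illustration, not Feynman's construction; it assumes numpy and scipy are available). Since the Pauli X matrix satisfies X² = I, the series expansion above collapses to e^(iθX) = cos θ · I + i sin θ · X, so choosing H = (π/2)X with t = 1 gives eⁱᴴᵗ = iX, a NOT gate up to an unobservable global phase:

    # Hypothetical illustration, not from the book: pick a Hamiltonian H
    # whose time evolution e^{iHt} implements a NOT gate.
    import numpy as np
    from scipy.linalg import expm  # matrix exponential

    X = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli X = quantum NOT

    # Since X @ X = I, e^{i*theta*X} = cos(theta)*I + i*sin(theta)*X.
    # At theta = pi/2 this is i*X: a NOT gate up to a global phase.
    t = 1.0
    H = (np.pi / (2 * t)) * X
    U = expm(1j * H * t)

    print(np.allclose(U, 1j * X))            # True
    ket0 = np.array([1, 0], dtype=complex)   # the state |0>
    print(np.round(U @ ket0, 12))            # [0, 1j]: |0> flipped to |1>

The hard case Feynman is pointing at is when M is a long product of many non-commuting gate matrices rather than a single gate, which is the problem the rest of that chapter goes on to address.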
Apropos of Feynman on computing, here's the story of his time working at Thinking Machines Corp: https://longnow.org/ideas/richard-feynman-and-the-connection...
“For our first seminar he invited John Hopfield, a friend of his from Caltech, to give us a talk on his scheme for building neural networks. In 1983, studying neural networks was about as fashionable as studying ESP, so some people considered John Hopfield a little bit crazy. Richard was certain he would fit right in at Thinking Machines Corporation.”