Hacker News

machinationu · yesterday at 9:43 PM · 0 replies

speculative decoding is 1+1
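
to unpack the "1+1" half: speculative decoding is just draft-then-verify. here's a toy sketch of the accept/reject rule in plain Python; `draft_dist` and `target_dist` are made-up stand-ins for a small draft model and a big target model, not any real API, and the real thing batches the verification into one forward pass:

    import random

    VOCAB = list(range(8))  # toy vocabulary

    def draft_dist(context):
        # hypothetical cheap draft model: mildly skewed distribution
        w = [2.0 if t == (len(context) % 8) else 1.0 for t in VOCAB]
        s = sum(w)
        return [x / s for x in w]

    def target_dist(context):
        # hypothetical expensive target model: a different skew
        w = [3.0 if t == ((len(context) + 1) % 8) else 1.0 for t in VOCAB]
        s = sum(w)
        return [x / s for x in w]

    def sample(dist):
        return random.choices(VOCAB, weights=dist, k=1)[0]

    def speculative_step(context, k=4):
        # 1. draft model proposes k tokens autoregressively (cheap)
        proposed, ctx = [], list(context)
        for _ in range(k):
            t = sample(draft_dist(ctx))
            proposed.append(t)
            ctx.append(t)

        # 2. target model scores those positions and accepts each token
        #    with probability min(1, p_target / p_draft)
        accepted, ctx = [], list(context)
        for t in proposed:
            p, q = target_dist(ctx), draft_dist(ctx)
            if random.random() < min(1.0, p[t] / q[t]):
                accepted.append(t)
                ctx.append(t)
            else:
                # on rejection, resample from the residual max(0, p - q),
                # renormalized; this keeps the output distributed exactly
                # like the target model (bonus token on full accept omitted)
                residual = [max(0.0, pi - qi) for pi, qi in zip(p, q)]
                z = sum(residual)
                fix = sample([r / z for r in residual]) if z > 0 else sample(p)
                accepted.append(fix)
                ctx.append(fix)
                break
        return accepted

    print(speculative_step([0, 1, 2]))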

transformer attention, by comparison, is integrals
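
and the "integrals" half: softmax attention is a normalized weighted average of the value vectors, i.e. the discrete analogue of an expectation/integral like ∫ p(x|q) v(x) dx. a minimal numpy sketch (names and shapes are mine for illustration, not from any particular library):

    import numpy as np

    def attention(Q, K, V):
        # scores: how strongly each query attends to each key
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)            # (n_q, n_k)
        # softmax turns scores into a probability weighting p(key | query)
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)
        # each output row is sum_i p(k_i | q) * v_i,
        # the discrete counterpart of  integral p(x|q) v(x) dx
        return w @ V                             # (n_q, d_v)

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(2, 4))
    K = rng.normal(size=(5, 4))
    V = rng.normal(size=(5, 3))
    print(attention(Q, K, V).shape)  # (2, 3)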