Hacker News

lynndotpy · yesterday at 12:51 PM

If you're seriously doing deep learning research, it's very very nice to own your own GPU.

For four years of AI PhD research I worked with a 1050Ti on a personal laptop and a 2060 on a personal desktop. You can do a lot of validation and development on consumer GPUs.

That said, the OP did not train an LLM from scratch on a 3090. That would not be feasible.


Replies

joefourier · yesterday at 1:12 PM

Um? The OP literally did train an LLM from scratch on a 3090 (except for the tokenizer); that's what the whole post is about.

deskamess · yesterday at 4:21 PM

I have an old 2060 with 6GB (I think). I also have a work laptop with a 3060 with 6GB (shared to 8GB). What can I do with those? I dabble a bit here and there, but I would like to run my own local LLM for 'fun'.

Thanks!
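For a rough sense of what fits on a 6 GB card, here is a back-of-envelope sketch. The 20% overhead factor for KV cache and activations is an assumption; real usage varies with context length and inference runtime.

```python
# Back-of-envelope VRAM estimate for running quantized LLMs locally.
# Rule of thumb: weights take parameter_count * bits_per_weight / 8 bytes,
# plus headroom for KV cache and activations (assumed ~20% here).

def vram_needed_gb(params_billion: float, bits_per_weight: int,
                   overhead: float = 0.2) -> float:
    """Approximate GPU memory (GiB) needed to run the model."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 2**30

# What fits on a 6 GB card?
for params in (3, 7, 13):
    for bits in (4, 8):
        gb = vram_needed_gb(params, bits)
        verdict = "fits" if gb <= 6 else "too big"
        print(f"{params}B @ {bits}-bit: ~{gb:.1f} GiB ({verdict})")
```

By this estimate, a 7B model at 4-bit quantization (~4 GiB) fits in 6 GB, while 8-bit 7B and anything 13B or larger does not.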
