Hacker News

irishcoffee · yesterday at 10:01 PM

I own two 5070 Ti cards in a rig, and I would gladly donate time on them to a distributed training effort. The kicker is the training data: I would want to gate it to anything published before 2022. I don't know how to coordinate that, but I would really like to be involved in something like this. SETI, for LLMs.
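The date-gating idea is easy to sketch if each sample carries a timestamp. A minimal illustration, assuming a hypothetical corpus format where every record has an ISO-8601 `timestamp` field (real corpora expose crawl or publication dates under varying metadata names):

```python
from datetime import datetime

# Assumption: each sample is a dict with a "timestamp" field (ISO 8601).
# The cutoff excludes anything from 2022 onward, i.e. post-ChatGPT text.
CUTOFF = datetime(2022, 1, 1)

def keep_sample(sample: dict) -> bool:
    """Accept only samples dated strictly before the cutoff."""
    return datetime.fromisoformat(sample["timestamp"]) < CUTOFF

corpus = [
    {"text": "pre-2022 prose", "timestamp": "2021-06-15"},
    {"text": "possibly LLM-generated", "timestamp": "2023-03-01"},
]
gated = [s for s in corpus if keep_sample(s)]  # keeps only the 2021 sample
```

The hard part, as the comment says, is coordination: agreeing on which metadata field to trust and verifying it across thousands of volunteer nodes, not the filter itself.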


Replies

AlexCoventry · yesterday at 10:54 PM

Bandwidth is the killer in distributed LLM training.
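A back-of-envelope calculation shows why. In naive data-parallel training, every optimizer step requires exchanging a full set of gradients. The numbers below are illustrative assumptions (7B parameters, fp16 gradients, a 35 Mbit/s residential uplink versus ~900 GB/s NVLink), not measurements:

```python
# Per-step gradient traffic for naive data-parallel training.
params = 7e9              # assumed 7B-parameter model
bytes_per_grad = 2        # fp16 gradients
grad_bytes = params * bytes_per_grad      # 14 GB of gradients per sync

home_uplink = 35e6 / 8    # assumed 35 Mbit/s home uplink, in bytes/s
nvlink = 900e9            # assumed ~900 GB/s NVLink, in bytes/s

hours_home = grad_bytes / home_uplink / 3600   # ~0.9 hours per step
ms_nvlink = grad_bytes / nvlink * 1e3          # ~16 ms per step
```

Roughly five orders of magnitude apart, which is why volunteer-compute schemes lean on gradient compression, large local batches, or infrequent synchronization rather than raw all-reduce.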
