Hacker News

bilbo0s · today at 12:23 AM · 1 reply

>that’s why I think open-source models will still come out ahead in the long term

In what data centers would these open models be run such that copyright laws will not apply?

Serious question. Trying to figure all this out.

Is it that you think people will run the models on their own laptops or phones? Or will there be some offshore municipality where the models can be served from that is out of the reach of copyright laws? Do you have another idea in mind entirely? How are you thinking on all this?


Replies

archerx · today at 7:24 AM

You should look at the LocalLLaMA and Stable Diffusion subreddits to see what's possible to do locally. If you have 24 GB of VRAM you can do A LOT!
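As a rough back-of-the-envelope sketch of why 24 GB of VRAM goes a long way: a quantized model's weight footprint is roughly parameter count times bits per weight. The function and overhead figure below are illustrative assumptions, not measurements; real usage also depends on the runtime, context length, and KV-cache size.

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: weight storage plus a flat overhead for
    KV cache and activations (the 2 GB overhead is a hypothetical
    placeholder; actual overhead varies with runtime and context)."""
    weight_gb = params_billion * bits_per_weight / 8  # GB, since 1e9 params * bits/8 bytes
    return weight_gb + overhead_gb

# A 13B model quantized to 4 bits: ~6.5 GB of weights + overhead,
# comfortably within a single 24 GB card.
print(estimate_vram_gb(13, 4))   # 8.5

# A 70B model at 4 bits needs ~35 GB of weights -- too big for one 24 GB GPU.
print(estimate_vram_gb(70, 4))   # 37.0
```

By this estimate, mid-sized models (7B–30B) fit on a single consumer GPU once quantized, which is what makes the local setups discussed on those subreddits practical.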