Hacker News

ToucanLoucan yesterday at 11:39 PM

Corrections in order of appearance not importance:

* No legitimate justification: their materials are being stolen to train LLMs, which regurgitate them and generate products. They are not being compensated, yet their contributions go on to make AI companies money, and preventing open consumption of their materials, to stop an AI company from rendering them obsolete, is not a justification for retaliating? You would have the barest whiff of a point if OpenAI and company were going to artists, requesting materials for training, and being handed tainted ones; that, at least, I could call duplicitous. But not when the work is publicly posted. That's just an AI company not doing a good job of minding its input.

* Serve only to make access to and transformation of info more difficult: As in, you have to go to the website of the person actually publishing the information, as opposed to having it read back in a Google summary? Also worth noting that this inconvenience applies only to a hypothetical person using an AI search tool; everyone else is unaffected. If you're going to a particular service provider who is uniquely unable to provide the service you want, that seems like an easy problem to solve: use something else.

* can only hope that by these egregiously anti-social luddites: Your daily reminder that the Luddites were not anti-technology; they were against corporations using mechanization to make an ever-dwindling number of workers produce ever more products of ever-lower quality.

* we'll gain the knowledge to render this category of attack moot for the foreseeable future: This is a bad strategy, and historically it has not worked for a single industry. If your industry exists in open opposition to consumer movements, you don't win; at best, you survive. There is no version of this where everyone unwillingly adopts AI and you get to tell them to deal with it. Whole companies are now cropping up to help people who want to opt out of the AI future as promised.


Replies

jimmaswell today at 12:29 AM

I categorically reject the "stealing" claim. Either training your human brain on a book is also stealing, or training an AI on a book is not; there's no meaningful difference in how much either act is "stealing". The same people who rightfully found it laughable that dying newspaper companies wanted royalties from Google for search-result snippets are suddenly champing at the bit for copyright law to be vastly expanded and fair use to be gutted. There's no logical consistency.

As an aside, I honestly think that if someone recoils at the idea of an AI learning from their idea and using it to help someone else, they're just a bad, selfish person.