Hacker News

The No Fakes Act has a “fingerprinting” trap that kills open source?

182 points · by guerrilla · yesterday at 5:01 AM · 84 comments

Comments

Aachen · yesterday at 6:02 AM

This reply sounds like a much more sensible take: https://old.reddit.com/r/LocalLLaMA/comments/1q7qcux/the_no_...

OP replied and there's another in-depth reply to that below it

alphazard · yesterday at 1:59 PM

Likely unconstitutional, as it violates the First Amendment, which has done a very good job of protecting the right to author and distribute software over the years. Clearly an unintended positive consequence, since no one who worked on or voted on the Bill of Rights had a computer.

If the courts upheld the part in question, it would create a clear path to go after software authors for any crime committed by a user. Cryptocurrencies would become impossible to develop in the US. Holding authors responsible for the actions of their users basically means everyone has to stop distributing software under their real names. There would be a serious chilling effect as most open source projects shut down or went underground.

dleeftink · yesterday at 6:00 AM

Not saying this would be the right way to go about preventing undesirable uses, but shouldn't building 'risky' technologies carry some risk for the ones developing them? Safe harbor clauses have long allowed the risks to be externalised onto the user, fostering non-responsibility on the developers' behalf.

geraldog · yesterday at 6:57 AM

You know, kids, in the 80s, a long time before the First Crypto Wars, we had something called the Porn Wars in the American Congress. I could link many of the depositions to Congress on YouTube, but I will leave you with some good music instead.

(shill)

https://www.youtube.com/watch?v=2HMsveLMdds

Which is of course the European Version, not the evil American Version.

SamInTheShell · yesterday at 7:11 AM

The actual link from the initial reddit post without the GOOG tracking: https://www.congress.gov/bill/119th-congress/senate-bill/136...

Edit: Also does this mean OpenAI can bring back the Sky voice?
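For anyone who wants to de-track links like this themselves, here is a minimal sketch in Python; the strip_tracking helper and the parameter list are illustrative, not an exhaustive catalogue of the tracking parameters Google appends:

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    # Illustrative, not exhaustive: common tracking query keys.
    TRACKING_KEYS = {"utm_source", "utm_medium", "utm_campaign",
                     "utm_term", "utm_content", "gclid"}

    def strip_tracking(url: str) -> str:
        """Return the URL with known tracking query parameters removed."""
        parts = urlparse(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query)
                if k not in TRACKING_KEYS]
        return urlunparse(parts._replace(query=urlencode(kept)))

    print(strip_tracking("https://example.com/bill?id=136&utm_source=reddit&gclid=abc"))
    # -> https://example.com/bill?id=136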

zx8080 · yesterday at 7:09 AM

> I contacted my reps email to flag this as an "innovation killer."

Chinese companies will be happy to drive innovation further if giants like Google and OpenAI go along with this to kill competition in the US.

US capitalism eats itself alive with this.

_def · yesterday at 5:45 AM

"Open Source" in this case means "ML models with open weights"

(not my interpretation, it's what the post states - personally that is not what I think of when I read "Open Source")

siliconc0w · yesterday at 5:54 AM

Yay another bill modeled after the DMCA, what could go wrong?

xmprt · yesterday at 6:02 AM

I think this title is quite misleading, given that it only impacts open source models, which is a very narrow interpretation of open source.

josalhor · yesterday at 7:32 AM

We do have tech that is kept "behind doors". Just look at military applications (nuclear, tank, and jet design, etc.). Should "clonable voice and video" be behind closed doors? Or should AGI be behind closed doors? I think the approach of the suggested legislation may not be the right way to go about it; but at a certain level of implementation capability, I'm not sure how I would handle this situation.

If current tech had appeared all of a sudden in 1999, I am sure that as a society we would all have accepted this; but slow-boiling-frog theory, I guess.

mdhb · yesterday at 5:59 AM

[flagged]

logicchains · yesterday at 6:40 AM

It should be called the anti-AGI bill, because trying to ban AI with certain capabilities is essentially banning embodied AI capable of learning/updating its weights live. The same logic applied to humans would essentially ban all humans, because any human can learn to draw and paint nudes of someone else.

rookderby · yesterday at 12:32 PM

Reddit is one of the domains I block using StevenBlack's hosts list [0].

Here is some background info on the act from Wikipedia: https://en.wikipedia.org/wiki/No_Fakes_Act

[0] https://github.com/StevenBlack/hosts/blob/master/readme.md
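For context on how a hosts-based block works: entries map a hostname to an unroutable address, so lookups on the machine resolve nowhere and the browser never reaches the site. A minimal sketch of the kind of entries such a list contains (the reddit.com lines are illustrative, not copied from the list):

    # hosts-file entries in the 0.0.0.0 style used by such lists:
    # the OS resolves each name to an unroutable address,
    # so connections to these domains fail locally.
    0.0.0.0 reddit.com
    0.0.0.0 www.reddit.com
    0.0.0.0 old.reddit.com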

karlgkk · yesterday at 6:18 AM

> […] voice-conversion RVC model on HuggingFace, and someone else uses it to fake a celebrity, you (the dev) can be liable for statutory damages ($5k-$25k per violation). There is no Section 230 protection here. This effectively makes hosting open weights for audio models a legal suicide mission unless you are OpenAI or Google.

Good.
