
hassaanr 10/02/2024

This is a valid concern, but we've always been very serious about consent and privacy. Our models cannot be used without explicit verbal/visual consent, and you hold the keys to your clone.


Replies

jimkleiber 10/02/2024

No snark intended... if you're making it much easier to make clones of people verbally and visually, why would I feel confident in you accepting verbal/visual consent from "me"?

nextaccountic 10/02/2024

> you hold the keys to your clone.

Can I run it on my computer?

If it doesn't run on my computer, what keys are you talking about? Cryptographic keys? It would be interesting to see an AI agent run under fully homomorphic encryption if the overhead weren't so huge; it would stop cloud companies from holding so much intimate, personal data about all sorts of people.
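For a sense of what that would look like, here is a minimal sketch assuming the TenSEAL library (CKKS scheme) and a toy linear model; the features, weights, and score are made up for illustration. The point is that the server only ever touches ciphertexts, while the client keeps the secret key.

```python
# Minimal FHE inference sketch using TenSEAL (CKKS scheme).
# The client encrypts its data and keeps the secret key; the server
# evaluates a toy linear model directly on the ciphertext.
import tenseal as ts

# Client side: set up a CKKS context (approximate arithmetic over reals).
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # needed for the rotations behind dot products

# Client encrypts some private features (values are invented for the example).
features = [0.1, 0.2, 0.3, 0.4]
enc_features = ts.ckks_vector(context, features)

# Server side: apply a toy linear model without ever seeing the plaintext.
weights = [0.5, -1.0, 2.0, 0.25]
enc_score = enc_features.dot(weights)

# Back on the client: only the secret-key holder can decrypt the result.
print(enc_score.decrypt())  # approximately [0.55]
```

The catch is exactly the overhead mentioned above: even this toy dot product is orders of magnitude slower than the plaintext equivalent, which is why nobody runs full agents this way yet.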

carstenhag 10/02/2024

No way I'm going to trust a small company/startup (move fast, break things) with this. Especially in the US.

phito 10/02/2024

I don't trust any of you AI people with that.

d2049 10/02/2024

The phrase "you hold the keys to your clone" should probably give anyone pause.

I once worked at a company where the head of security gave a talk to every incoming technical staff member and the gist was, "You can't trust anyone who says they take privacy seriously. You must be paranoid at all times." When you've been around the block enough times, you realize they were right.

You can guarantee you won't be hacked? You can guarantee that if the company becomes massively successful, you won't start selling data to third parties ten years down the road?

arthurcolle 10/02/2024

Does the end user optionally get, like, a big safetensors file of their own digital twin?

jncfhnb 10/02/2024

And you promise to never get acquired right?

jesterson 10/02/2024

> we’ve always been very serious about consent and privacy.

That's quite a commitment, guys. I'm sold.

/s