
tefkah today at 10:41 AM

I struggle to find non-evil applications of voice cloning. Maybe listening to your dead relative's voice one more time? But those use cases seem so niche compared to the overwhelming uses this will likely have: misinformation, scamming, putting voice actors out of work.


Replies

apwheele today at 12:29 PM

I would clone my own voice and use it for things like scripted tutorials/presentations and audiobooks.

I do not personally prefer that format, but a non-trivial number of people like video/audio presentations over writing.

c0balt today at 12:37 PM

Selling the voice profile of a well-known person, or simply of a pleasant-sounding voice, for procedural/generated voice acting (similar to elevenlabs "voices") could be a legitimate use case. But only if actual consent is acquired first.

Given that rights to one's likeness (personality rights) are somewhat defined, there might be a legitimate use case here. For example, a user might prefer a TTS with the voice of a familiar presenter from TV over a generic voice.

But it sounds exceedingly easy to abuse (as with other generative AI applications), both to exploit end users (social engineering) and to exploit voice "providers" (violation of their personality rights).

schlupfknoten today at 11:14 AM

Voice acting for procedurally generated games?

chistev today at 11:31 AM

Black Mirror episode