This spiel is hilarious in the context of the product this company (https://juno-labs.com/) is pushing – an always on, always listening AI device that inserts itself into your and your family’s private lives.
“Oh but they only run on local hardware…”
Okay, but that doesn't mean every aspect of our lives needs to be recorded and analyzed by an AI.
Are you okay with private and intimate conversations and moments (including of underage family members) being saved for replaying later?
Have all your guests consented to this?
What happens when someone breaks in and steals the box?
What if the government wants to take a look at the data in there and serves a warrant?
What if a large company comes knocking and makes an acquisition offer? Will all the privacy guarantees still stand in the face of the $$$?
> Are you okay with private and intimate conversations and moments (including of underage family members) being saved for replaying later?
Is this somehow fundamentally different from having memories?
Because I thought about it, and decided that personally I am - with one important condition, though. I am because my memories are not as great as I would like them to be, and they decline with stress and age. If a machine can supplement that in the same way my glasses supplement my vision, or my friend's hearing aid supplements his hearing - that'd be nice. That's why we have technology in the first place, to improve our lives, right?
But, as I said, there is an important condition. Today, what's in my head stays in there, and is only directly available to me. The machine-assisted memory aid must provide the same guarantees. If any information leaves the device without my direct instruction - that's a hard "no". If someone with physical access to the device can extract the information without a lot of effort - that's also a hard "no". If someone can too easily impersonate me to the device and improperly gain access - that's another "no". Maybe there are a few more criteria, but I hope you get the overall idea.
If a product passes those criteria, then it - by design - cannot violate others' privacy any more than I can myself. And then - yeah - I'd want it; I wish there were something like that.
It’s definitely a strange pitch, because the target audience (the privacy-conscious crowd) is exactly the type who will immediately spot all the issues you just mentioned. It's difficult to think of any privacy-conscious individual who wouldn't want, at bare minimum, a wake word (and more likely just wouldn't use anything like this period).
The non privacy-conscious will just use Google/etc.
> Are you okay with private and intimate conversations and moments (including of underage family members) being saved for replaying later?
Typically that's not how these things work. Speech is processed using ASR (automatic speech recognition) and then run through a prompt that checks for appropriate tool calls.
I've been meaning to basically make this myself but I've been too lazy lately to bother.
I actually want a lot more functionality from a local only AI machine, I believe the paradigm is absurdly powerful.
Imagine an AI reminding you that you've been on HN too long, offering to save the comment you're working on for later, and then moving the browser to a different tab.
Having idle thoughts in the car about things you need to do, and being able to just say them out loud and know important topics won't be forgotten.
I understand that for people who aren't neurodiverse, forgetting to do something incredibly critical to one's health and well-being isn't something that happens (often). But for plenty of other people, a device that just helps them remember important things can be dramatically life-changing.
> Are you okay with private and intimate conversations and moments (including of underage family members) being saved for replaying later?
Maybe I missed it, but I didn't see anything there that said it saved conversations. It sounds like it processes them as they happen and then takes actions that it thinks will help you achieve whatever goals of yours it can infer from the conversation.
I agree. I also don't really have an ambient assistant problem. My phone is always nearby and Siri picks up wake words well (or I just hold the power button).
My problem is Siri doesn't do any of this stuff well. I'd really love to just get it out of the way so someone can build it better.
They seem quite honest about who they are and how they do what they do.
> Are you okay with private and intimate conversations and moments (including of underage family members) being saved for replaying later?
One of our core architecture decisions was to use a streaming speech-to-text model. At any given time, about 80ms of actual audio is in memory, along with about 5 minutes of transcribed audio (text) - the latter helps the STT model know the context of the audio for higher transcription accuracy.
Of these 5-minute transcripts, those that don't become memories are forgotten. So only selected extracted memories are durably stored. Currently we store the transcript with the memory (this was a request from our prototype users, to help them build confidence in the transcription accuracy), but we'll continue to iterate based on feedback on whether that's the correct decision.
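The retention model described above can be sketched roughly as follows (the class, thresholds, and the `_is_memory` heuristic are my own illustrative assumptions, not Juno's actual code): a rolling window of transcript segments is kept only for context and evicted after ~5 minutes, while segments the extractor flags as memories are copied into durable storage, transcript included.

```python
from collections import deque

WINDOW_SECONDS = 5 * 60  # ~5 minutes of transient transcript context

class RollingTranscript:
    def __init__(self):
        self.window = deque()   # (timestamp, text) pairs; transient
        self.memories = []      # only extracted memories persist

    def add_segment(self, timestamp: float, text: str) -> None:
        self.window.append((timestamp, text))
        # Evict anything older than the context window: it is forgotten.
        while self.window and timestamp - self.window[0][0] > WINDOW_SECONDS:
            self.window.popleft()
        if self._is_memory(text):
            # Store the transcript alongside the memory, as described above.
            self.memories.append({"time": timestamp, "transcript": text})

    @staticmethod
    def _is_memory(text: str) -> bool:
        """Stand-in for the model that decides what's worth keeping."""
        return "remember" in text.lower()

rt = RollingTranscript()
rt.add_segment(0.0, "idle chatter")
rt.add_segment(10.0, "Remember to renew the passport")
rt.add_segment(400.0, "more chatter")  # everything older than 5 min is evicted
print(len(rt.window), len(rt.memories))  # -> 1 1
```

The design point is that the window and the memory store have different lifetimes: the window is bounded and self-expiring, so the only long-lived data is what the extractor explicitly promotes.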
The fundamental problem with a lot of this is that the legal system is absolute: if information exists, it is accessible. If the courts order it, nothing you can do can prevent the information being handed over, even if that means a raid of your physical premises. Unless you encrypt it in a manner resistant to any way you can be compelled to decrypt it, the only way to have privacy is for information not to exist in the first place. It's a bit sad as the potential for what technology can do to assist us grows that this actually may be the limit on how much we can fully take advantage of it.
I do sometimes wish it would be seen as an enlightened policy to legislate that personal private information held in technical devices is legally treated the same as information held in your brain. Especially for people for whom assistive technology is essential (deaf, blind, etc). But everything we see says the wind is blowing the opposite way.