
clueless · yesterday at 9:46 PM

Could Q.ai be commercializing the AlterEgo tech coming out of MIT Lab? i.e. "detects faint neuromuscular signals in the face and throat when a person internally verbalizes words"

Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...


Replies

mikestorrent · yesterday at 10:37 PM

Yeah...

Pardon the AI crap, but:

> ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound). Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks
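The EMG detection described above boils down to a classic signal-processing step: rectify the raw muscle trace, smooth it into a moving-RMS envelope, and flag samples where the envelope rises above a baseline threshold. Here is a minimal sketch of that idea on synthetic data — the function names, the 5 µV threshold, and the fake "inner speech" burst are all illustrative assumptions, not anything from the AlterEgo work:

```python
import numpy as np

def emg_envelope(signal, fs, win_ms=50):
    """Moving-RMS envelope of a raw EMG trace (window in milliseconds)."""
    win = max(1, int(fs * win_ms / 1000))
    squared = signal.astype(float) ** 2
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

def detect_bursts(signal, fs, thresh_uv=5.0):
    """Flag samples where the envelope exceeds a (hypothetical) µV threshold."""
    return emg_envelope(signal, fs) > thresh_uv

# Synthetic demo: 1 s of baseline noise with a louder burst in the middle,
# standing in for the subvocal activity EMG studies report.
rng = np.random.default_rng(0)
fs = 1000                                    # sample rate, Hz
trace = rng.normal(0, 1.0, fs)               # ~1 µV RMS baseline
trace[400:600] += rng.normal(0, 20.0, 200)   # ~20 µV "inner speech" burst
active = detect_bursts(trace, fs)
print(active[:100].any(), active[450:550].all())
```

Real systems add a 20–450 Hz bandpass stage and per-user calibration before thresholding, and the interesting part (mapping bursts to words) is a separate classification problem entirely.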

So, how long until my AirPods can read my mind?
