I understand the emotion, but this looks 50 percent FAKE (speaking as someone doing my own research on BCI). There is no way right now that a non-invasive device can do this much in real time.
Their YouTube channel and editing style also point that way.
But whatever happens, I am optimistic about this tech, and I think it is going to change people's lives one day for sure. We just have to wait a few more years.
Now is a really good time to contribute to https://openeeg.sourceforge.net/doc/ as far as EEG is concerned. There are a myriad of things that can be observed with EEG, and it would honestly be a decent project to see grow over time.
"I cannot dance anymore, so I am useless. I am dead."
Those beliefs are the problem, not her ALS.
Creating attachments leads to this suffering. Do not be surprised if you suffer when they are taken away. Acceptance is the answer, not these fake and silly tech toys. Because guess what? She is still not dancing.
The featured video does not explain which signals produce which outcomes; they basically just say "we use machine learning" while outputting a dance. At 07:10 it looks like the person chooses between two binary options, "sad" and "relieved". Unfortunately, I doubt the person has anywhere near as much real-time input to the performance as is implied. Dentsu is also an advertising agency in Japan, so this seems more like marketing than a technical achievement.
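For what it's worth, a binary mood choice like that is about the most plausible thing a consumer EEG rig can do. A minimal sketch of such a pipeline, assuming nothing about Dentsu's actual method: extract alpha/beta band-power features from short EEG epochs (synthetic here) and train a tiny logistic regression to pick between two presets.

```python
import numpy as np

# Hypothetical sketch only: band-power features + a linear classifier
# choosing between two mood presets. Not Dentsu's actual pipeline.
rng = np.random.default_rng(0)
fs = 256  # assumed sample rate in Hz, 1-second epochs

def band_power(epoch, lo, hi):
    """Mean FFT power of a 1-D epoch in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(epoch), 1 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def features(epoch):
    # Classic alpha (8-13 Hz) and beta (13-30 Hz) power features.
    return np.array([band_power(epoch, 8, 13), band_power(epoch, 13, 30)])

def make_epoch(alpha_amp, beta_amp):
    # Synthetic EEG: a 10 Hz and a 20 Hz component plus noise.
    t = np.arange(fs) / fs
    return (alpha_amp * np.sin(2 * np.pi * 10 * t)
            + beta_amp * np.sin(2 * np.pi * 20 * t)
            + rng.normal(0, 0.5, fs))

# Label 0 = "relieved" (alpha-dominant), 1 = "sad" (beta-dominant).
X = np.array([features(make_epoch(2, 1)) for _ in range(20)]
             + [features(make_epoch(1, 2)) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)

# Tiny logistic regression trained by gradient descent.
X_n = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X_n @ w + b)))
    w -= 0.5 * (X_n.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

preds = (1 / (1 + np.exp(-(X_n @ w + b))) > 0.5).astype(int)
print("training accuracy:", (preds == y).mean())
```

Even this toy version needs clean, well-separated signals to work; doing it reliably on a live, noisy, non-invasive headset is the hard part, which is exactly why the video's vagueness is suspicious.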
Human dances are choreographed beforehand, but a live performer can always interrupt the choreography or break into unchoreographed movement at any moment. I have a hard time believing that this person's brainwaves are mapping and driving the hologram through a specific 3D space, rather than just telling it which mood preset to use at a given time.
Excluding the marketing of the ALS story, I guess I'm wondering how it's different from a Michael Jackson hologram performance where someone could adjust the sliders for mathematical functions live?