What's the basis for the conversion from hours of neural data to number of tokens? Is that counting the paired text tokens?
Edit: oops, sorry, misread the question - the neural data is tokenised by our embedding model. The number of tokens per second of neural data varies and depends on the information content.
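
For intuition only (this is not how their embedding model actually works), here's a minimal sketch of what a variable-rate tokenisation could look like: each one-second window of signal gets a token count driven by a crude information estimate. Every name here (`estimate_bits`, `tokens_per_window`, `bits_per_token`, the histogram-entropy proxy) is an assumption for illustration.

```python
# Hypothetical sketch (not the actual embedding model): tokenise a neural
# signal at a variable rate, emitting more tokens for windows that carry
# more information (here proxied by a simple histogram-entropy estimate).
import numpy as np


def estimate_bits(window: np.ndarray, n_bins: int = 32) -> float:
    """Rough information estimate for one window: Shannon entropy of an
    amplitude histogram, in bits. A stand-in for whatever the real model uses."""
    hist, _ = np.histogram(window, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


def tokens_per_window(window: np.ndarray, bits_per_token: float = 2.0) -> int:
    """Map estimated information content to a token count; always emit at
    least one token so every second of data is represented."""
    return max(1, int(np.ceil(estimate_bits(window) / bits_per_token)))


def tokenise(signal: np.ndarray, sample_rate: int, window_s: float = 1.0):
    """Split a 1-D signal into fixed-length windows and report how many
    tokens each window would produce under this toy scheme."""
    step = int(sample_rate * window_s)
    return [
        tokens_per_window(signal[start:start + step])
        for start in range(0, len(signal) - step + 1, step)
    ]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rate = 1000  # samples per second (assumed)
    quiet = np.zeros(rate)             # low information content
    busy = rng.standard_normal(rate)   # high information content
    signal = np.concatenate([quiet, busy, quiet, busy])
    # Quiet windows collapse to a single token; busy windows produce more,
    # so tokens per second varies with the content of the signal.
    print(tokenise(signal, rate))
```

Under a scheme like this, "hours of data" maps to a token count only on average, which is consistent with the answer above: the conversion isn't fixed, it depends on how much information the recording carries.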