NEST: Neural EEG-to-Text Decoding
A transformer-based approach for decoding EEG brain signals into natural language
We present NEST (Neural EEG Sequence Transducer), a deep learning framework for decoding EEG brain signals into natural language text. Using a transformer encoder-decoder architecture, NEST encodes EEG recordings and generates the corresponding text sequences. Our model achieves state-of-the-art performance on the ZuCo dataset, with a Word Error Rate (WER) of 26.1% and a BLEU score of 0.74. We demonstrate that neural activity during reading contains sufficient information for text reconstruction, opening new possibilities for brain-computer interfaces.
Key Results
On the ZuCo benchmark, NEST achieves a Word Error Rate (WER) of 26.1% and a BLEU score of 0.74, the state-of-the-art results reported above.
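For reference, both metrics can be computed with off-the-shelf libraries. The snippet below is a minimal sketch rather than the project's actual evaluation script: it computes WER with jiwer and sentence-level BLEU with NLTK on a toy reference/hypothesis pair.

from jiwer import wer
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "the quick brown fox jumps over the lazy dog"
hypothesis = "the quick brown fox jumped over a lazy dog"

# Word Error Rate: word-level edit distance divided by reference length.
print("WER:", wer(reference, hypothesis))

# Sentence-level BLEU, smoothed so short sentences do not zero out.
bleu = sentence_bleu(
    [reference.split()], hypothesis.split(),
    smoothing_function=SmoothingFunction().method1,
)
print("BLEU:", bleu)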
Methodology
Data Collection
The ZuCo dataset: 12 subjects read natural English sentences while 105-channel EEG is recorded at 500 Hz.
Preprocessing
Band-pass filtering (0.5–100 Hz), artifact removal, epoch extraction, and z-score normalization.
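The sketch below illustrates the filtering and normalization steps on a single epoch using SciPy and NumPy. The cutoff frequencies, sampling rate, and channel count come from this page; the epoch itself is a synthetic stand-in for a real ZuCo segment, and artifact removal is omitted.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 500          # sampling rate (Hz), per the recording setup above
N_CHANNELS = 105  # EEG channels in the ZuCo recordings

def bandpass(epoch, low=0.5, high=100.0, fs=FS, order=4):
    # Zero-phase band-pass filter applied along the time axis.
    # epoch: array of shape (n_channels, n_samples).
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, epoch, axis=-1)

def zscore(epoch):
    # Per-channel z-score normalization.
    mean = epoch.mean(axis=-1, keepdims=True)
    std = epoch.std(axis=-1, keepdims=True)
    return (epoch - mean) / (std + 1e-8)

# Synthetic 2-second epoch standing in for a real ZuCo segment.
epoch = np.random.randn(N_CHANNELS, 2 * FS)
clean = zscore(bandpass(epoch))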
Model Architecture
Transformer encoder-decoder with 6 layers, a hidden dimension of 512, and 8 attention heads.
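A minimal PyTorch sketch of such a model follows, assuming 6 layers each in the encoder and decoder, a linear projection from the 105 EEG channels into the 512-dimensional model space, and an illustrative vocabulary size. This is one interpretation of the stated hyperparameters, not the released NEST code; positional encodings are omitted for brevity.

import torch
import torch.nn as nn

class EEGToTextTransformer(nn.Module):
    # Encoder-decoder transformer mapping EEG frames to token logits.
    # 512 hidden dims and 8 heads follow the page; the channel projection
    # and vocabulary size are assumptions for illustration.
    def __init__(self, n_channels=105, d_model=512, vocab_size=32000):
        super().__init__()
        self.input_proj = nn.Linear(n_channels, d_model)  # EEG frame -> model dim
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            batch_first=True,
        )
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, eeg, tokens):
        # eeg: (batch, time, n_channels); tokens: (batch, seq_len)
        src = self.input_proj(eeg)
        tgt = self.token_emb(tokens)
        causal = self.transformer.generate_square_subsequent_mask(tokens.size(1))
        out = self.transformer(src, tgt, tgt_mask=causal)
        return self.lm_head(out)  # (batch, seq_len, vocab_size)

model = EEGToTextTransformer()
logits = model(torch.randn(2, 1000, 105), torch.randint(0, 32000, (2, 12)))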
Cite This Work
@article{nest2026,
  title   = {NEST: Neural EEG Sequence Transducer for Brain-to-Text Decoding},
  author  = {NEST Project Contributors},
  journal = {arXiv preprint arXiv:2602.xxxxx},
  year    = {2026}
}