RESEARCH PAPER

NEST: Neural EEG-to-Text Decoding

A Transformer-based approach for decoding EEG brain signals into natural language

Abstract

We present NEST (Neural EEG Sequence Transducer), a deep learning framework for decoding EEG brain signals into natural language text. Using a transformer encoder-decoder architecture, NEST processes raw EEG recordings and generates corresponding text sequences. Our model achieves state-of-the-art performance on the ZuCo dataset with a Word Error Rate (WER) of 26.1% and a BLEU score of 0.74. We demonstrate that neural activity during reading contains sufficient information for text reconstruction, opening new possibilities for brain-computer interfaces.

Key Results

Word Error Rate: 26.1% (lower is better)
BLEU score: 0.74 (higher is better)
Character-level accuracy: 73.9%
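WER, the headline metric above, is the word-level edit distance between hypothesis and reference, normalized by the reference length. A minimal sketch (the example sentence pair is illustrative, not from ZuCo):

```python
# Word Error Rate: Levenshtein distance over words / reference length.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

# One deleted word out of six reference words -> WER of about 16.7%
print(word_error_rate("the cat sat on the mat", "the cat sat on mat"))
```

A WER of 26.1% therefore means roughly one word in four must be inserted, deleted, or substituted to recover the reference text.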

Methodology

📊
Data Collection

ZuCo dataset with 12 subjects reading natural English sentences while 105-channel EEG is recorded at 500 Hz.

🔧
Preprocessing

Band-pass filtering (0.5–100 Hz), artifact removal, epoch extraction, and z-score normalization.
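The filtering and normalization steps can be sketched as follows. The channel count (105), sampling rate (500 Hz), and band edges (0.5–100 Hz) come from the page above; the filter order, epoch length, and synthetic input are illustrative assumptions.

```python
# Sketch of band-pass filtering + z-score normalization for one EEG epoch.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 500          # sampling rate in Hz (from the paper)
N_CHANNELS = 105  # EEG channels (from the paper)

def preprocess(eeg: np.ndarray) -> np.ndarray:
    """eeg: (channels, samples) raw recording -> filtered, z-scored epoch."""
    # 4th-order Butterworth band-pass, 0.5-100 Hz, applied zero-phase
    b, a = butter(4, [0.5, 100.0], btype="band", fs=FS)
    filtered = filtfilt(b, a, eeg, axis=-1)
    # z-score each channel independently over time
    mean = filtered.mean(axis=-1, keepdims=True)
    std = filtered.std(axis=-1, keepdims=True)
    return (filtered - mean) / std

rng = np.random.default_rng(0)
raw = rng.standard_normal((N_CHANNELS, 2 * FS))  # synthetic 2-second epoch
epoch = preprocess(raw)
print(epoch.shape)  # one (channels, samples) epoch, per-channel mean 0, std 1
```

Artifact removal (e.g. rejecting ocular components) is omitted here, as the page does not specify which method the authors use.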

🧠
Model Architecture

Transformer encoder-decoder with 6 layers, 512 hidden dimensions, and 8 attention heads.
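The core operation inside each of those 6 layers is multi-head scaled dot-product attention with the stated sizes (512 model dimensions, 8 heads of 64 dimensions each). A NumPy sketch of one self-attention sublayer, with random matrices standing in for trained weights:

```python
# Multi-head self-attention with d_model=512 and 8 heads (64 dims/head),
# matching the architecture described above. Weights here are random
# placeholders, not trained parameters.
import numpy as np

D_MODEL, N_HEADS = 512, 8
D_HEAD = D_MODEL // N_HEADS  # 64

def multi_head_self_attention(x, wq, wk, wv, wo):
    """x: (seq_len, d_model) -> (seq_len, d_model)."""
    seq_len = x.shape[0]

    def project(w):
        # project, then split into heads: (n_heads, seq_len, d_head)
        return (x @ w).reshape(seq_len, N_HEADS, D_HEAD).transpose(1, 0, 2)

    q, k, v = project(wq), project(wk), project(wv)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(D_HEAD)  # (heads, seq, seq)
    # softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # attend, merge heads back to d_model, apply output projection
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, D_MODEL)
    return out @ wo

rng = np.random.default_rng(0)
wq, wk, wv, wo = (rng.standard_normal((D_MODEL, D_MODEL)) * 0.02 for _ in range(4))
x = rng.standard_normal((10, D_MODEL))  # 10 embedded EEG time steps
y = multi_head_self_attention(x, wq, wk, wv, wo)
print(y.shape)  # (10, 512): same shape in and out, as in each transformer layer
```

The full model stacks six such layers (plus feed-forward sublayers and residual connections) in both the EEG encoder and the text decoder, with the decoder additionally cross-attending to the encoder output.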

Cite This Work

@article{nest2026,
  title={NEST: Neural EEG Sequence Transducer for Brain-to-Text Decoding},
  author={NEST Project Contributors},
  journal={arXiv preprint arXiv:2602.xxxxx},
  year={2026}
}
