MAEEG: Masked Auto-encoder for EEG Representation Learning

This paper was accepted at the Workshop on Learning from Time Series for Health at NeurIPS 2022.
Decoding information from bio-signals such as EEG using machine learning has been challenging due to small datasets and the difficulty of obtaining labels. We propose a reconstruction-based self-supervised learning model, the masked auto-encoder for EEG (MAEEG), which learns EEG representations by reconstructing masked EEG features with a transformer architecture. We found that MAEEG can learn representations that significantly improve sleep stage classification (∼ 5% accuracy…
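The core self-supervised objective described above, masking portions of the input signal and training a model to reconstruct them, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the chunk size, mask ratio, and function names here are assumptions, and the transformer encoder-decoder is omitted, with the loss shown against a stand-in prediction.

```python
import random
import math

def mask_signal(signal, mask_ratio=0.5, chunk=4, seed=0):
    """Zero out random contiguous chunks of a 1-D signal, as in masked
    auto-encoding. Returns the masked copy and a boolean mask marking
    which positions were hidden from the encoder."""
    rng = random.Random(seed)
    n_chunks = len(signal) // chunk
    masked_ids = set(rng.sample(range(n_chunks), int(mask_ratio * n_chunks)))
    masked, mask = list(signal), [False] * len(signal)
    for c in masked_ids:
        for i in range(c * chunk, (c + 1) * chunk):
            masked[i], mask[i] = 0.0, True
    return masked, mask

def reconstruction_loss(pred, target, mask):
    """MSE over masked positions only -- the self-supervised training signal."""
    errs = [(p - t) ** 2 for p, t, m in zip(pred, target, mask) if m]
    return sum(errs) / len(errs)

# Toy "EEG" trace: a sine wave standing in for one channel.
signal = [math.sin(0.1 * t) for t in range(64)]
masked, mask = mask_signal(signal)

# A perfect decoder (returning the original) incurs zero loss;
# a trivial decoder (returning the masked input) does not.
print(reconstruction_loss(signal, signal, mask))   # 0.0
print(reconstruction_loss(masked, signal, mask) > 0.0)   # True
```

In the full method, a transformer encoder sees only the unmasked features and a decoder predicts the hidden ones; the loss above then drives representation learning without any labels.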