Edited from Nature Medicine, Jan 6 2026
A poor night's sleep portends a bleary-eyed next day, but it could also hint at diseases that will strike years down the road. A new artificial intelligence (AI) model developed by Stanford Medicine researchers and their colleagues can use physiological recordings from one night's sleep to predict a person's risk of developing more than 100 health conditions.
Known as SleepFM, the model was trained on nearly 600,000 hours of sleep data collected from 65,000 participants with up to 25 years of follow-up. The sleep data comes from polysomnography, a comprehensive sleep assessment that uses various sensors to record brain activity, heart activity, respiratory signals, leg movements, eye movements and more.
Polysomnography is the gold standard in sleep studies that monitor patients overnight in a lab. It is also, the researchers realized, an untapped gold mine of physiological data.
Only a fraction of that data is used in current sleep research and sleep medicine. With advances in artificial intelligence, it's now possible to make sense of much more of it. The new study is the first to use AI to analyze such large-scale sleep data.
Learning the language of sleep
SleepFM is essentially learning the language of sleep.
The model was able to incorporate multiple streams of data - electroencephalography, electrocardiography, electromyography, pulse reading and breathing airflow, for example - and glean how they relate to each other.
To achieve this, the researchers developed a new training technique, called leave-one-out contrastive learning, that essentially hides one modality of data and challenges the model to reconstruct the missing piece based on the other signals.
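The paper's actual encoder architectures and loss are not described in this article, but the leave-one-out idea can be sketched with a generic InfoNCE-style contrastive objective: hold out one modality's embedding for each record and ask the model to match it against a context built from the remaining modalities. Everything below (modality names, embedding sizes, the temperature value) is illustrative, not SleepFM's implementation.

```python
import numpy as np

# Toy embeddings: one vector per sleep record for each modality.
# In the real model these would come from learned per-modality encoders.
rng = np.random.default_rng(0)
batch, dim = 4, 8
modalities = {m: rng.normal(size=(batch, dim)) for m in ["eeg", "ecg", "resp"]}

def normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def leave_one_out_loss(held_out, others, temperature=0.1):
    # Target: the held-out modality's embedding for each record.
    target = normalize(held_out)
    # Context: the average of the remaining modalities' embeddings.
    context = normalize(np.mean(others, axis=0))
    # Pairwise similarities between every context and every target.
    logits = context @ target.T / temperature  # shape (batch, batch)
    # InfoNCE: each record's context should be most similar to its
    # own held-out embedding (the diagonal), not to other records'.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Hold out each modality in turn and sum the losses.
total = 0.0
for name, emb in modalities.items():
    rest = np.stack([e for m, e in modalities.items() if m != name])
    total += leave_one_out_loss(emb, rest)
print(f"total leave-one-out loss: {total:.3f}")
```

Minimizing this loss pushes each modality's embedding toward agreement with the others for the same record, which is one plausible reading of "reconstruct the missing piece based on the other signals."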
Forecasting disease
After the training phase, the researchers could fine-tune the model to different tasks.
First, they tested the model on standard sleep analysis tasks, such as classifying different stages of sleep and diagnosing the severity of sleep apnea. SleepFM performed as well as or better than state-of-the-art models used today.
SleepFM analyzed more than 1,000 disease categories in the health records and found 130 that could be predicted with reasonable accuracy by a patient's sleep data. The model's predictions were particularly strong for cancers, pregnancy complications, circulatory conditions and mental disorders, achieving a C-index higher than 0.8.
The C-index, or concordance index, is a common measure of a model's predictive performance.
"For all possible pairs of individuals, the model gives a ranking of who's more likely to experience an event - a heart attack, for instance - earlier. A C-index of 0.8 means that 80% of the time, the model's prediction is concordant with what actually happened," Zou said.
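Zou's description can be made concrete with a small script. The sketch below computes a basic C-index for right-censored data: for every comparable pair (one individual's event is observed before the other's follow-up time), check whether the model assigned the higher risk score to the individual whose event came first. The data and scoring convention here are illustrative, not from the study.

```python
def c_index(times, events, risks):
    """Fraction of comparable pairs the model ranks correctly.

    times  -- event or censoring time per individual
    events -- 1 if the event was observed, 0 if censored
    risks  -- model risk score (higher = earlier predicted event)
    """
    concordant = 0.0
    comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # Comparable: i's event is observed and occurs before j's time.
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1      # higher risk, earlier event
                elif risks[i] == risks[j]:
                    concordant += 0.5    # ties count half
    return concordant / comparable

# Toy example: 4 patients; patient 3 is censored at t=8.
times  = [2, 5, 8, 10]
events = [1, 1, 0, 1]
risks  = [0.9, 0.7, 0.4, 0.2]  # scores ordered with event times
print(round(c_index(times, events, risks), 2))  # 1.0: all 5 pairs concordant
```

A C-index of 0.5 corresponds to random ranking, so the 0.8-plus values reported for SleepFM mean the model orders most pairs correctly.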
SleepFM excelled at predicting Parkinson's disease (C-index 0.89), dementia (0.85), hypertensive heart disease (0.84), heart attack (0.81), prostate cancer (0.89), breast cancer (0.87) and death (0.84).
"We were pleasantly surprised that for a pretty diverse set of conditions, the model is able to make informative predictions," Zou said.
Even less accurate models, with C-indices around 0.7, such as those that predict a patient's response to different cancer treatments, have proven useful in clinical settings, he added.
Interpreting the model
The team is working on ways to further improve SleepFM's predictions, perhaps by adding data from wearables, and to understand exactly what the model is interpreting.
"It doesn't explain that to us in English," Zou said. "But we have developed different interpretation techniques to figure out what the model is looking at when it's making a specific disease prediction."
The researchers note that even though heart signals factor more prominently in heart disease predictions and brain signals factor more prominently in mental health predictions, it was the combination of all the data modalities that achieved the most accurate predictions.
"The most information we got for predicting disease was by contrasting the different channels," Mignot said. Bodily systems that were out of sync - a brain that looks asleep but a heart that looks awake, for example - seemed to spell trouble.
The study received funding from the NIH, Knight-Hennessy Scholars and Chan-Zuckerberg Biohub.
Journal reference:
Thapa, R., et al. (2026). A multimodal sleep foundation model for disease prediction. Nature Medicine. doi: 10.1038/s41591-025-04133-4.