The sleep AI system, dubbed SleepFM, was developed by researchers at Stanford University.
The system learned from a massive dataset of 585,000 hours of sleep data (equivalent to 66 years of continuous recording) from approximately 65,000 patients. This data was collected through comprehensive sleep tests (polysomnography) at specialized clinics, where brain waves, heart rate, breathing patterns, and body movements were precisely tracked during sleep.
“SleepFM is essentially learning the language of sleep,” said James Zou, a senior author of the study. “We devised a technical way to have all these different bodily signals talk to each other and learn a common language.”
The AI model demonstrated remarkably accurate disease prediction. After training, it first outperformed existing systems at identifying sleep stages and grading the severity of sleep apnea. The researchers then applied the model to longitudinal medical records, correlating sleep patterns with patient health histories spanning up to 25 years. This revealed 130 diseases that could be predicted from sleep patterns alone.
The model’s predictive accuracy was particularly strong for conditions like Parkinson’s disease (89 percent), Alzheimer’s disease (85 percent), prostate cancer (89 percent), breast cancer (87 percent), and heart attack (81 percent), surpassing even some of the predictive tools currently used in clinics. Researchers note that models with lower accuracy (around 70 percent) are currently being used to predict response to cancer treatment.
A key finding of the research was that the most accurate predictions arose not from individual signals, but from the combination and interplay of all data channels. When different bodily systems appeared to be out of sync – for instance, the brain sleeping deeply while the heart showed signs of arousal and stress – it could signal an impending problem.
“The most information for predicting disease came from the comparison and interplay of different channels,” explained Emmanuel Mignot, another senior author of the study. “It appears that asynchrony of bodily components indicates a problem.”
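To make the asynchrony idea concrete, here is a purely illustrative toy sketch, not SleepFM's actual method. It quantifies how "in sync" two hypothetical channels are (a sleep-depth signal and a cardiac-arousal signal, both invented for this example) using windowed correlation; in healthy sleep the two would tend to move in opposite directions, so a correlation drifting toward zero would flag desynchrony.

```python
# Illustrative only: a toy "asynchrony" feature between two sleep channels.
# The channel names, window size, and scoring are assumptions for this
# sketch, not the study's methodology.
import numpy as np

def asynchrony_score(sleep_depth: np.ndarray, heart_arousal: np.ndarray,
                     window: int = 30) -> float:
    """Mean windowed correlation between sleep depth and cardiac arousal.

    If the heart tracks the brain (deep sleep -> low arousal), windowed
    correlations are strongly negative; values near zero suggest the two
    systems are out of sync.
    """
    scores = []
    for start in range(0, len(sleep_depth) - window + 1, window):
        d = sleep_depth[start:start + window]
        h = heart_arousal[start:start + window]
        if d.std() > 0 and h.std() > 0:
            scores.append(np.corrcoef(d, h)[0, 1])
    return float(np.mean(scores)) if scores else 0.0

# Synthetic demo data.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 300)
depth = np.sin(t) + 0.1 * rng.standard_normal(300)

synced = -depth + 0.1 * rng.standard_normal(300)   # heart mirrors the brain
desynced = rng.standard_normal(300)                # heart ignores the brain

print(asynchrony_score(depth, synced))    # strongly negative: in sync
print(asynchrony_score(depth, desynced))  # near zero: out of sync
```

In a real pipeline, features like this would be one tiny input among many; the study's point is that the model learns such cross-channel relationships directly from raw multimodal recordings.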
The research team acknowledges that the model currently doesn’t explain how it arrives at its predictions. Future research will focus on developing methods to interpret the AI’s decision-making process. The ultimate vision is to integrate this technology with data from wearable devices like smartwatches, enabling comprehensive and personalized health monitoring.
This study represents a groundbreaking step toward transforming sleep, which comprises a third of human life, into a safe and powerful digital sampling method for holistic health assessment.
Separately, other research is exploring the potential of AI for early disease detection through various biological signals.