The food pictures or occasional text entries (2.1% of all events) were further annotated by looking up the reference nutrition values from Calorie King or FNDDS (). Out of the 26,676 events recorded, 22% (5,846) were water, 28% (7,420) consisted of pre-packaged items with readily accessible nutrition information, and 50% (13,410) were mixed meals with multiple items. We hypothesized that the reported caloric intake should at least meet the resting energy expenditure, or maintenance caloric (MC) intake (). The average daily estimated caloric intake for the group (mean 1,947 kcal; 95% CI: 1,917–1,977) exceeded the respective maintenance caloric intake (mean 1.233-fold over MC; 95% CI: 1.214–1.251). From push notifications, we measured a false negative rate, or underreporting of food/beverage/water, of 10.34%. Therefore, the actual caloric intake was likely slightly higher. The extra caloric intake likely accounts for activity above resting metabolism. There was no significant change in body weight during the 3-week reporting period ( Table S1 ), indicating the absence of any feedback effect of recording food intake on body weight.

We monitored healthy, non-shift-worker adult males and females ( Figures 1 A and 1B, Table S1 ) for 3 weeks. After meeting the inclusion and exclusion criteria ( Table S2 ) and signing an informed consent document during an office visit, subjects used the custom mobile application (Salk Metabolic App) installed on their smartphone to take pictures of every food, beverage, or water item they consumed, irrespective of volume or caloric content, just prior to consumption ( Figures S1 and S2 ). Appending to the pictures a textual annotation describing the item(s) and amount consumed was optional.

(C and D) Polar plot of all (C) or calorie-containing (≥5 kcal) (D) ingestion events of each individual plotted against the time of day (radial axis) in each concentric circle. Data from 156 individuals are shown.

We made use of push notifications as an orthogonal measurement technique for assessing the time of dietary intake. These push notifications were manually triggered 1–2 times per day, at a random time during the stated wakeful period of the subject. The specific query presented to the subject depended on the time when he/she responded to it, not when it was originally dispatched from the server. The push notification presented a query on the user’s device inquiring whether they ate/drank anything in the past 30 min. Subjects had to push a “Yes” or “No” button displayed on their screen, and their responses were recorded on the server. From such responses to push notifications sent at random times during wakeful hours, we estimated the false negative rate (i.e., when the subject consumed food/beverage/water but forgot to log the event) for our methodology to be 10.34%.
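The exact procedure for matching push-notification responses to logged events is not spelled out here, but one plausible implementation counts each “Yes” response with no logged ingestion event in the preceding 30 min as a missed log. A minimal sketch (function and variable names are illustrative, not the study’s code):

```python
from datetime import datetime, timedelta

def false_negative_rate(yes_responses, logged_events, window=timedelta(minutes=30)):
    """Fraction of 'Yes, I ate/drank in the past 30 min' responses that have
    no logged ingestion event within the preceding time window."""
    missed = sum(
        1 for t in yes_responses
        if not any(t - window <= e <= t for e in logged_events)
    )
    return missed / len(yes_responses)

# The "Yes" at 10:00 is covered by the 9:50 log; the one at 15:00 is not.
yes = [datetime(2015, 9, 1, 10, 0), datetime(2015, 9, 1, 15, 0)]
logs = [datetime(2015, 9, 1, 9, 50)]
false_negative_rate(yes, logs)  # 0.5
```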

We built a smartphone software application (“app”) to longitudinally monitor the daily temporal pattern of caloric intake in free-living humans. To record an ingestive event, the participants used the camera function of the smartphone to take a picture of the food or beverage. Food pictures taken in the JPEG format were downscaled to 1/10 their original size on the device itself to reduce network data usage. Participants also had the option for textual entries to substitute for food/drink pictures when picture taking was difficult or when subjects forgot to log their picture entries. Immediately after data logging, the food picture and text entries along with timestamp and geolocation were immediately transferred to a server. Upon confirmation of successful data transfer, the data and the associated metadata (i.e., the timestamp and geolocation) were erased from the subject’s device, eliminating the possibility for the “feedback effect” of prior-recorded information upon present behavior.

The fraction of total calories consumed (starting at 4 a.m.) showed that less than 25% of caloric intake occurred before noon ( Figure 2 F). The percentages of total calories consumed after 6 p.m., 9 p.m., and 11 p.m. (and before 4 a.m. of the next day) were 37.5%, 12.2%, and 3.9%, respectively ( Figure 2 F). After adjusting for the maintenance calories (MC) of each individual (), the average cumulative percentage of MC ingested over diurnal time showed that 50%, 70%, 90%, and 100% of MC were consumed, on average, by 3:32 p.m., 5:04 p.m., 6:11 p.m., and 6:36 p.m., respectively (see Figure 2 G for median values). In summary, there was a systematic bias toward consuming a larger portion of the daily caloric intake in the late afternoon and evening hours. At the cohort level, food consumed after 6:36 p.m. generally exceeded the maintenance caloric requirement.
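The cumulative-intake milestones above can be derived from a day’s timestamped caloric events; a minimal sketch (data layout and names are illustrative):

```python
def time_of_cumulative_fraction(events, mc_kcal, fraction):
    """Given one metabolic day's (hours_since_4am, kcal) events, return the
    time by which cumulative intake first reaches `fraction` of maintenance
    calories (MC), or None if that fraction is never reached."""
    total = 0.0
    for t, kcal in sorted(events):
        total += kcal
        if total >= fraction * mc_kcal:
            return t
    return None

# 50% of a 2,000 kcal MC is first reached at the second event (hour 8, i.e., noon).
day = [(4, 400), (8, 600), (14, 1000)]
time_of_cumulative_fraction(day, 2000, 0.5)  # 8
```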

From a subset of events that marked the beginning and end of an eating report (i.e., pictures of food at the start and of leftovers at the end of the meal), we calculated the average meal duration to be 14 min 36 s. Therefore, “events” from an individual with a <15 min inter-event interval were combined into one “meal.” At the group level, 25% of all meals were within 1 hr 25 min of another meal, and the median inter-meal interval was 3 hr 6 min. Only 25% of the meals occurred after >6 hr 41 min of fasting ( Figure 2 E).
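The <15 min merging rule amounts to a single pass over an individual’s sorted event times; a sketch with illustrative names:

```python
from datetime import datetime, timedelta

def merge_events_into_meals(event_times, gap=timedelta(minutes=15)):
    """Combine an individual's ingestion events into 'meals': successive
    events separated by less than `gap` belong to the same meal."""
    meals = []
    for t in sorted(event_times):
        if meals and t - meals[-1][-1] < gap:
            meals[-1].append(t)  # inter-event interval < 15 min: same meal
        else:
            meals.append([t])    # otherwise start a new meal
    return meals

# The two events 10 min apart merge into one meal; the rest stand alone.
events = [datetime(2015, 9, 1, 8, 0), datetime(2015, 9, 1, 8, 10),
          datetime(2015, 9, 1, 12, 30), datetime(2015, 9, 1, 19, 5)]
merge_events_into_meals(events)  # 3 meals
```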

Surprisingly, in contrast to the self-reported 3 meals/day structure from most of the participants, a breakfast-lunch-dinner temporal pattern was largely absent ( Figure 2 C). At the individual level, the number of events/day showed wide variation, ranging from 4.22 ± 0.1 (mean ± SEM) for the bottom decile to 15.52 ± 0.34 for the top decile, of which 3.33 ± 0.07 and 10.55 ± 0.24, respectively, were caloric events ( Figure 2 D).

The timestamp of every ingestion event allowed analyses of the temporal aspects of eating. Aggregate data from 3 weeks of monitoring were used to assess the eating pattern of the cohort. Caloric (i.e., >5 kcal) events populated a large segment of the 24 hr day ( Figures 1 C and 1D), and there were only 5 hr, between 1–6 a.m., when the number of events/hr was <1% of total events ( Figure 2 A). The fraction of events with estimated energy content >5 kcal also reached its nadir ( Figure 2 B) in that interval. Because at the population level human digital activity reaches its trough between 2 and 4 a.m. (), none of the subjects were self-reported shift-workers, and the reporting trough was close to 4 a.m. ( Figures 2 A and 2B), we considered 4 a.m. as the onset of the “metabolic day” (i.e., events between 00:00:00 and 03:59:59 hr were included in the previous calendar day).
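Assigning events to a 4 a.m.-onset metabolic day amounts to shifting each timestamp back by 4 hr before taking its calendar date; a minimal sketch:

```python
from datetime import datetime, timedelta, date

def metabolic_day(timestamp, day_start_hour=4):
    """Calendar date of the 'metabolic day' an event belongs to: shifting the
    timestamp back by 4 hr places 00:00-03:59 events in the previous day."""
    return (timestamp - timedelta(hours=day_start_hour)).date()

metabolic_day(datetime(2015, 9, 2, 1, 30))  # date(2015, 9, 1): late-night snack
metabolic_day(datetime(2015, 9, 2, 8, 0))   # date(2015, 9, 2): breakfast
```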

(C) Representative scatter plot of ingestion events of 11 subjects during the observation period shows the lack of clustering of events into three principal bins for most individuals and a large variation in the total number of events.

(B) The fraction of events with <5 kcal also reaches its peak at 4 a.m. Therefore, we considered 4 a.m. (instead of midnight) as the beginning of the metabolic day.

The day-to-day variation in the time of first or last caloric intake was spread over a few hours ( Figure S3 ). Feeding after several hours of fasting is known to affect neuroendocrine metabolic pathways and to adjust the phase of the circadian clock in peripheral organs () so that the physiological state transitions from the fasting to the fed state. Changes in the sleep-wake cycle between social/work days and free/weekend days are similar to the circadian desynchrony arising from jet travel between time zones and are called social jetlag (). By analogy, we postulated that the variation in breakfast time between working/week days and free/weekend days likely affects the peripheral clocks in metabolic organs, causing metabolic desynchrony or “metabolic jetlag.” The median breakfast (i.e., first caloric intake) time for the entire population changed on Saturday and Sunday ( Figures 3 D and S3 ), so we considered those two days as the “metabolic weekend.” The time of last caloric intake did not significantly change on any day of the week, but its variability was relatively large on Friday and Saturday ( Figures 3 E and S3 ). At the population level, the mean times of first caloric intake during weekdays and the weekend were 9:21 a.m. (95% CI: 9:15–9:27 a.m.) and 10:26 a.m. (95% CI: 10:15–10:37 a.m.), respectively. Delaying breakfast was more common than advancing it: 40% of the subjects delayed breakfast by 1 hr or longer and 25% by 2.18 hr, while only 7% advanced their breakfast time by >1 hr. The time of last caloric intake was more variable than breakfast. On the weekend, the average time of last caloric intake advanced by >1 hr in 17% of subjects, while only 15% delayed it by >1 hr.
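The weekday/weekend breakfast shift (“metabolic jetlag”) can be quantified per individual as the difference between median weekend and median weekday first-caloric-event times; a sketch under the assumption that breakfast times are available as datetimes (names are illustrative):

```python
from datetime import datetime
from statistics import median

def metabolic_jetlag_hr(breakfast_times):
    """Difference (in hours) between the median weekend (Sat/Sun) and median
    weekday breakfast time, each expressed as hours since midnight."""
    hours = lambda t: t.hour + t.minute / 60
    weekday = [hours(t) for t in breakfast_times if t.weekday() < 5]
    weekend = [hours(t) for t in breakfast_times if t.weekday() >= 5]
    return median(weekend) - median(weekday)

# Breakfast at 9:00 Mon-Fri and at 10:30 Sat-Sun gives a 1.5 hr jetlag.
week = [datetime(2015, 9, 7 + d, 9, 0) for d in range(5)]  # Mon 9/7 - Fri 9/11
wend = [datetime(2015, 9, 12, 10, 30), datetime(2015, 9, 13, 10, 30)]
metabolic_jetlag_hr(week + wend)  # 1.5
```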

To compare the eating pattern with diurnal activity period, wrist actigraphy () data were collected from 47 randomly selected participants using a CamNTech Motionwatch 8, a device that measured both activity and light. A nighttime drop in activity and an absence of light was scored as time in bed. Integration of activity, light, and ingestion events allowed analyses of eating time relative to activity period ( Figure 3 A). The average time of activity onset and of time to bed showed large variation even among this non-shift worker cohort ( Figure 3 B). The median time interval between daily activity onset and first caloric intake was 1 hr 18 min, while the median time between the last caloric intake and going to bed was 2 hr 22 min ( Figure 3 C). Therefore, the total overnight fasting duration paralleled the time of inactivity (sleep) at night.

(D and E) Median time of first (D) and last (E) caloric event of all individuals on different days of the week. Median (25%–75% interval in box, 10%–90% interval in lines) local time is shown.

(C) Time interval between waking up and the first caloric ingestion or the last caloric ingestion and going to bed. Bars (orange and blue, y axis) indicate the percent of the individuals for whom actigraphy was performed with the indicated number of hours (x axis, 1 hr bins) from waking up to the first caloric intake or from the last caloric intake to sleep. Cumulative percentages (secondary y axis) are shown in color-matched lines.

(B) Wakeful activity duration in a subset of the subjects is shown. Each horizontal bar shows the interval between average wake up and bedtime (+SEM, up to 21 days of monitoring).

Having observed a large variance in the times of first and last caloric intake ( Figure S3 ) and the absence of a clear 3 meals/day eating pattern ( Figures 2 C and 2D) for most subjects, we reasoned that a potentially better descriptor of an individual’s eating pattern could be the daily duration of caloric intake. Because food intake triggers post-prandial changes in neuroendocrine state that can take minutes to hours to return to the resting or fasting state, eating too often could clamp the physiological state to the post-prandial state between frequent meals. Therefore, we defined the daily eating duration as the time interval (4 a.m. onward) that contained 95% (2.5–97.5 percentile) of all intake events during the monitoring period ( Figure 4 A). This approach of deriving the eating duration from aggregate data over several days is tolerant of occasional non-reporting of random eating events. Breakfast time weakly positively correlated with the time of last caloric intake (r = 0.379), so that individuals with an earlier breakfast also had their last caloric intake earlier in the evening ( Figure 4 B). The eating duration correlated better with the time of last caloric intake (r = 0.215) than with the time of breakfast (r = 0.035) ( Figures 4 C and 4D) or with BMI (r = 0.017) ( Figure 4 E). The median daily eating duration was 14 hr 45 min, and only 9.7% of the subjects had a daily eating duration <12 hr ( Figure 4 F). The weak correlation (r = 0.017) between eating duration and BMI could be due to the limited sample size, the heterogeneity of the participants in terms of gender and age, and the fact that the eating pattern recorded in the monitoring period is a short-term snapshot of a person’s long-term diet-related behaviors.
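The percentile-based eating-duration definition can be computed with the standard library alone; a minimal sketch (with `statistics.quantiles` at n=40, the first and last of the 39 cut points are the 2.5th and 97.5th percentiles):

```python
from statistics import quantiles

def eating_duration_hr(event_hours):
    """Daily eating duration: the span containing the central 95%
    (2.5th-97.5th percentile) of event times, in hours since 4 a.m."""
    cuts = quantiles(sorted(event_hours), n=40, method="inclusive")
    return cuts[-1] - cuts[0]  # 97.5th minus 2.5th percentile
```

Aggregating each participant’s event times (expressed as hours since the 4 a.m. metabolic-day onset) over the monitoring period and applying this function yields the per-individual eating duration.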

(C and D) The daily duration of eating does not correlate with the time of first caloric intake (C) but weakly positively correlates with the time of the last caloric intake (D).

Although the participants were not overtly asked to change nutrition quality or quantity, reducing the eating duration led to a reduced estimated caloric intake. Unlike mice, in which reducing the eating duration to ∼10 hr does not alter total caloric intake (), our human intervention cohort reduced its estimated daily caloric intake (average reduction 20.26%; 95% CI: 4.92%–35.6%; paired t test p < 0.05). Humans consume heterogeneous food types in a time-of-day-dependent manner ( Figure S4 ); e.g., coffee is almost always consumed in the morning, while alcohol is consumed at night. Consequently, during the intervention, items that would otherwise (in the baseline period) have been consumed during the designated 14 hr nighttime fasting window were not simply moved into the self-selected 10 hr eating period; instead, participants tended to forgo such an item rather than consume it at the wrong time of day. This could be one potential explanation for the reduction in caloric intake.

All subjects reduced their eating duration (average reduction: 4 hr 35 min; 95% CI: 3 hr 30 min–5 hr 40 min; p < 0.001), and their weekday/weekend metabolic jetlag was also reduced to <1 hr ( Figures 5 B and 5C). The participants showed a reduction in total body weight (average loss 3.27 kg; 95% CI: 0.9081–5.624 kg) and, accordingly, in excess body weight ( Figures 5 D and 5E, Table S3 ) and BMI (average reduction 1.15 kg/m²; 95% CI: 0.3247–1.980 kg/m²). In subjective self-assessments of sleep satisfaction, hunger at bedtime, and energy level (in the mornings and overall over the past few days), statistically significant improvements were observed ( Figure 5 F). All participants voluntarily expressed an interest in continuing unsupervised with the 10–11 hr time-restricted eating regimen after the conclusion of the 16-week supervised intervention. After 36 more weeks (1 year since the intervention began), the participants had maintained the weight loss and sleep improvement and felt more energetic ( Figures 5 D–5F, Table S3 ).

Many factors, including nutrition quality, quantity, physical activity, and genetics, contribute to obesity. Although we did not find a simple correlation between BMI and eating duration, we wanted to test whether a long eating duration and an erratic eating pattern are contributing factors in subjects with co-occurrence of BMI >25 and eating duration >14 hr. We tested whether reducing the eating duration and the metabolic jetlag associated with weekday/weekend differences in a subset of individuals would lead to a reduction in body weight. We recruited 8 individuals with >14 hr eating duration for a 16-week pilot intervention study, such that each individual’s own baseline data served as the control condition ( Figure 5 A). Individualized “feedogram” graphics, representing a temporal raster plot of ingestion events on successive days, were constructed ( Figure 5 B). After obtaining informed consent for the intervention, the participants were provided the data regarding their eating pattern accrued in the 3-week baseline period (95% eating duration, variance in first and last caloric intake, and weekday-weekend metabolic jetlag) and were shown their own baseline feedogram prior to the intervention. Because in rodents a daily eating period of up to 12 hr improves metabolic fitness (), the participants were requested to reduce their calorie-containing eating duration to a self-selected window of 10–12 hr and to follow this duration consistently during both weekdays and weekends so that the metabolic jetlag would be minimized. No overt suggestion concerning nutrition quality, quantity, or caloric content was provided. The individuals continued logging their food pictures using the same app as in the baseline period for the next 16 weeks and also received a weekly summary of their feedograms and daily eating duration.

(F) Average (+SEM) of subjective measures of energy level, hunger, and sleep in subjects. These subjective measures were assessed on a scale of 1–10, with 10 being the preferred (healthier) end of the range. Higher numbers thus indicated healthier values for the quantity, i.e., more energetic, less hunger at bedtime, and more sleep satisfaction. ∗ p < 0.05, t test.

(B) Representative “feedogram” of a participant during baseline and during intervention. The times of ingestion events are denoted as prominent black rectangles along the 24 hr day represented in each horizontal line (x axis). Yellow represents the time between 6 a.m. and 6 p.m. Eating duration during baseline and intervention is shown as broken lines.

Discussion

Collecting human nutrition information in the free-living condition has been a persistent challenge. Recording dietary intake using text entries, selecting from a large library of food items, and specifying the portion size is a ubiquitous feature found in most nutrition apps. Although such apps improve adherence relative to the traditional diary log (Carter et al., 2013), data logging can be cumbersome for mixed meals, and consequently, users may not bother to log small snacks. Furthermore, portion size reporting can be subjective. By adopting an approach centered on food pictures with optional user-side annotation, together with infrequent but randomly timed push notifications, we reduced the barrier to data recording. Server-side annotation of the picture metadata ensured uniformity across the cohort. Supervised crowd-sourced annotation could make this approach scalable to large cohorts.

By overlaying the daily patterns of food and beverage intake, activity-rest, and light exposure, we could uncover relationships among them ( Figures 3 A–3C). Data integration from multiple such longitudinal data streams has immense promise for disease prognosis. Although the subjects did not have any chronic medical conditions, they also logged their consumption of vitamins, supplements, and occasional over-the-counter medications for minor ailments, thus offering a temporal pattern of drug and supplement use ( Figure S4 ). Greater than 50% of the mammalian transcriptome exhibits diurnal rhythms in a tissue-specific manner (Zhang et al., 2014), the gut microbiome shows daily rhythms (Thaiss et al., 2014), the timing of food affects these rhythms in peripheral organs (Vollmers et al., 2009), and the targets of a large number of FDA-approved drugs show circadian expression (Zhang et al., 2014). Therefore, monitoring the timing of drug intake relative to the sleep-wake or feeding-fasting cycle can have a significant impact on disease prognosis and on unraveling interactions among food, sleep, and drugs in free-living individuals.

Formally, our work introduces a method and the critical defining parameters for describing the diurnal and longer-term temporal characteristics of nutrition in humans. By creating a scalable method to longitudinally monitor human nutrition in an evidence-based manner, we discovered that the daily eating pattern, even among healthy, young non-shift-workers, is highly variable from day to day. For more than half of the participants in the baseline monitoring study, the eating pattern was erratic, and energy intake events spanned a large fraction of the 24 hr day, with a relatively short fasting period ( Figure 4 ). Although the first caloric intake after leaving the bed happened within 1 hr 18 min (median value) ( Figure 3 C), less than 25% of the daily caloric intake occurred before noon, while 37.5% was consumed after 6 p.m. ( Figure 2 F). This suggests that breakfast is relatively small in terms of energetic input and that the major caloric intake is delayed until later in the afternoon or evening in this set of relatively young ( Table S1 ) non-shift-worker subjects. To address the universality of our observations concerning eating patterns, this method may be extended to a larger population spread over different geographical regions, work schedules (e.g., shift-workers, retired individuals, nurses, pilots), age groups, and/or cultures. It would also be useful to describe the diurnal patterns of caloric intake in humans who do not have a modern lifestyle influenced by electricity, such as hunter-gatherer societies.

Individuals in our study largely ate throughout the wakeful hours ( Figure 3 ). Consequently, sleep duration and quality largely dictated the eating pattern. Furthermore, since the sleep pattern changes between weekdays (workdays) and weekends, leading to social jetlag (Wittmann et al., 2006; Roenneberg et al., 2012), the breakfast time also changes between weekdays and weekends. These changes in breakfast time are analogous to a person traveling across time zones every weekend and can be described as metabolic jetlag. The intricate connection we observed between sleep and overnight fasting duration suggests that the observed relationship between a short sleep duration and predisposition to metabolic diseases (Cappuccio et al., 2011; Copinschi et al., 2014) may be partly explained by the reduced duration of overnight fasting. Similarly, the reported correlation between social jetlag and BMI may also involve metabolic jetlag. The increased daily eating duration likely contributes to increased caloric intake. A change in eating pattern between days (e.g., weekday versus weekend) can affect time-of-day/night-specific changes in food intake from specific food groups ( Figure S4 ). Therefore, one mode by which reduced sleep duration contributes to the increased risk for metabolic diseases could be the increased daily eating duration and associated changes in caloric intake and nutrition quality.

We did not find a simple positive correlation between the daily eating duration and BMI in our cohort. This may be for several reasons, including a limited sample size, the heterogeneity of the cohort, and a likely scenario in which individuals with a long eating duration also have more physical activity. Nevertheless, reducing the eating duration in a feasibility study imparted measurable benefits of clinically relevant magnitude in terms of body weight reduction and sleep improvement, without increasing the subjective sense of hunger. This relatively large effect on body weight reduction, even in the small intervention cohort, implies that the benefits might result from multiple changes: restoration of the diurnal rhythm of feeding/fasting, reduction of the weekday/weekend metabolic jetlag, and a reduction in daily caloric intake. Some benefits of time-restricted feeding (TRF) might arise from caloric reduction (CR). At the same time, we cannot rule out the possibility that some benefits of CR in vertebrates, including humans, might derive from TRF, as most CR studies involve caloric intake within a defined time frame. Nevertheless, if time restriction under free-living conditions inadvertently leads to caloric reduction, TRF as a method to reduce caloric intake is an attractive option, as individuals, caregivers, case managers, physicians, and scientists do not have to adopt expensive and laborious methods to accurately track calorie counts. Hence, irrespective of mechanism, time restriction offers an effective approach to improve health.

While the relative contribution of daily eating pattern, calories, and nutrition quality to multifaceted health improvement in humans should be examined in detail in future studies, our results highlight that suitable manipulation of the diurnal temporal pattern of caloric intake is a feasible therapeutic approach for improving human health in the free-living condition, in spite of the vast variety of food and beverage types consumed by the average person from day to day. This opens up the possibility for utilizing this strategy by itself or in combination with existing approaches for health improvement.