Last Monday, a group of 18 volunteers on the outskirts of Washington, D.C., began the second phase of a multi-week feeding experiment that will govern every morsel of food going into their bodies and analyze almost everything that comes out. The study, led by Department of Agriculture (USDA) scientists, is part of an ongoing effort to re-evaluate the number of calories eaters derive from popular foods. This particular experiment is focused on how the human body digests lentils and chickpeas, but it builds on prior ones conducted on almonds, walnuts, pistachios, and cashews. Taken together, the various studies make up a growing area of research that is bringing calorie estimates closer to how our bodies actually interact with food.

That’s right: The calorie count—a common measure used to estimate nutritional needs, as well as shame many of our dietary choices—might not be as accurate as we’ve been led to believe. The origin of the calorie as we know it dates back to the late 19th century, when a chemist named Wilbur Atwater popularized a method of energy calculation that assigned a fixed number of calories to every gram of protein, carbohydrate, and fat in a given food. These numbers—“Atwater factors,” as they’re known—were derived from a series of ethically questionable experiments that his team conducted. In them, Atwater and other scientists sequestered people in air-tight chambers for extended periods of time to calculate their metabolic rates based on diet and different physical activities. Today, Atwater factors are the basis of how most manufacturers calculate calories in accordance with the Food and Drug Administration’s (FDA) food labeling regulations.
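To see how those fixed factors play out in practice, here is a minimal sketch of the label-style arithmetic, using the standard general Atwater factors of 4 calories per gram of protein, 4 per gram of carbohydrate, and 9 per gram of fat; the serving values in the example are hypothetical and purely illustrative, not drawn from the USDA study.

```python
# A minimal sketch of the Atwater-style label calculation, using the
# standard "general" Atwater factors. Actual labeling rules allow some
# variations, so treat this as an illustration, not the FDA's method.

ATWATER_FACTORS_KCAL_PER_GRAM = {
    "protein": 4,
    "carbohydrate": 4,
    "fat": 9,
}

def label_calories(grams_by_macronutrient: dict[str, float]) -> float:
    """Estimate calories the way most labels do: grams of each
    macronutrient multiplied by its fixed Atwater factor, then summed."""
    return sum(
        grams * ATWATER_FACTORS_KCAL_PER_GRAM[macro]
        for macro, grams in grams_by_macronutrient.items()
    )

# Hypothetical serving of cooked lentils (illustrative values only):
serving = {"protein": 9.0, "carbohydrate": 20.0, "fat": 0.4}
print(label_calories(serving))  # 9*4 + 20*4 + 0.4*9 = 119.6 kcal
```

The key point, and the one the USDA feeding studies are probing, is that this arithmetic treats every gram of a macronutrient as equally digestible no matter what food it comes in, which is exactly the assumption the almond, walnut, and legume experiments have been testing.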