If the last Fuji apple you grabbed from your grocery store’s produce section was mealier and less flavorful than the Fujis you remember from childhood, you’re not alone. Your memory isn’t at fault, and it’s not as though you’re particularly bad at picking apples, either.

The truth, though, is much more distressing than either of those possibilities. Comparing chemical tests on modern-day Fujis with tests on samples from the 1970s, a team of Japanese researchers found that today's apples are less firm and contain lower concentrations of an acid that contributes to their taste. Their conclusion, published today in the journal Scientific Reports, is that by pushing apple trees' blooming time earlier in the year and raising temperatures during the fruit's maturation, climate change has slowly but surely altered the taste and texture of the apples we hold so dear.

The researchers started off by testing two types of newly harvested apples: the Fuji—which happens to be the world's leading apple cultivar—and the Tsugaru. Apples are taken seriously in Japan (the country produces roughly 900,000 tons of them annually, amounting to 14 pounds per person), and records of these same parameters have been kept for these apples dating back to the 1980s and, in some cases, the 1970s.

When the researchers compared modern-day Fujis and Tsugarus to their predecessors, they found that firmness and the concentration of malic acid, which corresponds with an apple's taste intensity, had slowly declined over the decades. The modern apples were also more susceptible to watercore, a physiological disorder in which water-soaked regions in the apple's flesh break down internally over time. In other words, by objective measurements—titrating the fruits' juices to determine acid concentration, or pressing mechanical plungers into their flesh to test firmness—today's apples were consistently mealier, less flavorful, and more prone to internal breakdown.

To see if climate change might have played a role, they analyzed the long-term climate trends in the two regions of Japan where the apples were grown (Nagano and Aomori prefectures), and found that during the 40-year period, temperatures had gradually risen by a total of about 2°C in each location. Records also indicated that, over time, the date on which apple trees in the two regions began to flower steadily crept earlier, by one or two days per decade. The last 70 days before harvest in each locale—i.e. the days during which the apples hung on the trees, ripening in the sun—were also, on average, hotter.

It’s hard to pin the blame entirely on climate change, because the process of growing apples—along with agriculture as a whole—has changed so drastically over the past few decades. A new harvesting technique or machine, for example, could have played a role in the taste decline. But other studies, conducted in closed, controlled chambers, have demonstrated that higher temperatures during the 70-day ripening window can significantly decrease taste and texture. If the case against climate change isn’t airtight, there’s at least strong circumstantial evidence.

And though the way apples taste is certainly a crucial part of modern life, the most distressing part of this whole saga might be the way in which the changes in these apples resemble climate change itself. You might eat hundreds of apples each year, and they might vary widely in quality, taste, and texture. Thus, when they slowly, steadily get worse over the course of decades, it's nearly impossible to discern the change firsthand. In these cases—both apples and climate change itself—there's really only one option: Look to the data.