My parents and I used to eat at a restaurant where I would spend the entire dinner interrogating them about why in the world the lights were so dim. Displaying great patience (or maybe great hunger), they would calmly repeat, “It’s for ambiance.” The restaurant was trying to maintain the mood, they said, to add a certain something to the meal. I stubbornly maintained that seeing my food could only add to the sensation of tasting it.

The restaurant has since closed (probably because no one could see their food), but researchers are just beginning to understand the complex web of communication between the senses. Charles Spence is a gastrophysicist and experimental psychologist at Oxford University who studies multisensory perception—how input from one sense affects the brain’s interpretation of another. This isn’t synesthesia, in which some people perceive letters as having colors. Everyone experiences multisensory perception, as Spence explains in this video, directed by Liam Saint-Pierre.

Taste, for instance, goes far beyond taste buds. Everything from food and beverage color to crunch to shape to (of course) lighting influences how your brain interprets the signals coming from your mouth when you take a bite. Lighting alone, Spence says, has an effect “both on what we choose to order and then on how what we order tastes to us.”

One of his favorite examples comes from a group of researchers who showed that people who like strong coffee drink more of it under bright lights, while people who like weak coffee drink more of it under dim lights. The drinkers weren’t consciously deciding based on the ambient lighting; their brains were doing it for them behind the scenes. Similar unexpected links are everywhere in this field: higher-pitched music, for example, brings out sweeter tastes, while lower-pitched music brings out bitterness. Spence calls this “sonic seasoning.”

It’s not just that everything is connected to taste. Everything is connected to everything. Multisensory research has revealed that the brain isn’t wired to process each sensory input on its own, because no one sense tells it enough to accurately construct the world. “Every given moment that our eyes are open, our ears are open—every waking moment, our brain is processing all of these things at the same time,” says Ladan Shams, a cognitive neuroscientist at UCLA. “And if the brain doesn’t do a good job, then we cannot interact with the environment, and we would not survive.”

Spence, too, thinks in terms of signal processing. “The brain as a biological system is noisy,” he says. The senses send a constant stream of information—most of it irrelevant, some of it important. Your brain has to process all of those signals and put them into conversation with one another to help you pick out the really important stuff. As a result, Shams says, “what we see is profoundly affected by what we hear.” Spence, who will be speaking at the Fifth Annual Future of StoryTelling Summit on October 5–6, has a catchier way of summarizing his findings: “Sound has a taste. Taste has a sound.”

So would I have enjoyed my food more had the lights been a little brighter? I’m going to have to keep arguing with my parents about that one. So far, multisensory research has only shown that under brighter lights, I probably would’ve ordered spicier wings.