When thinking about creating a multisensory experience, it is important to realize that flavor perception involves all five senses. So let's redefine “flavor.” Flavor is more than just the five primary tastes in the mouth. It is often said that about 80 percent of flavor comes from smell in the nose. In addition, we have somatosensation, which includes touch, temperature and pain (Youssef, 2015). Professor Acree at Cornell University talked about “seeing the flavor” a few years ago. Sound has recently been called the forgotten flavor sense (Gastropod, 2015). When we talk about human perception of flavor in foods and beverages, we are really talking about all five senses. Each sense is called a modality.
The first modality is taste. Each taste bud on the tongue is a cluster of about 100 taste cells. There are three types of taste cells: Type I (salty), Type II (sweet, bitter or umami) and Type III (sour). Each taste cell is taste-specific; that is, “one taste, one cell” (Zuker, 2011). Although we are hard-wired to detect each taste without confusing it with another, there is “sensory processing circuitry” (Bigiani, 2011) in the brain that integrates all the gustatory information (Sternini, 2013).
Thanks to rapid advances over the past 15 years, all the primary taste receptors have been identified (NIZO, 2011). We have 25 bitterness receptors, which are always on, presumably to detect toxins. We have only one sweetness receptor, which searches for carbohydrates as a source of energy. We also have one umami receptor to detect proteins for survival. These first three receptors, found in Type II taste cells, are protein receptors in the GPCR family. The next two “receptors” are not GPCRs but ion channels, one for salty and one for sour taste. There is recent evidence for fatty taste (more specifically, fatty acid taste) as a sixth primary taste (Mattes, 2015): mice could tell that something was fatty by truly tasting it, not by smell or textural cues.
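For readers who like their physiology at a glance, here are the mappings just described restated as a small lookup structure in Python. It is only a summary aid for the two paragraphs above, not a scientific model, and the tentative sixth taste is included with that caveat.

```python
# Summary of the taste physiology described above: each primary taste,
# the taste-cell type that detects it, and the receptor class involved.
# A reading aid only, not a scientific model; "fatty" is still tentative.
PRIMARY_TASTES = {
    "salty":  {"cell_type": "Type I",   "receptor_class": "ion channel"},
    "sweet":  {"cell_type": "Type II",  "receptor_class": "GPCR (1 receptor)"},
    "bitter": {"cell_type": "Type II",  "receptor_class": "GPCR (25 receptors)"},
    "umami":  {"cell_type": "Type II",  "receptor_class": "GPCR (1 receptor)"},
    "sour":   {"cell_type": "Type III", "receptor_class": "ion channel"},
    "fatty":  {"cell_type": "proposed", "receptor_class": "fatty acid sensing (Mattes, 2015)"},
}

# "One taste, one cell": each taste maps to its own detector.
for taste, info in PRIMARY_TASTES.items():
    print(f"{taste:>6}: {info['cell_type']:>8} cell, {info['receptor_class']}")
```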
The second modality is smell. We have about 400 smell receptor genes (Monell, 2015) operating on a pattern-recognition model (Buck and Axel, 2004), which makes them capable of detecting up to 1 trillion odors (Keller and Vosshall, 2014) in a “many-to-many” mode (Downey, 2014). We detect a smell when an odorant binds to an odor receptor (OR) expressed by an olfactory sensory neuron (OSN) in the olfactory epithelium. Each OSN expresses only one type of OR, and the axons of these OSNs project directly to the olfactory bulb (Cheetham and Belluscio, 2014).
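To build intuition for how only about 400 receptor types can cover up to a trillion odors, here is a toy back-of-the-envelope sketch of that “many-to-many” pattern code in Python. It deliberately simplifies (real receptor responses are graded, overlapping and noisy); the point is only that the brain reads combinations of receptors, and combinations multiply fast.

```python
from math import comb

# Toy illustration of the "many-to-many" pattern code described above:
# each odorant activates its own characteristic subset of receptor types,
# and the brain reads the whole pattern, not any single receptor.
N_RECEPTOR_TYPES = 400  # approximate human smell receptor gene count

# Count the distinct activation patterns if an odorant lit up exactly
# k receptor types. (A deliberate simplification: real responses are
# graded, overlapping and noisy.)
for k in (3, 4, 5, 6):
    patterns = comb(N_RECEPTOR_TYPES, k)
    print(f"{k} of {N_RECEPTOR_TYPES} receptor types -> {patterns:,} possible patterns")

# By k = 6 the count (~5.5 trillion) already exceeds the ~1 trillion
# discriminable odors estimated by Keller and Vosshall (2014).
```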
The Crossmodal Correspondence Concept
Crossmodal correspondence describes how the brain processes information from different senses to form the multisensory experiences of our daily lives (Spence, 2013; Smith, 2014). It is a big field in experimental psychology, but since we are talking about human flavor perception only, let us focus on the interactions between taste and the other four senses: smell, sight, sound and touch.
Take smell and sweetness as an example. Although sweetness is perceived in the mouth when a sweetener in the saliva touches the receptors at the tip of a taste cell, there is interaction between taste and smell (Taylor, 2010; Lewandowsky, 2015). That is, a retronasal “sweet” aroma (smell) in the nose increases the perception of sweetness in the mouth (taste) (Prescott, 2015).
The multisensory eating and drinking experience is based partly on crossmodal correspondence. Three landmark 2015 publications explore the current state of the art and science: Multisensory Processes in Flavour Perception by Prescott, Chemosensory Integration by Prescott, and The Perfect Meal by Spence and Piqueras-Fiszman.
There have been some interesting recent findings on crossmodal correspondence. On sight and taste, round-shaped food was rated sweeter and angular-shaped food more bitter (Spence, 2015). Most people picked food colored red as sweet, green as sour, white as salty and black as bitter (Spence and Youssef, 2015). Latte coffee flavor was more intense in a white mug than in blue or clear mugs (Van Doorn et al, 2014). Meals were preferred when plated visually as “slanting lines ascending to the right” (Youssef et al, 2015). Strange, huh?
Sound is the forgotten flavor sense. At airplane cabin noise levels, umami was the only primary taste (or one of the few) not muted (Dando and Yang, 2015), and it may even have been enhanced (Spence, 2015). Higher-pitched sound enhanced sweetness, and a deeper tone increased the perception of bitterness (Spence, 2015).
How about touch and taste crosstalk? When eating with heavier cutlery, people rated their food more delicious and liked it more, and heavier glasses gave a better drinking experience (Spence, 2015). Food with a rougher surface was rated more sour (Slocombe et al, 2015).
Pioneers in the Space
The multisensory eating and drinking experience is about creating experiences based on culinary art, food science and neuroscience. The old molecular gastronomy has transformed into a new field called gastrophysics. It is not just about flavor and texture, but about crossmodal interactions among all five senses: taste, smell, touch, sight and sound.
Among the trailblazing companies in the area of multisensory eating and drinking experiences:
CINNABON: “I Want That”
Nestlé: “A Wake Up Call for All Senses”
Johnnie Walker: “The Boldest Glass”
Häagen-Dazs: “The Perfect Wait”
IBM: “Cognitive Cooking”
British Airways: “Sound Bite”
New Generation of Molecular Gastronomy: “Gastrophysics”
Ultraviolet: “Ultra Dining”
Fat Duck: “Sound of the Sea”
The take-home message? The taste and smell physiology of today is the food technology of tomorrow.
Alex Woo, Ph.D., earned his doctorate in food science from the University of Wisconsin-Madison. He has more than 30 years of experience in R&D leadership positions at companies including Pepsi, Starbucks, Cargill and Wrigley, where he led technical teams to achieve business results. Woo now runs W2O, a boutique food technology firm. He specializes in developing “better foods,” with expertise in what he calls “taste and smell technologies.”