How food is presented and eaten influences the eating experience. Novel gustatory interfaces have opened up new ways of eating at the dining table. For example, recent developments in acoustic technology have enabled the transportation of food and drink in mid-air, directly onto the user's tongue. Basic tastes such as sweet, bitter, and umami have a higher perceived intensity when delivered via acoustic levitation, and are perceived as more pleasant despite their small size (droplets of approx. 20 µL, or 4 mm in diameter). However, it remains unclear whether users are ready to accept this delivery method at the dining table. Sixty-nine children aged 14 to 16 years took part in a taste test of seven types of foods and beverages, using two delivery methods: acoustic levitation, and knife and fork (the traditional way). The children were divided into two groups: one group was shown a video demonstrating how levitating food can be eaten before the main experiment, whereas the other group was shown the video afterwards. Our results showed no significant differences in liking of the foods and beverages between the two delivery methods. However, playing the video prior to the test significantly increased the liking of, and willingness to eat, vegetables in the levitation condition. Evaluative feedback suggested that a bigger portion size of levitating foods could be the game-changer in integrating this novel technology into real-life eating experiences.
Driving is a highly visual task. Nevertheless, it is a process that involves other senses as well. When we drive, we touch the steering wheel; we listen to what is happening around us; and, even if we are not paying attention to it, we smell what is happening with the car or around it. The scent of gasoline, burning rubber, or plastic heated by sunlight - these are just a few examples. Smell is a very important sense for driving, yet it has been studied little in this context [85], despite being able to provide a much more vivid experience than any other human sense [80]. This thesis aims to fill this gap by investigating opportunities for olfactory interaction in an automotive context. The thesis focuses mainly on designing a scent-delivery device suitable for in-car interaction, on delivering driving-relevant notifications using scents, and on studying the effects scents have on driving performance and behaviour, as well as on the driver's mood and well-being. This paper-style PhD thesis consists of two parts. Part II is a collection of seven published papers written in the scope of this thesis, and Part I describes how these papers build a coherent story. Part I starts with an introduction (see Chapter 1) that covers the research questions and contributions of the thesis. It continues with a summary of the background research (see Chapter 2). This overview then moves on to a description of the approach (see Chapter 3), which covers the process of designing the scent-delivery device, the olfactory interaction space, and the studies conducted throughout this PhD. Chapter 4 then summarises the core findings of each study, which are finally discussed in Chapter 5. Part I finishes with a conclusion (see Chapter 6).
Driving is a task that is often affected by emotions. The effect of emotions on driving has been studied extensively, with anger dominating such investigations. Despite the known strong links between scents and emotions, few studies have explored the effect of olfactory stimulation in the context of driving. As a result, HCI practitioners have very little knowledge of how to design for emotions using olfactory stimulation in the car. We carried out three studies to select scents of different valence and arousal levels (i.e. rose, peppermint, and civet) and anger-eliciting stimuli (i.e. affective pictures and on-road events). We used this knowledge to conduct a fourth user study investigating how the selected scents change the emotional state, well-being, and driving behaviour of drivers in an induced angry state. Our findings enable better decisions on which scents to choose when designing interactions for angry drivers.
Olfactory notifications have been shown to have a positive impact on drivers. This has motivated the use of scents to convey driving-relevant information. Research has proposed scents such as lemon, peppermint, lavender, and rose for in-car notifications. However, there is no framework for identifying which scent is the most suitable for a given application scenario. In this paper, we propose an approach for validating a mapping between scents and driving-relevant notifications. We suggest a study in which the olfactory modality is compared with a puff of clean air and with visual, auditory, and tactile stimuli while performing the same driving task. For the data analysis, we suggest recording lane deviation, speed, and the time required to recover from an error, as well as perceived liking and comfort ratings. Our approach aims to help automotive UI designers make better decisions when choosing the most suitable scent, as well as possible alternative modalities.
Overreliance on technology is safety-critical, and it is assumed to have been a main cause of severe accidents involving automated vehicles. To ease the complex task of permanently monitoring vehicle behaviour in the driving environment, researchers have proposed implementing reliability/uncertainty displays. Such displays allow drivers to estimate whether or not an upcoming intervention is likely. However, presenting uncertainty visually just adds more visual workload for drivers, who might also be engaged in secondary tasks. We suggest using olfactory displays as a potential solution for communicating system uncertainty, and conducted a user study (N=25) in a high-fidelity driving simulator. Results of the experiment (conditions: no reliability display, purely visual reliability display, and visual-olfactory reliability display), combining both objective (task performance) and subjective (technology acceptance model, trust scales, semi-structured interviews) measures, suggest that olfactory notifications could become a valuable extension for calibrating trust in automated vehicles.
Smell is a powerful tool for conveying and recalling information without requiring visual attention. Previous work, however, identified challenges caused by users' unfamiliarity with this modality and by the complexity of scent delivery. We are now able to overcome these challenges by introducing a training approach to familiarize users with scent-meaning associations (the urgency of a message, and the sender's identity) and by using a controllable device for scent delivery. Here we re-validate the effectiveness of smell as a notification modality and present findings on the performance of smell in conveying information. In a user study composed of two sessions, we compared the effectiveness of visual, olfactory, and combined visual-olfactory notifications in a messaging application. We demonstrated that olfactory notifications improve users' confidence and performance in identifying the urgency level of a message, with the same reaction time and disruption levels as visual notifications. We discuss the design implications and opportunities for future work in the domain of multimodal interactions.
Cars provide drivers with task-related information (e.g. "Fill gas") mainly using visual and auditory stimuli. However, those stimuli may distract or overwhelm the driver, causing unnecessary stress. Here, we propose olfactory stimulation as a novel feedback modality to support the perception of visual notifications, reducing the visual demand on the driver. Based on previous research, we explore the application of the scents of lavender, peppermint, and lemon to convey three driving-relevant messages (i.e. "Slow down", "Short inter-vehicle distance", "Lane departure"). Our paper is the first to demonstrate the application of olfactory conditioning in the context of driving and to explore how multiple olfactory notifications change driving behaviour. Our findings demonstrate that olfactory notifications are perceived as less distracting, more comfortable, and more helpful than visual notifications. Drivers also make fewer driving mistakes when exposed to olfactory notifications. We discuss how these findings inform the design of future in-car user interfaces.
The sense of smell is well known to provide very vivid experiences and to mediate a strong activation of crossmodal semantic representations. Despite a growing number of olfactory HCI prototypes, there have been only a few attempts to study the sense of smell as an interaction modality. Here, we focus on the exploration of olfaction for in-car interaction design by establishing a mapping between three different driving-related messages ("Slow down", "Fill gas", "Passing by a point of interest") and four scents (lemon, lavender, peppermint, rose). The results of our first study demonstrate strong associations, for instance, between the "Slow down" message and the scent of lemon, between the "Fill gas" message and the scent of peppermint, and between the "Passing by a point of interest" message and the scent of rose. These findings were confirmed in our second study, where participants expressed their mapping preferences while performing a simulated driving task.
When designing olfactory interfaces, HCI researchers and practitioners have to carefully consider a number of issues related to scent delivery, detection, and lingering, to name just a few of the problems to deal with. We present OSpace - an approach for designing, building, and exploring an olfactory interaction space. Our paper is the first to explore in detail not only the scent-delivery parameters but also air extraction issues. We conducted a user study to demonstrate how scent detection and lingering times can be measured under different air extraction conditions, and how the impact of scent type, dilution, and intensity can be investigated. Results show that, with our setup, scents can be perceived by the user within ten seconds, and it takes less than nine seconds for them to disappear, both with the extraction on and off. We discuss the practical application of these results for HCI.
In the field of Human-Computer Interaction (HCI), vision and audition have been the dominant modalities for interacting with users, despite the fact that humans are equipped with five basic senses. As a result, there are few tools that harness the olfactory system as a communication channel. Recently, several promising scent-delivery devices have been developed; however, there is a lack of guidance on how to use them in a meaningful way for different interactive tasks. In this paper, we propose a three-dimensional framework for comparing scent-delivery devices based on the distance, volume, and speed of scent delivery. We discuss how this initial exploration can guide the design of in-car olfactory interfaces beyond previous work on drivers' physical and emotional states.