FOOD AND NUTRITION OPEN ACCESS (ISSN:2517-5726)

Interactions between Human, Computer and Food

Jessika Gonçalves dos Santos Aguilar1, Ivan Abdo Aguilar2*

1 Department of Food Science, School of Food Engineering, University of Campinas, Campinas, São Paulo, Brazil
2 School of Interactive Arts and Technology, Simon Fraser University, Surrey, British Columbia, Canada

Citation

Aguilar JGS, Aguilar IA. Interactions between human, computer and food. Food Nutr OA. 2019 Apr;2(2):116

Abstract

Technology is constantly evolving; discoveries have become more specific, and deeper levels of interaction between the real and virtual worlds have been reached. Nowadays, people are increasingly interested in the immersion of sensations and feelings that technology can provide. Besides being fundamental to life, the act of eating brings people together, causing various interactions between them. Eating produces unique sensations that involve different emotions and feelings, which can be leveraged to create richer experiences through human-computer interactions. Technology has brought the virtual world closer to the real world in order to enhance culinary experiences: by creating digital foods or enhancing real ones; by developing devices and programs capable of invoking or measuring sensations; with immersion tests in which a participant is inserted into a controlled virtual environment closely resembling the real one; and by measuring and evaluating emotions and responses. This review presents the relationships between humans, computers and food, called human-food interactions, focusing on the use of computational technologies, the exploration of the human senses, and digital interactions in food experience design, and pointing out the future challenges that need to be overcome.

Keywords

HCI; Virtual food; Virtual reality; Multisensory experience; Virtual senses; Consumer

Introduction

Human-food interaction (HFI) research has emerged as an area of great interest within the field of human-computer interaction (HCI) in the last few years [1]. Digitizing techniques exist to bring objects and elements of the real world into the virtual world; this can be done in many ways, using specialized sensors and scanners, photo imaging and computer-assisted design software, among others [2-4].

The virtualization process covers the stages of modeling, simulation, optimization and dynamic studies. Several industries have benefited from such activities, but the food industry still lags in using the potential offered by virtualization as a tool [5]. With the advancement of technology, there may someday be a new kind of dinner table, where digital and computational aspects are linked to design, science, technology and food engineering, providing new sensations from a fully digitized world [6].

The senses are responsible for our daily experience of the external world [7,8]. Conducting experiments that involve the senses is an obstacle in research because of the difficulty of creating a suitable environment that replicates the complexity of the real environment, since laboratory tests are performed in controlled settings or in front of a computer [7]. Some senses still cannot be fully utilized for interaction with technology, posing a challenge to be overcome. Interaction through the senses is led mainly by vision and hearing; touch can also be used, but taste and smell remain great challenges [9].

Human-computer and human-food interactions

To better understand the relationships between computers and human beings, some definitions need to be established. HCI is the study of how computers and derived technologies influence the interactions and relationships between humans and computers in their everyday activities [10]. HFI research, the branch of HCI focused specifically on food, has emerged and attracted significant interest in understanding the influence technology has on the interactions and relationships between humans and food. Some topics of interest in HFI are: health and ecological sustainability of food production and consumption; social and cultural interactions around food; food practices (for example, eating, disposal, growing, cooking); consumer experience and product evaluation; food safety; and entertainment [1,11-17].

The robust perception humans have of the world is based on the processing, integration, analysis, combination and interrelation of the different human senses. Multiple senses are used to explore the environment and perceive information. Through the senses (sight, hearing, touch and smell) it is possible to experience external stimuli [18]. Multisensory interactions can affect a person's performance of everyday tasks and can lead to the creation of new immersive experiences between environments, products, services and users, which can be used to engage audiences, convey meaning and enhance the overall user experience [19-23].

The systems used to evaluate these interrelations can work with one or multiple input forms and the interactions between them. Unimodal systems use only one form of input, for instance speech, vision, mouse, keyboard or pen. In contrast, multimodal systems function in a more robust way, integrating different input modes and aiming to provide users with specific tools to control output information, such as using multiple input forms to interact with visualizations and multimedia content [24]. Communication and interaction between humans and technology can be better understood through multimodal interactions, which combine humans' natural capabilities of communicating via speech, facial expressions, touch and gesture, among others [18,25].

Cross-modal systems represent interaction methods in which one or more senses influence the perception of another sense, for example, the effect of a food's color on its perceived taste/flavor and smell. Another form of cross-modal interaction is a kind of synesthesia, a perceptual illusion in which a sensory mode stimulated by the virtual environment is perceived to stimulate another mode that was not stimulated [26,27].

Research has shown that stimulating certain senses can lead to perception changes in others. Bi-directional influence on perception has been demonstrated to occur between the senses of: taste and smell [28,29]; taste and vision [30,31]; taste and touch [32,33]; taste and hearing [34,35]; smell and vision [36,37]; smell and touch [38,39]; smell and hearing [40,41]; vision and touch [42,43]; vision and hearing [44,45]; touch and hearing [46,47]; and a mixture of senses together [48-50]. 

Research has also revealed how these cross-modal stimulations can be applied to new technologies and devices in food-related experiments to induce perception changes. For instance, the sense of touch, in the form of perceived weight, alters the perception of taste [51]; the sound of food texture was shown to influence food taste [52]; and induced vision and smell influenced the perception of taste, using a visual and olfactory display system [53].

Technology to Design Food Experiences

There are several types of technologies capable of promoting HCI, and they can be applied in different research fields. In the food area, a tendency to study HFI can be observed, promoting, mainly, the immersion of people in a virtual world closed off from the real world and providing new sensations and feelings. This section presents some examples of these technologies and their recent uses within HFI.

Virtual Reality

Virtual reality (VR) is a complete immersion technology in which the user is connected to a computational device that can simulate computer-generated environments, whether modeled on real settings or entirely virtual ones. While immersed in such an environment, the user cannot see the real world around them. These environments can be used for simulations, training, entertainment, education and evaluations, among others [54,55].

Immersive VR is a promising way to place people in an almost-real environment that engages the greatest number of senses. Such environments share many similarities with the real world, while allowing researchers to constrain experimental factors and obtain empirical data [7].

Project Nourished is an example of VR use that provides a complete state of immersion. During a meal, multiple devices are used to enhance the dining experience, for instance: a VR headset, to visualize a simulated environment and change the aesthetics of the food; an aromatic diffuser, to smell the virtual food; headphones, to transmit mimicked chewing sound and vibration from the mouth to the ear; food utensils and a glass cup, present in both the real and virtual worlds, for interacting with virtual food and beverage; and 3D-printed food, used to give a real taste and feel to the virtual food. The project allows for a different eating experience with the virtual use of all the senses, giving the option of ingesting or not ingesting actual foods with calories, and of creating one's own virtual food based, for example, on a fictional movie scene [56].

Augmented reality

Augmented reality (AR), a system which combines real and virtual elements, supports real-time interaction and requires precise alignment and synchronism between virtual three-dimensional elements and the real-world environment. AR seeks to establish an environment that seamlessly integrates the real world with the virtual world, where augmented information is superimposed onto images and videos of the real world (captured through a camera). Though AR is most commonly used visually, it can be used to augment all five of the human senses [54,57].

Several studies apply AR to the study of food, mainly using the sense of vision. These studies aim to reduce food consumption by modifying food intake and the eating experience, with the potential to assist in the prevention and reduction of diseases related to food consumption, such as obesity, hypertension, high cholesterol, heart disease and diabetes [53,58-60].

AR applications are constantly changing and are being used more frequently in retail environments [61,62]. For example, AR applications can present information, such as nutrition facts and reviews, and add gaming elements that bring entertainment experiences when a product is scanned by a mobile device [61]. Other opportunities include personalization of the consumer experience, where customers create and interact with content personalized to them; socialization, where consumers share products with others by taking photos of AR content, which can serve as AR marketing; accessibility, where consumers without technical skills can create their own product; and novelty, where the use of technology as "something new and different" can attract early adopters [57].

Developed by Narumi et al. [53], Meta Cookie is a prototype that exemplifies the use of AR with food. Using an AR marker, Meta Cookie combines a head-mounted display (HMD) with an olfactory display, enabling the user to view different cookies (chocolate, strawberry, tea, etc.) while a distinctive aroma is delivered for each cookie chosen. Results showed that users experienced a change in taste, even though they were all eating just a plain cookie.

An AR mobile grocery shopping application was developed by Ahn et al. [63] to provide real-time customized recommendations of healthy products and warnings about specific products based on health criteria, such as milk or nut allergies, low sodium, low fat and general caloric intake.

Inamo is an interactive restaurant which uses AR technology to provide the visitor with full control of the gastronomic experience through the projection of the menu directly on the table surface. The technology also allows the visitor to configure the temperature, make drawings on the table, learn about the local neighborhood, play games and view a live chef camera feed [64].

Virtual Lemonade is a multimodal AR gustation system used to transmit flavor (in this case, color and pH value) from a real beverage to a simulated beverage in another location. The system has three stages: capturing the color and pH value of the original beverage; transmitting the captured information; and simulating the beverage in a different location, using water, LEDs and controlled electrical pulses to simulate sourness sensations [65].
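
To make this three-stage pipeline concrete, the sketch below traces the capture and simulation stages in Python. It is only an illustration of the idea described in [65]; the class, the sensor hooks and the linear pH-to-sourness mapping are assumptions, not the actual Virtual Lemonade implementation.

    from dataclasses import dataclass

    @dataclass
    class BeverageProfile:
        rgb: tuple   # sampled color of the source beverage
        ph: float    # sampled pH of the source beverage

    def capture(sensor_rgb, sensor_ph):
        """Stage 1: sample color and pH from the real beverage (hypothetical sensor hooks)."""
        return BeverageProfile(rgb=sensor_rgb(), ph=sensor_ph())

    def simulate(profile):
        """Stage 3: map the transmitted profile to an LED color and an
        electrical-pulse level approximating sourness (assumed linear mapping)."""
        led_color = profile.rgb
        sourness = max(0.0, min(1.0, (7.0 - profile.ph) / 7.0))  # lower pH -> more sour
        return led_color, sourness

Stage 2 would simply serialize the captured profile and send it over a network to the receiving glass.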

Meibner et al. [66] studied user behavior in a virtual supermarket using eye-tracking technology. Their results showed that participants' product decisions were greatly influenced by the AR information presented, and that participants deemed the use of this information helpful and desirable for future shopping experiences.

Quick Response code 

A quick response code (QR code) is a two-dimensional matrix symbol used for encoding and storing data (up to roughly 7,000 characters). The symbol was invented in 1994 by Denso Corporation, one of Toyota's major group companies, for use in the production control of automotive parts, but it was later recognized as an ISO international standard (ISO/IEC 18004) and spread into other fields. Today, mobile phones can capture a QR code with their camera and decode and present the information stored inside. QR code scanning is now widely known, and readers are available across multiple mobile platforms [6,67-69].

According to Schöning et al. [6], food can be improved through computational techniques. An example is the German QR cookie (QKies), whose main idea is to place QR codes on cookies for digital marketing. The QKies QR code is linked to a specific site and can deliver the desired message [70].
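
Generating such a code is straightforward; a minimal sketch using the qrcode Python package is shown below (the URL is a placeholder, not an actual QKies endpoint).

    import qrcode  # pip install qrcode[pil]

    # Encode a message URL, as a QKies-style edible QR code would.
    img = qrcode.make("https://example.com/my-cookie-message")
    img.save("cookie_code.png")  # image to be transferred onto the cookie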

3D Printing 

Food layered manufacture (FLM), also known as 3D food printing or food fabrication, is a relatively new, expensive and difficult process. Food printing is based on additive manufacturing (AM), also known as 3D printing, which joins materials layer by layer to create physical 3D objects [71].

The use of various foods in printing is still a challenge. Food printing can modify the shape, texture and flavor of food, changing the experience of cooking through different formats and printed foods [16]. According to Schöning et al. [6], advancements in 3D printing technology help in the development of digitally produced foods. Some uses for food printing are: food fabrication; multiscale design and creation of edible food structures; customized food forms and flavors; the personal food factory; and consumer-designed food fabrication in a domestic, user-controlled experimentation environment [71].

The CandyFab project develops machines that inexpensively print objects from pure sugar, using a process that melts sugar grains together with hot air. Production follows the basic principle of any 3D printer: stacking solid two-dimensional printed layers [72].

Chocolate can also be printed. The technology company Choc Edge, a pioneer in 3D chocolate printing, has developed the Choc Creator, which creates 3D edible chocolate models. Like CandyFab, the Choc Creator uses traditional coordinate-system technology, similar to plastic 3D printing: an idea is transformed into a 3D model and a machine code is generated; this code is loaded into the printer, and the chocolate object is printed layer by layer [73].
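
The "code" loaded into such printers is typically G-code, the same layer-by-layer machine language used by plastic 3D printers. The toy generator below illustrates the principle; the coordinates, extrusion values and feed rate are illustrative assumptions, not Choc Creator settings.

    import math

    def layer_gcode(radius_mm, z_mm, segments=36, feed=300):
        """Emit G-code moves approximating one circular deposition layer."""
        lines = [f"G1 Z{z_mm:.2f} F{feed}"]  # raise nozzle to the layer height
        for i in range(segments + 1):
            a = 2 * math.pi * i / segments
            x, y = radius_mm * math.cos(a), radius_mm * math.sin(a)
            lines.append(f"G1 X{x:.2f} Y{y:.2f} E0.05 F{feed}")  # extrude while moving
        return lines

    # Stack 10 layers, 0.5 mm apart, to build a small cylinder bottom-up.
    program = [g for z in range(10) for g in layer_gcode(10.0, z * 0.5)]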

The ‘Insects au gratin’ project is a collaborative design project that prints a paste mix made from insect flour, used as a building medium, combining it with other food products to shape food structures and create a sustainable source of food [74].

Another company specializing in food printing is Bocusini, which provides new experiences to consumers with personalized prints, ranging from simple names to busts and cake toppers, made with different food materials: pasta, chocolate, marzipan, cassis and fondant [75].

Digital Interactions in Food Experience Design

Humans perceive food texture and flavor based on feel, sound, trigeminal stimulation, appearance, smell and taste [76]. The senses, or sensory systems, are necessary for the production of all external information and perceptions, mainly through vision and hearing [8], which have dominated the field of HCI studies. Taste and smell are two particularly difficult senses to study: besides being chemical senses, they make direct contact with the neural substrates of emotion and memory [77,78]. Memories evoked by smell are often emotionally more potent than those evoked by other senses [9]. Researchers have recently turned their focus to the study of touch, taste and smell [78].

The physical senses of humans encompass touch, sight and hearing, while taste and smell are chemical senses. The study of taste and smell is challenging, as they are neurologically interrelated and subjective [21]. Digital stimulation of the chemical senses has two "routes": without the use of chemicals, by electrically and thermally stimulating the taste buds; and through the use of chemicals, which can be released under digital control. To transmit the senses of smell and taste from the computer to the human, both routes may be used [79].

Sense of sight

According to Ramic-Brkic and Chalmers [80], visual perception has become an increasingly important tool in the field of computer graphics. Research in this area exploits knowledge of the human visual system to render and display three-dimensional graphics.

Sakurai et al. [59] studied how the amount of food affects eating behavior, using a tabletop system capable of projecting virtual dishes around food and altering its apparent volume. According to their results, the amount of food consumed and the perceived volume of food placed on the dish can be influenced by the size of the dish.

The effect of AR in guiding the amount of food served on a dish was evaluated by Rollo et al. [81]. Participants were divided into three test groups: one with no information (control); one with verbal information about the size of the portion; and one using ServAR, an AR tool created to assist in assembling virtual portions on a tablet. According to participants, the tool was easy to use and useful. The authors concluded that the use of AR improved accuracy in estimating standard servings compared to the other conditions, demonstrating the tool's potential in guiding the serving of food.

Eye-tracking and electroencephalogram sensor technologies have been used to measure and understand human behaviors and the visual attention of consumers during shopping experiences. This has been used to understand how to stimulate sales, assist marketing decisions, guide the design and presentation of products, and help consumers make decisions [66,82].
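
As an illustration of the kind of metric such systems compute, the sketch below aggregates dwell time per product region from raw gaze samples; the data format and the 60 Hz sample rate are assumptions for illustration, not the method of [66,82].

    def dwell_times(gaze_samples, regions, dt=1/60):
        """Sum viewing time per product region from fixed-rate gaze samples.

        gaze_samples: iterable of (x, y) screen coordinates
        regions: dict mapping product name -> (x0, y0, x1, y1) bounding box
        dt: seconds per sample (a 60 Hz tracker is assumed)
        """
        totals = {name: 0.0 for name in regions}
        for x, y in gaze_samples:
            for name, (x0, y0, x1, y1) in regions.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    totals[name] += dt
                    break
        return totals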

Sense of hearing

The sense of hearing (the auditory system) is a prominent social sense and can be employed to cue visual attention, create ambience and emotion, provide information about parts of the environment outside the immediate range, and assist the other senses in establishing a feeling of presence and immersion [83,84].

According to Carvalho et al. [85], sound can enhance the tasting experience. Using chocolate as the taste stimulus, the authors observed that sound modulated the flavor of the foods and added significant hedonic value during sensory analysis.

The perception of sound is one of the main senses explored at The Fat Duck restaurant in the UK, where the sound of the sea is served along with the seafood dishes. As customers enjoy their dinner, the sounds of the sea, crashing waves and flying seagulls, delivered via an iPod, transport their memories and sensations to the dining table [86].

The Chewing Jockey technology uses a light sensor to detect jaw movements and release a specific sound; a microphone at the jaw can also amplify the sounds of the bite. This type of technology can be used, for example, to relate a food to its brand, letting the consumer hear a sound attached to that brand while experiencing the food [52].
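
The trigger logic can be pictured as a simple edge-detection loop. The sketch below is only a guess at the general idea; the sensor and audio hooks and the threshold are hypothetical, not the published Chewing Jockey implementation [52].

    def chewing_loop(read_sensor, play_sound, threshold=0.5):
        """Poll a jaw-mounted light sensor and play a branded chew sound
        on each detected jaw closure (rising edge). Runs until interrupted."""
        was_closed = False
        while True:
            closed = read_sensor() > threshold  # jaw movement detected
            if closed and not was_closed:       # new bite event
                play_sound("brand_crunch.wav")
            was_closed = closed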

Sense of touch

The sense of touch (haptics) is the main and most used sense in daily human interactions. The use of the hand or finger in haptic interfaces is one of the main research fields in VR [87].

One of the first studies to demonstrate the effectiveness of tactile augmentation, a simple, safe and inexpensive technique for physically touching and tasting virtual objects, was performed by Hoffman et al. [88]. In a pilot study, participants located in an immersive virtual kitchen physically bit a virtual chocolate bar that carried the physical properties of real chocolate. Participants described the experience as more realistic and fun when physically biting the virtual chocolate bar than when merely imagining biting one in the virtual environment.

Iwata et al. [87] developed a food simulator as a haptic interface for biting. The simulator records a force profile from a person's bite in the real world and can then reproduce the bite force that represents the texture of the food. The device has a mechanical configuration that fits in the mouth and an integrated force sensor, combined with auditory and chemical displays for multimodal taste sensations. Tests were carried out with two virtual foods: a virtual rice cracker with an accompanying bite sound, and a virtual gummy candy with a chemical substance composed of the five basic tastes. For the virtual rice cracker, 21% of interviewees recognized the virtual food without any instruction, 66% recognized it after being informed that it was a virtual rice cracker, and 13% did not recognize it. For the virtual gummy candy, 40% recognized the virtual food without any instruction, 56% recognized it after being informed what it was, and 4% did not recognize it.
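
The record-and-replay idea behind the simulator can be sketched as follows; the sensor and actuator hooks are hypothetical stand-ins, not the interface of Iwata et al. [87].

    def record_bite_profile(read_force, samples=200):
        """Capture a force-vs-time profile from a real bite (assumed sensor hook)."""
        return [read_force() for _ in range(samples)]

    def replay_bite(profile, set_force):
        """Drive the haptic actuator with the stored profile so the jaw
        feels the recorded resistance of the food (assumed actuator hook)."""
        for force in profile:
            set_force(force)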

The Straw-like User Interface allows the user to experience the sensation of drinking in a virtual environment. The sensations are created based on actual data collected from an ordinary straw attached to the system. The elements that produce the sensation are the change of pressure in the mouth generated when food blocks the straw, the sound of collisions and friction of the food, and the vibrations that accompany the sound [89].

A system called the "Electric Food Texture System" was developed by Nijima and Ogawa [90] to simulate chewing foods of different textures using electrical muscle stimulation (EMS). The system consists of a database of real food textures, a part that provides the electrical stimuli, and a part for bite detection. There is no real food in the user's mouth; however, because of the electrical stimulation, users feel as if they were eating real food. The authors concluded that EMS was useful for presenting the elastic texture of foods (gummy candy and chocolate) but not for presenting harder textures (potato chips and rice crackers).

Sense of smell 

Smell (the olfactory system) is a primary chemical sense [80]. The sense of smell can evoke feelings; differentiate products, stores and brands; and impact the psychological state of the consumer.

Olfactory perception is composed of intensity estimation, qualitative description and hedonic tone [91]. Smells have been integrated into VR systems by means of olfactory displays that generate scented air from odor materials with the desired components and concentration, presenting the user with a more realistic experience [92]. Olfactory displays fall into two categories: ubiquitous, where a device not attached to the user transmits scent into an entire room or directly to the user's nose by tracking body and facial movements; and wearable, where a device is attached to the user (e.g. on an HMD) for closer scent delivery and a personal experience [93]. Olfactory displays, however, are difficult to load and are limited in the number of scents that can be stored, created and issued, in their delivery methods, and in the distance over which they can diffuse [8,92].

Odor simulation in VR environments offers users a means to evaluate products. By using virtual prototyping and user evaluation in these environments, industries can reduce the need to develop real prototypes. The use of scents has been shown to be an important factor in product evaluation and consumer intention, even for products not traditionally associated with odors, which can give a product a competitive advantage. Research in which users evaluated products in a virtual multisensory environment, using an HMD with an olfactory display, produced results similar to those obtained in real environments, showing that it is possible to evaluate products in VR environments with the use of olfactory displays [92].

State-of-the-art immersive VR technology coupled with an odor system was studied by Ischer et al. [7]. Their "Brain and Behavioral Laboratory-Immersive System" is easy to use and control and provides an immersive, interactive, three-dimensional environment that limits cross-contamination between odors, which can still be linked to images and sounds.

Sense of taste  

The sense of taste (the gustatory system) can detect and distinguish five different stimuli (bitter, salty, sweet, sour and umami) and is a very difficult response to convert virtually [78,94]. Taste is a multimodal sensation built from chemical substances and involves the senses of smell, hearing and touch. Sound is simple to synthesize virtually and smell can be delivered through vaporization, so the biggest problem is converting the taste sensation itself [87]. The stimulation methods used to simulate taste sensations digitally are divided into chemical and non-chemical approaches [95].

Virtual food uses electronics to emulate the in-mouth feel and taste of a real-world meal. This technology would be useful for providing sensory inputs and improving gastronomic experiences [96].

Ranasinghe et al. [97] studied the use of electricity to explore taste sensation as digital media. A multimodal bottle-and-spoon control system was used to trigger different taste sensations while users eat or drink, through visual stimulation (superimposing color with alternating LEDs) and tongue stimulation (controlled electrical pulses). The authors created an alternative way to make drinks taste better, enhancing sweetness, saltiness, bitterness and sourness without changing the content or using flavoring ingredients such as sugar, thereby preventing excessive use of chemicals and benefitting health and nutrition.

The Digital Lollipop was created to digitally simulate taste sensations through electrical stimuli on the human tongue. By manipulating the magnitude, frequency and polarity of the electric current, it is possible to create different types of stimuli [98].
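
These three parameters can be thought of as a small stimulus descriptor; the sketch below models one as a square wave. The waveform shape and the example values are assumptions for illustration, not the published Digital Lollipop settings [98].

    from dataclasses import dataclass

    @dataclass
    class TasteStimulus:
        magnitude_ua: float  # current magnitude in microamperes
        frequency_hz: float  # pulse frequency
        polarity: int        # +1 or -1, the direction of the current

    def current_at(stim, t):
        """Instantaneous current of a square-wave stimulus at time t (seconds)."""
        phase = (t * stim.frequency_hz) % 1.0
        return stim.polarity * stim.magnitude_ua if phase < 0.5 else 0.0

    # A hypothetical setting, purely illustrative:
    example = TasteStimulus(magnitude_ua=120.0, frequency_hz=100.0, polarity=1)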

Based on the study by Cruz and Green [99], which showed that heating the tip of the tongue results in perceived sweet taste sensations, several other studies were carried out. These have shown that thermal stimulation of the tongue or nose produces sweet sensations without the need for chemicals such as sugar in beverages [100-102].

Multisensory studies

Sester et al. [103] evaluated the effect of the environment on food choices, in particular on beverages. The authors developed two virtual environments simulating a bar, one with blue furniture (cold) and another with wooden furniture (warm), both with visual and musical stimuli projected on the wall. A first study (46 participants) concluded that these elements were sufficient to influence declared drink choices. A second study (120 participants) evaluated the robustness of the method, asking participants to choose one of five beers, and confirmed that choices followed the environment.

Bangcuyo et al. [104] studied the use of immersive techniques in the hedonic sensory evaluation of coffee. At each session, 50 participants evaluated five coffee samples in a traditional sensory analysis booth and then in a virtual coffeehouse; the tests were repeated one month apart. The preference for the coffees differed according to the condition in which they were evaluated, and the discriminative power of the hedonic scale improved in the contextual (virtual) environment, since the environment influenced participants' evaluations.

The consumer acceptability of coffee, as affected by situational and involvement conditions, was studied by Kim et al. [105]. The authors wanted to understand the effect of evoking sensations in different environments and evaluated two approaches: a simulated coffee shop and cognitive evocation (with or without evocative phrases). A total of 200 participants took part in the study. For the simulated condition, the room was made up to look like a common coffee shop; for the evocation condition, subjects were given a sheet of paper with evocative phrases and were instructed to read it and imagine the situation. After evaluating two samples, the results showed that consumers' tastes were influenced by the factors studied, mainly by the environment, which had the greater effect.

Hathaway and Simons [106] evaluated the taste of four brands of soft chocolate chip cookies in two tests involving 49 participants. The first test was performed in the traditional way, and the second in two virtual environments contextualized with audio-visual and olfactory cues related to the baking of cookies in a domestic kitchen. The first virtual environment presented mixed immersion information on a computer screen and a phone, with aroma dispersed in booths, while the second was fully immersive, with information presented on video walls, use of loudspeakers and hidden aroma diffusion. In this study the hedonic data were more discriminating and reliable, and consumer acceptance improved, in the fully immersive environment. The ability to resolve differences in cookie preference, and the engagement of participants, improved in the virtual environments.

A virtual food buffet where consumers could serve themselves a meal (carrots, pasta and chicken) was studied by Ung et al. [107]. Thirty-four participants took part in the study, each serving two meals, one at the virtual buffet and another at a fake-food buffet. The researchers observed that nutritional behavior, in terms of the amount of energy (in kJ) of the served food, did not change between the two, suggesting that the virtual food buffet is a useful research method.

Limitations and Future Challenges

Despite the research progress in HCI and HFI [15,78,79,127], much work remains before technology and food can be incorporated into industry product creation and evaluation workflows and brought to the consumer's dinner table. Multisensory experiences in HCI are not completely elucidated [78]; the main senses studied, from the standpoint of both input and output (feedback) mechanisms, are the visual, auditory and tactile. Relationships between HFI and the five senses have proved able to influence the user's food perception; nevertheless, several improvements are required, as each sense poses its own unique limitations. Technology for digitizing the chemical senses is still being explored, lags far behind that for the physical senses, and has many obstacles to surpass. Some of the limitations across the five senses are summarized in Table 1 and explained below.

Smell: generation, storage and diffusion of scents; synchronization, i.e. matching the intensity, duration and distribution rate of scents with the presented content; availability of and access to olfactory displays; and scent alteration, i.e. removing an emitted scent from the environment before emitting a new one [79,93,108].

Hearing: lack of integration with the other senses, since the focus is only on the user's perception of food texture while chewing or drinking, which may limit the effect on enjoyment, as only part of the eating experience is altered; foods without distinct sounds, since not all foods can be enhanced by sound feedback, as the chewing sounds of certain foods (e.g. soft foods like pasta) are not distinct or pronounced enough to augment; and configuration, since devices work only with preconfigured foods, whereas systems should recognize what the user is eating and automatically filter and augment the appropriate sound [52,109].

Taste: a small range of sensations, since experiments can generate and evaluate certain taste perceptions (e.g. sourness), but this range must grow to multiple sensations (e.g. sweetness, saltiness, bitterness) at varying intensities. There is also a lack of studies of foods with different viscosities, elasticities, adhesiveness, pH values and temperatures, or of mixtures of solids and liquids (e.g. eating and drinking consecutively, or eating a soup with pieces of solid food in it) [65].

Vision: overwhelming perceived food proportions, since if the perceived food size is drastically increased, eating can become overwhelming for participants, not only from a perceived-quantity point of view but also because the food no longer appears to fit inside the mouth for a bite; eating methods, which are restricted to using hands or preconfigured cutlery; and environment restrictions, since when the apparent food size is shrunk, the environment behind the food must be shown, and problems arise when the background is not a controlled real-world or virtual environment, as it is difficult to show what is behind the food without distortion [58].

Touch: hardness range, since studies have been limited to relatively hard and soft foods (e.g. crackers and cheese) and must expand to foods of intermediate hardness or with a mixture of hardness and softness; vibration locations, since vibrations are felt on the teeth, whereas real food is also felt on the lips and tongue; display shapes, since devices have flat surfaces while foods have varied surface shapes; and lack of integration with other senses, since more studies are needed on releasing chemical (taste) sensations with biting [52,87]. Electrical muscle devices to simulate biting have been developed, but they still lack precise control for varying the electrical signals over the course of a meal as the food changes shape and hardness [90]. Studies involving direct contact with the tongue and lips have difficulty providing pleasurable experiences [110]. Though there are studies on lip stimulation [111,112], they are still in their early stages and need to be explored further in food experiments.

An overall concern in HFI research is that high-fidelity multisensory virtual environments have been used in a variety of applications, but achieving this type of environment is costly. In an attempt to lower these costs, designers have considered the influence of single and multiple sensory modalities [80]. Food choice is influenced by multiple determinants [113] and offers a rich multisensory experience, so it is interesting to explore the interactions caused by the act of eating within the virtual environment [15], while also trying to reduce the associated costs.

For future studies, haptics is a field in constant development that can be further incorporated into HFI for virtual food sensing and interaction, and for assisting multimodal interactions with other senses. Studies have shown methods by which volume and texture can be felt and interacted with, whether through controllers with actuators [114], ultrasound [115], in-air feedback [116] or wearable glove devices [117,118]. Altering the perceived shape of elements by means of vision has also been studied [119], for example: changing perceived size and curvature, and therefore touch [120]; changing perceived weight [121]; and altering perceived hardness [122,123]. With the advancements in VR and AR technologies and their recent affordability in consumer devices, e.g. the HTC Vive, Oculus Rift, Google Cardboard and Microsoft HoloLens [124,125], a mixture of these technologies could be further incorporated into food research for sensation simulation.

In this area of study, where new discoveries appear every day, researchers face constant challenges, for instance: determining which technologies and sensory experiences to use and their intended effect on people; creating new multisensory designs; designing with the relationships between the senses in mind; and understanding the limitations when multi-sense information is monitored simultaneously [126,127]. The creation of immersive environments similar to real ones, and the development of methods that facilitate the collection of qualitative and quantitative parameters and improve sensory stimulation, are other examples of what needs to be studied and improved in this area of knowledge involving computers, food and people.


Table 1: Some limitations of research involving the human senses

Conclusions

Advances in the study of sensory digitization and virtual food production can change the way people interact with technology: how they eat, how they prepare food and what they feel. Technologies can be used by food brands and stores to present products in an interactive and fun way and to modify consumers' perception of those products. Immersive consumer studies, despite being difficult to perform, have been carried out in virtual environments ever closer to real-world environments; such environments aid consumer perception and modify opinions about the product, since the environment generates specific emotions, resulting in progressively more realistic and substantial reports. Studies involving computational technology, whether using computers, tablets or smartphones, are capable of improving and probing these aspects of consumer behavior, representing an emerging set of technologies capable of changing consumers' feelings, sensations and choices.

Conflict of Interest

The authors declare that they have no conflict of interest.

References

  1. Comber R, Choi JHJ, Hoonhout J, O'Hara K. Designing for human-food interaction: An introduction to the special issue on 'food and interaction design'. Int J Human-Computer Studies. 2014;72(2):181-184.
  2. Quaas S, Rudolph H, Luthardt RG. Direct mechanical data acquisition of dental impressions for the manufacturing of CAD/CAM restorations. J Dent. 2007 Dec;35(12):903-908.
  3. Petrov M, Talapov A, Robertson T, Lebedev A, Zhilyaev A, et al. Optical 3D digitizers: Bringing life to the virtual world. IEEE Computer Graphics and Applications. 1998;18(3):28-37.
  4. Pieraccini M, Guidi G, Atzeni C. 3D digitizing of cultural heritage. Journal of Cultural Heritage. 2001;2(1):63-70.
  5. Marra F. Virtualization of processes in food engineering. J Food Eng. 2016;176:1.
  6. Schöning J, Rogers Y, Krüger A. Digitally enhanced food. IEEE Pervasive Comput. 2012;11(3):4-6.
  7. Ischer M, Baron N, Mermoud C, Cayeux I, Porcherot C, et al. How incorporation of scents could enhance immersive virtual experiences. Front Psychol. 2014 Jul;5:736.
  8. Matsukura H, Yoneda T, Ishida H. Smelling screen: Development and evaluation of an olfactory display system for presenting a virtual odor source. IEEE Trans Vis Comput Graph. 2013;19(4):606-615.
  9. Obrist M, Velasco C, Vi C, Ranasinghe N, Israr A, et al. Sensing the future of HCI: Touch, taste, and smell user interfaces. Interactions. 2016;23(5):40-49.
  10. Dix A. Human-computer interaction (1st ed). In: Liu L, Ozsu T (Eds). Encyclopedia of database systems. US: Springer; 2009.
  11. Choi JHJ, Comber R, Linehan C. Food for thought. Interactions. 2013;20(1):46.
  12. Choi JHJ, Linehan C, Comber R, McCarthy J. Food for thought: Designing for critical reflection on food practices. Proceedings of the Designing Interactive Systems Conference. 2012;793-794.
  13. Comber R, Ganglbauer E, Choi JHJ, Hoonhout J, Rogers Y, et al. Food and interaction design: Designing for food in everyday life. Extended Abstracts on Human Factors in Computing Systems. 2012:2767-2770.
  14. Comber R, Hoonhout J, van Halteren A, Moynihan P, Olivier P. Food practices as situated action. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2013;2457-2466.
  15. Grimes A, Harper R. Celebratory technology: New directions for food research in HCI. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2008:467-476.
  16. Khot RA, Lupton D, Dolejšová M, Mueller FF. Future of food in the digital realm. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. 2017 May:1342-1345.
  17. Parker AG, McClendon I, Grevet C, Ayo V, Chung W, et al. I am what I eat: Identity & critical thinking in an online health forum for kids. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2013
  18. Turk M. Multimodal interaction: A review. Pattern Recognition Letters. 2014 Jan;36:189-195.
  19. Ernst MO, Bülthoff HH. Merging the senses into a robust percept. Trends in Cognitive Sciences. 2004 Apr;8(4):162-169.
  20. Murray MM, Molholm S, Michel CM, Heslenfeld DJ, Ritter W, et al. Grabbing your ear: Rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb Cortex. 2005 Jul;15(7):963-974.
  21. Obrist M. Mastering the senses in HCI: Towards multisensory interfaces. Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter, 2. 2017.
  22. Stein BE, Meredith MA. The merging of the senses (1st ed.). USA: The MIT Press.
  23. Vi CT, Ablart D, Gatti E, Velasco C, Obrist M. Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition. Int J Hum Comput Stud. 2017 Dec;108:1-14.
  24. Oviatt S. Ten myths of multimodal interaction. Communications of the ACM. 1999;42(11):74-81.
  25. Bhowmik AK. Interactive displays: Natural human-interface technologies (1st ed.). John Wiley & Sons; 2014.
  26. Biocca F, Kim J, Choi Y. Visual touch in virtual environments: An exploratory study of presence, multimodal interfaces, and crossmodal sensory illusions. Presence: Teleoperators and Virtual Environments. 2001;10(3):247-265.
  27. Slocombe BG, Carmichael DA, Simner J. Cross-modal tactile-taste interactions in food evaluations. Neuropsychologia. 2016 Jul;88:58-64.
  28. Lim J, Fujimaru T, Linscott TD. The role of congruency in taste-odor interactions. Food Quality and Preference. 2014 Jun;34:5-13.
  29. Ventanas S, Mustonen S, Puolanne E, Tuorila H. Odour and flavour perception in flavoured model systems: Influence of sodium chloride, umami compounds and serving temperature. Food Quality and Preference. 2010 Jul;21(5):453-462.
  30. Delwiche JF. You eat with your eyes first. Physiology & Behavior. 2012 Nov;107(4):502-504.
  31. Michel C, Velasco C, Gatti E, Spence C. A taste of Kandinsky: Assessing the influence of the artistic visual presentation of food on the dining experience. Flavour. 2014;3(1):7.
  32. Biggs L, Juravle G, Spence C. Haptic exploration of plateware alters the perceived texture and taste of food. Food Quality and Preference. 2016 Jun;50:129-134.
  33. Harrar V, Spence C. The taste of cutlery: How the taste of food is affected by the weight, size, shape, and colour of the cutlery used to eat it. Flavour. 2013;2(1):21.
  34. Carvalho FR, Wang QJ, van Ee R, Persoone D, Spence C. “Smooth operator”: Music modulates the perceived creaminess, sweetness, and bitterness of chocolate. Appetite. 2017;108:383-390.
  35. Knoeferle KM, Woods A, Käppler F, Spence C. That sounds sweet: Using cross-modal correspondences to communicate gustatory attributes. Psychology & Marketing. 2015;32(1):107-120.
  36. Jacquot M, Noel F, Velasco C, Spence C. On the colours of odours. Chemosensory Perception. 2016; 9(2):79-93.
  37. Jadauji JB, Djordjevic J, Lundström JN, Pack CC. Modulation of olfactory perception by visual cortex stimulation. Journal of Neuroscience. 2012 Feb;32(9):3095-3100.
  38. Croy I, Drechsler E, Hamilton P, Hummel T, Olausson H. Olfactory modulation of affective touch processing: A neurophysiological investigation. Neuroimage. 2016 Jul;135:135-141.
  39. Demattè ML, Sanabria D, Sugarman R, Spence C. Cross-modal interactions between olfaction and touch. Chemical Senses. 2006 May;31(4):291-300.
  40. Deroy O, Crisinel AS, Spence C. Crossmodal correspondences between odors and contingent features: Odors, musical notes, and geometrical shapes. Psychonomic Bulletin & Review. 2013 Oct;20(5):878-896.
  41. Howes D. Scent, sound and synaesthesia. In: Tilley C (ed). Handbook of Material Culture. London: Sage; 2006. p. 161-172.
  42. Lunghi C, Morrone MC, Alais D. Auditory and tactile signals combine to influence vision during binocular rivalry. Journal of Neuroscience. 2014 Jan;34(3):784-792.
  43. Thesen T, Vibell JF, Calvert GA, Österbauer RA. Neuroimaging of multisensory processing in vision, audition, touch, and olfaction. Cognitive Processing. 2004;5(2):84-93.
  44. Chen YC, Yeh SL, Spence C. Crossmodal constraints on human perceptual awareness: Auditory semantic modulation of binocular rivalry. Frontiers in Psychology. 2011 Sep;2:212.
  45. Shams L, Kamitani Y, Shimojo S. Visual illusion induced by sound. Cognitive Brain Research. 2002 Jun;14(1):147-152.
  46. Gillmeister H, Eimer M. Tactile enhancement of auditory detection and perceived loudness. Brain Research. 2007 Jul;1160:58-68.
  47. Lloyd DM, Merat N, McGlone F, Spence C. Crossmodal links between audition and touch in covert endogenous spatial attention. Perception & Psychophysics. 2003 Aug;65(6):901-924.
  48. Auvray M, Spence C. The multisensory perception of flavor. Consciousness and Cognition. 2008 Sep;17(3):1016-1031.
  49. Cytowic RE. Synesthesia: A union of the senses (2nd ed.). USA: The MIT press; 2002.
  50. Marks LE. The unity of the senses: Interrelations among the modalities (1st ed.). United Kingdom (UK): Academic Press; 1978.
  51. Hirose M, Iwasaki K, Nojiri K, Takeda M, Sugiura Y, et al. Gravitamine spice: A system that changes the perception of eating through virtual weight sensation. Transactions of the Virtual Reality Society of Japan. 2014;19(4):541-550.
  52. Koizumi N, Tanaka H, Uema Y, Inami M. Chewing jockey: Augmented food texture by using sound based on the cross-modal effect. Proceedings of the 8th International Conference on Advances in Computer Entertainment Technology. 2011;21.
  53. Narumi T, Kajinami T, Tanikawa T, Hirose M. Meta cookie. Proceedings of the ACM SIGGRAPH 2010 Emerging Technologies. 2010;18.
  54. Azuma RT. A survey of augmented reality. Presence: Teleoperators and Virtual Environments. 1997;6(4):355-385. 
  55. National Research Council. Virtual reality: Scientific and technological challenges. Washington, DC: The National Academies Press; 1995.
  56. P/N. The Project Nourished. 2018.
  57. Loijens LS. Augmented reality for food marketers and consumers. Wageningen: Wageningen Academic Publishers; 2017.
  58. Narumi T, Ban Y, Kajinami T, Tanikawa T, Hirose M. Augmented perception of satiety: Controlling food consumption by changing apparent size of food with augmented reality. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2012:109-118.
  59. Sakurai S, Narumi T, Ban Y, Tanikawa T, Hirose M. CalibraTable: Tabletop system for influencing eating behavior. Proceedings of the SIGGRAPH Asia 2015 Emerging Technologies. 2015;4:2-6.
  60. Suzuki E, Narumi T, Sakurai S, Tanikawa T, Hirose M. Changing drinking behavior and beverage consumption using augmented reality. In: Yamamoto S (Ed). Human interface and the management of information. Lecture Notes in Computer Science. Switzerland: Springer; 2015. p. 648-660.
  61. Javornik A. Augmented reality: Research agenda for studying the impact of its media characteristics on consumer behaviour. Journal of Retailing and Consumer Services. 2016;30:252-261.
  62. McCormick H, Cartwright J, Perry P, Barnes L, Lynch S, et al. Fashion retailing: Past, present and future. Textile Progress. 2014;46(3):227-321.
  63. Ahn J, Williamson J, Gartrell M, Han R, Lv Q, et al. Supporting healthy grocery shopping via mobile augmented reality. ACM Transactions on Multimedia Computing, Communications and Applications (TOMM). 2015;12(1):16-24.
  64. Inamo. Inamo London. 2018.
  65. Ranasinghe N, Jain P, Karwita S, Do EYL. Virtual lemonade: Let's teleport your lemonade! Proceedings of the Tenth International Conference on Tangible, Embedded, and Embodied Interaction. 2017;183-190.
  66. Meibner M, Pfeiffer J, Pfeiffer T, Oppewal H. Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. Journal of Business Research. 2017.
  67. BS ISO/IEC 18004:2006. Information technology. Automatic identification and data capture techniques. QR Code. 2006;126.
  68. Soon TJ. QR code. Synthesis Journal. 2008;59-78.
  69. Liu Y, Yang J, Liu M. Recognition of QR Code with mobile phones. Chinese Control and Decision Conference. 2008;203-206.
  70. QKies: Sag's mit Keksen [Say it with cookies]. 2018.
  71. Wegrzyn TF, Golding M, Archer RH. Food layered manufacture: A new process for constructing solid foods. Trends in Food Science & Technology. 2012;27(2):66-72.
  72. The Candy Fab Project. 2018.
  73. Choc Edge Limited-3D ALM Chocolate Printing. 2018.
  74. Soares S, Forkes A. Insects au gratin: An investigation into the experiences of developing a 3D printer that uses insect protein based flour as a building medium for the production of sustainable food. Proceedings of the 16th International Conference on Engineering and Product Design Education. 2014;426-431.
  75. Bocusini-Printing Food Naturally. 2018.
  76. Bult JH, de Wijk RA, Hummel T. Investigations on multimodal sensory integration: Texture, taste, and ortho- and retronasal olfactory stimuli in concert. Neuroscience Letters. 2007;411(1):6-10.
  77. Krishna A. An integrative review of sensory marketing: Engaging the senses to affect perception, judgment and behavior. Journal of Consumer Psychology. 2012;22(3):332-351.
  78. Obrist M, Gatti E, Maggioni E. Multisensory experiences in HCI. IEEE Multimedia. 2017;24(2):9-13.
  79. Spence C, Obrist M, Velasco C, Ranasinghe N. Digitizing the chemical senses: Possibilities & pitfalls. International Journal of Human-Computer Studies. 2017 Nov;107(C):62-74.
  80. Ramic-Brkic B, Chalmers A. Olfactory adaptation in virtual environments. ACM Transactions on Applied Perception. 2014 Jul;11(2):6-16.
  81. Rollo ME, Bucher T, Smith SP. ServAR: An augmented reality tool to guide the serving of food. International Journal of Behavioral Nutrition and Physical Activity. 2017;14(1):1-10.
  82. Khushaba RN, Wise C, Kodagoda S. Consumer neuroscience: Assessing the brain response to marketing stimuli using electroencephalogram (EEG) and eye tracking. Expert Systems with Applications. 2013 Jul;40(9):3803-3812.
  83. Hale KS, Stanney KM. Handbook of virtual environments: Design, implementation, and applications. CRC Press; 2018:2.
  84. Kohlrausch A, Braasch J, Kolossa D. An introduction to binaural processing. The Technology of Binaural Listening. 2013:1-32.
  85. Carvalho FR, Steenhaut K, van Ee R, Touhafi A. Sound-enhanced gustatory experiences and technology. Proceedings of the 1st Workshop on Multi-sensorial Approaches to Human-Food Interaction, Japan; 2016 Nov;5.
  86. The Fat Duck. 2018.
  87. Iwata H, Yano H, Uemura T. Food simulator: A haptic interface for biting. Proceedings of the 2004 IEEE Virtual Reality; 2004:51-57.
  88. Hoffman HG, Hollander A, Schroder K. Physically touching and tasting virtual objects enhances the realism of virtual experiences. Virtual Reality. 1998;3(4):226-234.
  89. Hashimoto Y, Nagaya N, Kojima M. Straw-like user interface: Virtual experience of the sensation of drinking using a straw. Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology. 2006:42.
  90. Nijima A, Ogawa T. A study on virtual food texture by electrical muscle stimulation. Transactions of the Virtual Reality Society of Japan. 2016;21(4):575-583.
  91. Harel D, Carmel L, Lancet D. Towards an odor communication system. Computational Biology and Chemistry. 2003 May;27(2):121-133.
  92. Carulli M, Bordegoni M, Cugini U. Integrating scents simulation in virtual reality multisensory environment for industrial products evaluation. Computer-Aided Design and Applications. 2015;13(3):320-328.
  93. Chen Y. Olfactory display: Development and application in virtual reality therapy. Artificial Reality and Telexistence-Workshops. 2006;1:580-584.
  94. Zhang Y, Hoon MA, Chandrashekar J. Coding of sweet, bitter, and umami tastes: Different receptor cells sharing similar signaling pathways. Cell. 2003;112(3):293-301.
  95. Ranasinghe N, Cheok A, Nakatsu R, Do EYL. Simulating the sensation of taste for immersive experiences. Proceedings of the 2013 ACM International Workshop on Immersive Media Experiences. 2013 Oct:29-34.
  96. Turk V. Virtual food tech adds bite to VR. New Scientist [Internet]. 2016.
  97. Ranasinghe N, Suthokumar G, Do EY-L. Virtual ingredients for food and beverages to create immersive taste experiences. Multimedia Tools and Applications. 2016 Oct;75(20):12291-12309.
  98. Ranasinghe N, Do EY-L. Digital lollipop: Studying electrical stimulation on the human tongue to simulate taste sensations. ACM Transactions on Multimedia Computing, Communications and Applications. 2016 Oct;13(1):1-21.
  99. Cruz A, Green BG. Thermal stimulation of taste. Nature. 2000 Feb;403:889-892.
  100. Ranasinghe N, Do EYL. Virtual sweet: Simulating sweet sensation using thermal stimulation on the tip of the tongue. Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 2016:127-128.
  101. Samshir NA, Johari N, Karunanayaka K, Cheok AD. Thermal sweet taste machine for multisensory internet. Proceedings of the 4th International Conference on Human Agent Interaction. 2016;325-328.
  102. Suzuki C, Narumi T, Tanikawa T, Hirose M. Affecting tumbler: Affecting our flavor perception with thermal feedback. Proceedings of the 11th Conference on Advances in Computer Entertainment Technology. 2014;19.
  103. Sester C, Deroy O, Sutan A, Galia F, Desmarchelier J-F, et al. Having a drink in a bar: An immersive approach to explore the effects of context on drink choice. Food Quality and Preference. 2013;28:23-31.
  104. Bangcuyo RG, Smith KJ, Zumach JL, Pierce AM, Guttman GA, et al. The use of immersive technologies to improve consumer testing: The role of ecological validity, context and engagement in evaluating coffee. Food Quality and Preference. 2015;41:84-95.
  105. Kim S-E, Lee SM, Kim KO. Consumer acceptability of coffee as affected by situational conditions and involvement. Food Quality and Preference. 2016;52:124-132.
  106. Hathaway D, Simons CT. The impact of multiple immersion levels on data quality and panelist engagement for the evaluation of cookies under a preparation-based scenario. Food Quality and Preference. 2017;57:114-125.
  107. Ung C-Y, Menozzi M, Hartmann C, Siegrist M. Innovations in consumer research: The virtual food buffet. Food Quality and Preference. 2018;63:12-17.
  108. Murray N, Ademoye OA, Ghinea G, Muntean GM. A tutorial for olfaction-based multisensorial media application design and evaluation. ACM Computing Surveys (CSUR). 2017;50(5):67.
  109. Endo H, Ino S, Fujisaki W. The effect of a crunchy pseudo-chewing sound on perceived texture of softened foods. Physiol Behav. 2016 Dec;167:324-331.
  110. Ranasinghe N, Suthokumar G, Lee KY, Do EYL. Digital flavor: Towards digitally simulating virtual flavors. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction. 2015;139-146.
  111. Maezawa H, Matsuhashi M, Yoshida K, Mima T, Nagamine T, et al. Evaluation of lip sensory disturbance using somatosensory evoked magnetic fields. Clin Neurophysiol. 2014 Feb;125(2):363-369.
  112. Tsutsui Y, Hirota K, Nojima T, Ikei Y. High-resolution tactile display for lips. In: Yamamoto S (ed). Human interface and the management of information: Applications and services, Lecture Notes in Computer Science. Switzerland: Springer; 2016. p. 357-366.
  113. Mikkelsen BE, Bucher T, Hieke S, Verain MCD, Puttelaar J. Measuring food choice and consumption behaviour with real, fake or virtual food realities: A comparative approach from the RICHFIELDS program. 10th International Conference on Methods and Techniques in Behavioral Research. 2016;88-95.
  114. Benko H, Holz C, Sinclair M, Ofek E. NormalTouch and TextureTouch: High-fidelity 3D haptic shape rendering on handheld virtual reality controllers. Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 2016;717-728.
  115. Carter T, Seah SA, Long B, Drinkwater B, Subramanian S. UltraHaptics: Multi-point mid-air haptic feedback for touch surfaces. Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. 2013;505-514.
  116. Sodhi R, Poupyrev I, Glisson M, Israr A. AIREAL: Interactive tactile experiences in free air. ACM Transactions on Graphics (TOG), SIGGRAPH 2013 Conference Proceedings. 2013;32(4):134.
  117. Israr A, Zhao S, McIntosh K, Kang J, Schwemler Z, et al. Po2: Augmented haptics for interactive gameplay. ACM SIGGRAPH 2015 Emerging Technologies. 2015;21.
  118. Scheggi S, Meli L, Pacchierotti C, Prattichizzo D. Touch the virtual reality: Using the Leap Motion controller for hand tracking and wearable tactile devices for immersive haptic rendering. ACM SIGGRAPH 2015 Posters. 2015;31.
  119. Lécuyer A. Simulating haptic feedback using vision: A survey of research and applications of pseudo-haptic feedback. Presence: Teleoperators and Virtual Environments. 2009;18(1):39-53.
  120. Ban Y, Narumi T, Tanikawa T, Hirose M. Pot the magic pot: Interactive modification of the perceived angular shape. ACM SIGGRAPH 2013 Posters. 2013;37.
  121. Taima Y, Ban Y, Narumi T, Tanikawa T, Hirose M. Controlling fatigue while lifting objects using pseudo-haptics in a mixed reality space. 2014 IEEE Haptics Symposium (HAPTICS). 2014:175-180.
  122. Gaffary Y, Le Gouis B, Marchal M, Argelaguet F, et al. AR feels "softer" than VR: Haptic perception of stiffness in augmented versus virtual reality. IEEE Transactions on Visualization and Computer Graphics. 2017;23(11):2372-2377.
  123. Punpongsanon P, Iwai D, Sato K. SoftAR: Visually manipulating haptic softness perception in spatial augmented reality. IEEE Transactions on Visualization and Computer Graphics. 2015;21(11):1279-1288.
  124. Dingman H. The state of VR: Where Oculus Rift, HTC Vive, Gear VR, and others stand right now. PCWorld. 2015.
  125. Rauschnabel PA, Ro YK. Augmented reality smart glasses: An investigation of technology acceptance drivers. International Journal of Technology Marketing. 2016;11(2):123-148.
  126. Chu C-CP, Dani TH, Gadh R. Multi-sensory user interface for a virtual-reality-based computer aided design system. Computer-Aided Design. 1997;29(10):709-725.
  127. Velasco C, Carvalho FR, Petit O, Nijholt A. A multisensory approach for the design of food and drink enhancing sonic systems. Proceedings of the 1st Workshop on Multi-sensorial Approaches to Human-Food Interaction. 2016.