JCC Vol.9 No.9, September 2021
A Survey on Simulation for Weight Perception in Virtual Reality
Abstract: Virtual reality (VR) technology can provide users with an immersive experience, as if they were in the real world, and can be applied in fields such as entertainment, education, and scientific research. The design of multimodal feedback is an important component of improving presence and immersion in VR. In particular, simulating the weight of virtual objects poses many challenges due to hardware and software limitations. Many researchers have addressed this issue in various ways, and the resulting methods fall mainly into two categories: device-based simulation and software-based simulation. This paper surveys both, with a particular focus on software-based simulation and the virtual feedback methods proposed in recent years. We introduce the background of these methods, their technical implementation principles, application scenarios, advantages and disadvantages, and evaluation criteria. We also discuss future challenges and the development of simulation methods for weight perception of virtual objects in VR.

1. Introduction

To improve the naturalness and user experience of virtual reality systems, it is necessary to simulate the physical properties of objects, including their weight. Research has shown that appropriate physical-attribute feedback in a virtual reality system can effectively improve the user's sense of immersion [1] [2]. A variety of technologies have been introduced to make virtual reality systems more realistic [3] - [10], and the weight simulation of objects has application potential in such systems [11]. For example, in virtual training scenes, students can feel the weight of equipment and thus better allocate physical effort during real use; in virtual reality games, weight feedback from objects increases the user's immersion. First, simulating the weight of virtual objects is a need of VR application scenarios. Second, the fidelity of current weight simulation methods is not yet delicate enough to match the real world, so the methods need further research and development. In addition, most methods can only be used in specific scenarios, and no single method can cope with the majority of daily applications.

The research and development of simulation methods for the weight of virtual objects promotes, on the one hand, the understanding of the mechanisms by which people perceive the weight of objects in virtual reality systems, for example, how the addition of visual feedback affects users' weight evaluation of virtual objects. On the other hand, it promotes the development of interactive devices usable in virtual reality, where the application of robotics and new materials remains to be further explored.

To this end, we investigate the state of the art in research and development of haptic weight simulation. To delimit the scope of the survey, this article first explains the definition and functional requirements of virtual weight simulation methods, then introduces the classification of such methods, including their realization principles and corresponding application scenarios, followed by descriptions of commonly used indicators and methods for evaluating them. Finally, future developments and challenges are pointed out.

2. Definition and Quantified Metrics of Simulation Methods of Haptic Weight

2.1. Definition

In the real world, our assessment of the weight of objects relies on direct and indirect evaluation [12]. Direct evaluation means that the user picks up the object and judges its weight through the force feedback felt by the hand; indirect evaluation means that the weight is assessed through tools or other information, such as the reading of an electronic scale. In virtual reality, users mainly obtain information through vision, and it is difficult for the weight of virtual objects to be reflected visually during interaction. A typical virtual reality system uses a handheld controller to interact with objects, so the weight the user feels during interaction is actually the weight of the controller. The user's perception of the weight of virtual objects includes the mass the user actually feels, the weight estimate of the virtual object, and the ability to distinguish weight differences between virtual objects. Virtual weight is very useful in some scenes. For example, in VR games, a good sense of the weight of a basketball can help users judge how much force to use when shooting; in collaborative architectural design, the weight of components can help designers understand the overall weight distribution of the work. This article mainly surveys weight simulation methods for virtual objects, which usually use physical simulation [13] [14] [15], namely tactile and force feedback, to provide a sense of weight [1]. Other channels can also help users perceive the weight of virtual objects, such as special visual feedback [16] [17] [18] [19]: the weight of a virtual object is non-visual information, but visual feedback can still help users perceive it. Over the past decades a large number of haptic weight simulation methods have been proposed, and it is impossible to cover all of them in one survey paper.

2.2. Functions and Requirements

As mentioned above, haptic weight sensation includes direct perception of the mass a virtual object has, or indirect reflection of its weight during interaction. To fulfill this conception, the functions of a haptic weight simulation method need to be defined. First, simulation methods should help users distinguish objects of different mass. The proposed technique should be able to reflect the weight of a virtual object; since there may be many virtual objects in VR for users to manipulate, it should produce a noticeable difference when the user interacts with two objects of noticeably different weight, especially compared with objects that receive no simulation. A virtual environment may not provide users with accurate haptic weight perception, but differentiated weight perception does help bring the experience closer to reality.

Second, the haptic weight simulation method should do no, or little, damage to the sense of experience. Haptic weight perception is meant to simulate actual experience in reality and make virtual reality more realistic. The proposed methods should feel natural during interaction with objects while providing weight simulation that helps users perceive weight; improper simulation methods will harm users' immersion or presence.

3. Taxonomy of Haptic Weight Simulation Methods

For users, virtual objects are weightless in reality. The virtual reality system allows users to observe virtual objects through visual simulation. The systems used by general consumers are only equipped with handheld controllers, and to maintain universality, VR game manufacturers have developed interaction logic around the controller. When a virtual object is touched or held via the controller, physical properties such as its material and weight cannot be sensed. In laboratory research there are fewer restrictions on virtual reality interaction devices, and additional devices such as data gloves or customized feedback devices can be added. However, additional feedback devices increase cost, so there are also software-based simulation methods that make users form an illusion of weight by changing the content displayed in virtual reality or through other non-device-related means.

Each simulation method has its specific application scenarios, along with its own advantages and disadvantages. In research on full-hand force feedback, the sensory discrimination ability (i.e., the Weber fraction) when using the simulation device was also investigated, and the performance of the full-hand force feedback device during user interaction was discussed. Such a feedback device is not intended to replace other force feedback systems; each device has its specific application scenarios, and for each scenario there is a most suitable force feedback system or method [20]. Haptic weight simulation methods can be categorized in different ways, for example by feedback source, by supported gestures, or by device dependence, as in the composed taxonomy proposed by Revue et al. shown in Figure 1 [21]. In this survey, simulation methods are classified by device dependence into device-based and software-based methods.
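The Weber fraction mentioned above quantifies discrimination ability as the just-noticeable difference (JND) divided by the reference stimulus intensity. As a minimal illustrative sketch (not code from any cited system; the values are hypothetical):

```python
def weber_fraction(jnd: float, reference: float) -> float:
    """Weber fraction k = JND / reference intensity.

    A smaller k means finer discrimination: a user who can just
    tell a 500 g reference from 550 g has k = 0.1 for weight.
    """
    if reference <= 0:
        raise ValueError("reference stimulus must be positive")
    return jnd / reference


# Hypothetical example: the smallest detectable increment over a
# 500 g reference weight is 50 g.
k = weber_fraction(jnd=50.0, reference=500.0)  # -> 0.1
```

A larger Weber fraction measured with a simulation device than with real weights would indicate that the device degrades the user's discrimination ability.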

3.1. Device-Based Simulation Method

Many VR controllers use vibration to notify the user that they have touched a virtual object, but it is almost impossible to provide weight perception through controller vibration alone. The most direct way to simulate the weight of a virtual object with a device is to let the user hold a real object corresponding to the virtual one, but this method is obviously not very versatile. Simulating the weight of a virtual object does not require fully reproducing its mass and shape; it only requires simulating the force feedback it gives the user, which makes the problem much simpler. Therefore, force feedback devices can also help the user perceive the weight of virtual objects. The difference between weight feedback and general force feedback is that force feedback can simulate forces in multiple dimensions, whereas weight feedback only simulates force in the vertical direction; weight feedback is thus a special case of force feedback.
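The vertical-only character of weight feedback can be made concrete: a weight-rendering device only needs to output the static gravitational force m·g, plus an inertial term m·a while the object is being accelerated. A minimal sketch under that assumption (the function name is illustrative, not from any cited device):

```python
G = 9.81  # standard gravitational acceleration, m/s^2


def weight_feedback_force(mass_kg: float, vertical_accel: float = 0.0) -> float:
    """Vertical force (N) a weight-feedback device would render.

    Static case: F = m*g. During motion the inertial term m*a is
    added, so an object being accelerated upward (a > 0) feels
    heavier than the same object at rest.
    """
    return mass_kg * (G + vertical_accel)


# A 2 kg virtual object at rest ...
static = weight_feedback_force(2.0)        # 19.62 N
# ... and the same object accelerated upward at 1 m/s^2.
moving = weight_feedback_force(2.0, 1.0)   # 21.62 N
```

This also illustrates why inertial simulation (Section 3.1.2) is treated separately from static weight: the rendered force must track the object's acceleration, not just its mass.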

Early weight simulation equipment was grounded and bulky, such as PHANTOM [22]. This type of equipment usually has a fixed base from which to provide force. When simulating the weight of an object, the simulated weight range can be relatively large, but the working range is generally limited, flexibility is lacking, and usage is restricted. In recent years, handheld and wearable devices have been favored by commercial companies and researchers; thanks to their flexibility and portability, various applications can be developed on top of them.

Figure 1. Two dimensions to classify current haptic technologies.

At present, there are commercial data glove products with force feedback, such as the CyberGrasp™ data glove [23], Dexmo [24], and Rutgers Master II [14], which provide force feedback at the palm or wrist. There is also specialized weight simulation equipment in research, such as PHANTOM [22], SPIDAR G&G [25], HIRO III [26], and so on. The simulation of the weight of a virtual object falls mainly into two types: simulating the gravity of an object at rest, and simulating the inertia of an object in motion. Simulating an object's inertia includes changes of gravity and of the center of gravity; a change of the object's center of gravity can be simulated, for example, by transferring liquid metal [27]. Some simulation devices are shown in Figure 2.

3.1.1. Simulation of Static Weight

Simulation equipment driven by electric motors.

Force feedback is generally provided by a mechanical structure, and a motor is a common component in these devices. For different interaction methods and application scenarios, the devices take various forms: grounded devices such as Master Finger 2, and wearable devices such as Grabity, PIVOT, and CLAW.

Figure 2. Device for simulating the weight of virtual objects.

Master Finger 2 is composed of exoskeletal mechanical bones on the index finger and thumb, pressure sensors on the finger cots, and supporting equipment. While the user interacts with virtual objects, the damping of the mechanical arm can be changed via the motor and pressure sensors to provide force feedback. The device is suitable for grabbing or grasping tasks, but since it is fixed to the desktop, its range of motion is limited [28].

Choi et al. designed a controller for grasping named Grabity, a wearable pseudo-haptic interaction device. It mainly simulates the stiffness and weight of objects the user grasps through force feedback units installed on the index finger and thumb. Compared with grounded force feedback devices (such as Master Finger 2), the device is small and can be moved over a large range in a virtual environment. It mainly provides feedback of normal and tangential forces, which is convenient for simulating the weight and inertia of objects. Two user studies showed that the device can simulate different levels of weight and force feedback well and has great potential. The paper also investigated users' comfort when using the device through questionnaires, and the results showed relatively high user acceptance [29]. Haptic PIVOT is a device strapped to the wrist with a rotatable handle: when the user grabs or throws an object in virtual reality, the motor-driven handle rotates from the wrist into the user's palm. The force applied to the palm can change the user's perception of the object. Using PIVOT on both hands can simulate lifting a heavier object in a virtual scene, and the device can simulate the weight of an object or the inertial change during throwing or catching [30].

Other devices, such as CLAW, can also provide haptic weight perception. CLAW is a handheld multifunctional haptic feedback device specially designed for grasping, touching, and triggering objects. In addition to conventional controller functions, it has an auxiliary feedback arm connected to the index finger, which tracks the finger's movement and uses motors to control damping or force feedback. CLAW was well received by users in object-grasping tasks and can provide natural, effective, and comfortable feedback [31].

Simulation equipment driven by air jets.

In addition to motors, force feedback can also be provided by the thrust that jet propellers generate during operation. Weight simulation equipment driven by air jets includes Thor's Hammer, Aero-plane, etc.

Thor’s Hammer is a handheld force feedback device that uses propellers to provide force feedback. It is shaped like a hammer, with a propeller on each surface of the hammer head, and can provide a maximum force of 4 N, and a minimum of less than 0.11 N, in a single direction [32].

Similarly, drones driven by jet propellers can serve as a source of force feedback. UAVs have the advantages of flexibility and portability; video transmission and remote monitoring are their more popular applications, but UAVs can also be combined with virtual reality for user interaction. One approach is to have the drone follow the user's movement: when the user needs to touch an object, material carried by the drone lets the user touch it and perceive its shape, and when the user needs to grab an object, the drone's own braking force provides the user with a sense of weight. This method is more flexible than current force feedback devices, is not limited by device location, and can quickly provide the appropriate weight when grasping different objects [13].

Other simulation equipment.

In addition to the above two categories of force feedback sources, there are some more novel simulation devices, such as GravityCup. GravityCup simulates the force feedback of actions such as pouring water in virtual reality applications. It is a container equipped with a water pump and a corresponding tube: when the user holds a virtual cup, the pump injects water into, or drains water from, the container in real time to change its weight. Because its content is liquid, it can also simulate well the change of the center of gravity during interaction [33]. There are also devices that simulate the weight of objects by stretching the skin [34] [35].

3.1.2. Inertial Simulation

In addition to direct grasping tasks, the weight of an object can also be estimated in other ways during virtual reality interaction, for example from the inertial force feedback during the movement of a virtual object. Inertia simulation methods divide into those that change the center of gravity of the equipment and those that change the inertial force.

Change the center of gravity of the equipment.

During the movement of a virtual object, its center of gravity may shift. When a device is used for simulation, the position of the device's center of gravity can be changed to provide similar feedback. This type of equipment generally contains a movable mass, and the position of this slider is changed according to the change of the simulated object's center of gravity, so that the user perceives the shift.
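The slider position needed to realize a target center of gravity follows from a one-dimensional moment balance between the device body and the movable mass. The following sketch is illustrative only; the masses and positions are hypothetical, not taken from any cited device:

```python
def slider_position(target_cog: float, body_mass: float,
                    body_cog: float, slider_mass: float) -> float:
    """Solve the 1-D moment balance
        target_cog = (body_mass*body_cog + slider_mass*x) / (body_mass + slider_mass)
    for x, the position the movable mass must slide to so the
    combined center of gravity lands at target_cog (all positions
    measured along the rail, in metres)."""
    total = body_mass + slider_mass
    return (target_cog * total - body_mass * body_cog) / slider_mass


# Hypothetical handle: 0.4 kg body with its CoG at 0.10 m, plus a
# 0.1 kg slider. Shifting the combined CoG to 0.14 m requires the
# slider to move to 0.30 m.
x = slider_position(target_cog=0.14, body_mass=0.4,
                    body_cog=0.10, slider_mass=0.1)
```

The limited rail length is what bounds the range of center-of-gravity shifts such a device can simulate, as noted for the slide-rail design below.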

ElastOscillation is a 3D multi-level force feedback device that mainly simulates the vibration feedback of virtual objects due to motion during interaction. It consists of six elastic bands and corresponding motors controlling the device's center-of-gravity transformation. Typical use cases are the experience of a vibrating fishing rod or of swirling a wine glass; the device provides two-dimensional center-of-gravity changes. In addition to enhancing the user experience by changing the device's center of gravity, it also allows users to change how they hold the device to get a closer-to-real experience [36].

Ryu et al. proposed ElaStick, a device used to evaluate weight through shaking or swinging. It has a movable part that controls the swing and simulates the inertial feedback force of different objects during shaking or swinging with different amplitudes, so that users can form an estimate of the virtual object's weight [37].

Onishi et al. designed BouncyScreen, a force feedback device that simulates the movement of objects. It consists of a display screen mounted on a one-dimensional actuated stage and is operated with a handle. When the user pushes an object with the handle, the display moves with the handle while providing damping feedback, and the objects on the display show corresponding visual effects. User studies showed that this device and simulation method can effectively enhance the realism and presence of interaction in virtual reality [38].

Most weight simulation devices need to be grounded, which is inconvenient for large-scale movement in a virtual scene; therefore, ungrounded force feedback devices have attracted more and more interest. Ginga et al. proposed a non-grounded force feedback device called HapSticks. The device is shaped like chopsticks and can provide stable virtual weight feedback; however, compared with real weights, users' ability to recognize different weights in virtual reality is worse when using it. Experimental data show obvious differences in weight estimation between users under the same simulation paradigm; if users are to make a uniform estimate of the weight of the same object, the weight feedback must be adjusted more precisely [39].

Transcalibur is a handheld weight simulation device containing two robotic arms. The center of gravity of the controller is changed by varying how far the arms extend or retract, which can simulate the feeling of grasping different virtual objects in the hand and can even simulate dynamic changes in an object's center of gravity. It does not need to exactly match the weight of virtual and real objects; several preset models help users experience a realistic feeling of holding different virtual objects [15].

A German research team designed Drag:on, a handheld simulation device that provides resistance and center-of-gravity changes. Its structure is similar to Transcalibur: it has two mechanical arms as regulators of the center of gravity, but its arms, shaped like folding fans, can be controlled to expand or close, allowing a more refined simulation of weight transfer [40].

Sagheb et al. proposed SWISH, an interactive device for center-of-gravity simulation, used to simulate the shifting center of gravity of liquid in a container. The principle is that a mass inside the device is movable, so the simulated object's center-of-gravity position can be changed in real time as needed [41].

There is also DPHF (dynamic passive haptic feedback), which changes the mass distribution of a handheld device to provide feedback on changes in the weight of virtual objects. This device can be combined with other feedback methods, such as HR (haptic retargeting), to provide more convenient interaction. HR changes the movement of the real hand by controlling the rendering of the virtual hand; in essence, the mismatch between virtual reality and reality creates the illusion [42] [43].

Vibration, center-of-gravity feedback, and visual display feedback can also be combined: during interaction, a movable weight-changing mechanism simulates the weight change of the object, and vibration motors on the left and right sides provide additional feedback. Even if the weight of the controller the user holds stays the same, different weight experiences can be obtained through this feedback. In the reported user study, the left and right vibration motors could be controlled separately. The center-of-gravity mechanism at the back of the device is a mass that slides on a rail, changing position to simulate a one-dimensional shift of the center of gravity, although the range of shifts that can be simulated is limited [44].

Some researchers have proposed that by modeling and analyzing the object to be simulated, its shape and center-of-gravity data can be mapped onto a simplified handle design to give users a more realistic object experience. For example, easily cut materials can be used to reproduce the shape, and a counterweight can be placed at the corresponding simulated center-of-gravity position to obtain a similar experience [45].

Change the inertial force.

In addition to changing the center of gravity distribution of interactive devices, inertial feedback can also be provided for objects in motion in other ways.

DeltaTouch is a wearable 3D haptic feedback device that simulates the weight, shape, material, and other attributes of objects by controlling the force feedback delivered to the palm [46]. A weight evaluation experiment showed that the device allows users to distinguish differences in weight.

Seungwoo et al. proposed a force feedback handheld controller based on micro jet propellers, which can provide up to 14 N of moving weight simulation within 0.3 s. It mainly addresses the problem that ungrounded force feedback devices cannot simulate inertial force well when simulating virtual objects in virtual reality. The device can realistically simulate weight changes in the 2D plane and has the advantage of lightness [47]. The Thor's Hammer mentioned above can also simulate inertial forces.

Other methods.

In real life, users often confirm the amount of liquid in a container by shaking it, thereby judging the object's weight. Based on this principle, researchers designed a device that simulates the inertial force changes of a container's contents during shaking. User studies show that with this feedback users' weight perception is more realistic than without it, and the method lets them assess the amount of content more accurately [48].

In addition to exoskeleton-type force feedback devices, there are more flexible soft skin-stretching devices, which generally use silicone elastic materials and are mostly used as wearables in virtual scene interaction. Such a device is flexible, thin, and portable, requires no complex external mechanical skeleton, and can be combined with data gloves to reduce the weight of the feedback device and its impact on interaction in the virtual scene; it has very high application value [49].

In addition to the direct use of force feedback devices, the use of other feedback channels such as vibration, sound and visual feedback can also help users evaluate the magnitude of force [50].

When we encounter obstacles in the real world, we feel obstructed and cannot pass through them directly, but in the virtual world objects can be passed through. To address this, some researchers have proposed providing blocking feedback through electrical stimulation of the shoulder, arm, and wrist muscle groups, producing a reaction force that pulls the user's arm back. This method can also produce a downward reaction force that makes the user feel weight when lifting a virtual object [51].

Some simulation devices are listed in Table 1 for comparison.

3.2. Software-Based Simulation Method

Device-based weight simulation has the advantages of realistic feedback and good user immersion. However, current commercial virtual reality systems usually rely on handheld controllers for interaction, using the same tool to grab different virtual objects, so a given controller can only provide the same tactile feedback for every virtual object. Software-based weight simulation methods can compensate for this problem; compared with device-based methods, they have wider applicability and lower cost.

Table 1. Information about weight simulation equipment.

What software-based methods mainly influence is the user's assessment of the weight of the virtual object. Masayuki et al. studied the weight perception process in virtual reality through user studies. When the user lifts an object, speed has a short-term unimodal impact on weight perception; that is, people's weight estimation occurs in the preliminary planning of the motion trajectory. A more detailed study of the lifting process found that the user's weight perception becomes more sensitive during the deceleration phase of the lift, and pointed out that this illusion mechanism has potential applications in virtual reality [52].

By their principle of realization, software-based simulation methods can be divided into two approaches: mismatch and enhancement. Mismatch methods induce an illusion by making the interaction in virtual reality deviate from the user's real movement. Enhancement methods mainly use forms of visual expression to make the user's weight assessment of a virtual object heavier.

3.2.1. Mismatch

Interaction methods in virtual reality often do not match those in the real world. Such methods have pros and cons: they can help users quickly search for and select information in virtual reality and can provide guidance, but they constantly remind the user of being in a virtual environment, causing some damage to the sense of experience. Interaction methods that do not match reality can also be used to simulate the weight of virtual objects. Rietzler et al. found that the distance between the controller in the real world and the virtual hand in virtual reality, i.e., an offset, can help users perceive the weight of virtual objects. When the user grasps an object, the virtual hand does not match the actual hand position, producing an offset; the size of the offset and the movement behavior during grasping can be controlled so that the user perceives the object's weight. Using this mechanism, they had users estimate the weight of virtual objects under different offset conditions [19].
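The offset mechanism described above can be sketched as a simple mapping in which the rendered virtual hand lags below the tracked hand in proportion to the simulated mass. This is an illustrative sketch only; the function name and the offset gain are assumptions, not values from Rietzler et al.:

```python
def virtual_hand_y(real_hand_y: float, mass_kg: float,
                   offset_per_kg: float = 0.02) -> float:
    """Render the virtual hand below the tracked real hand by an
    offset that grows with the simulated mass, so heavier objects
    appear to 'drag' the hand down (pseudo-haptic weight).

    offset_per_kg is a hypothetical tuning gain (metres per kg).
    """
    return real_hand_y - mass_kg * offset_per_kg


# Holding a 3 kg virtual object with the real hand tracked at
# 1.20 m: the virtual hand is rendered at 1.14 m.
y = virtual_hand_y(real_hand_y=1.20, mass_kg=3.0)
```

In practice the offset would typically be applied only while an object is grasped and blended smoothly on release, to limit the break in presence that a sudden hand jump would cause.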

There are many ways to provide simulated tactile feedback, many of which use the principle of proprioceptive mismatch to provide pseudo-haptics. However, the relationship between the strength of pseudo-haptic feedback and the size of the object itself is uncertain; research shows that larger virtual objects need stronger visual pseudo-haptic feedback [53].

A wearable force feedback device can convey the weight of a grasped object in two ways: through the distance between the thumb and index finger in virtual reality when grasping, or through the grasping force applied to the fingers. The Weber fraction was used as an indicator to evaluate the effects of the two methods; although relatively simple, they can provide effective simulated feedback [54].

Virtual reality does not have to be exactly the same as reality; users gradually accept interaction methods native to virtual reality, such as interacting with controllers and rays. The researchers first examined an algorithmically simulated virtual feedback method, for example how visual feedback works when the real hand and the virtual hand are offset, and found that it yields a more realistic experience than simulation without feedback. They then followed participants' suggestions to add further feedback dimensions, such as task prompt sounds; the combined feedback provides better presence and immersion [55].

Strandholt et al. proposed a redirected tool interaction method that exploits the mismatch between virtual reality and the real world to enhance the user's perception of objects. When tools are used in virtual reality, a mismatch between the virtual and real tool is generated to simulate the physical properties of the object being worked on. For example, when hammering in a virtual environment, the virtual hammer is offset from the real hammer, which makes the user believe the hammer has produced a vibration [56].

In addition to force feedback provided by a device itself, there is also a way to provide a simple or realistic interaction interface by measuring the strength of the EMG signal when the user grasps an object. An EMG sensor worn on the forearm monitors the degree of muscle exertion during grasping, and corresponding visual feedback is provided according to the magnitude of the force. The feedback includes the swing amplitude, expression, and movement trajectory of the virtual avatar's hand, or the behavior the corresponding object would show under different forces in real life, and it guides the user to interact with the virtual object using appropriate force [57]. Similarly, another study proposed using a similar EMG acquisition device to compare the force actually applied by the user with the weight of the target virtual object: when the applied force is less than the target object's weight, a force arrow in front of the user indicates that it is below the ideal force; conversely, the arrow indicates that the force exceeds the ideal; and when the force matches, it indicates that the output is just right. The user study had participants lift dumbbells with different target weights and use this feedback to match their force to the virtual object's weight; users achieved an accuracy of 77.71% [17].
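The arrow-style feedback logic described above amounts to comparing the measured force against the target weight within a tolerance band. A minimal sketch of that comparison; the 5% tolerance and the label strings are illustrative assumptions, not values from the cited study:

```python
def force_indicator(applied_force: float, target_weight: float,
                    tolerance: float = 0.05) -> str:
    """Classify the user's estimated lift force (e.g. derived from
    an EMG signal) against the target object weight, mimicking the
    three states of the arrow-style visual feedback."""
    if applied_force < target_weight * (1 - tolerance):
        return "increase force"   # below the ideal force
    if applied_force > target_weight * (1 + tolerance):
        return "decrease force"   # above the ideal force
    return "force matched"        # output is just right


low = force_indicator(40.0, 50.0)    # "increase force"
ok = force_indicator(50.5, 50.0)     # "force matched"
high = force_indicator(60.0, 50.0)   # "decrease force"
```

A real system would additionally need a calibration step mapping raw EMG amplitude to an estimated force before such a comparison is meaningful.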

To make better use of current consumer-level virtual reality systems, Rietzler et al. proposed a multimodal pseudo-haptic feedback method using existing hardware and software [19]. A user study evaluated the effect of this method, and the results showed that visual and tactile cues alone can convey kinesthetic feedback, and that the user’s enjoyment and presence increase significantly.

Pseudo-haptic feedback refers to using visual feedback and the properties of human visuo-haptic perception to simulate touch in a virtual environment, stimulating human perception through multiple sensory channels to produce illusions. It can be used to simulate different properties of objects, such as the elasticity of virtual springs, the texture of images, or the mass of virtual objects, and has great potential for producing varied user experiences [58] [59].

Changing the control/display (C/D) ratio can also help users perceive the weight of virtual objects. The C/D ratio is the ratio of the movement amplitude of the user’s real hand to that of the corresponding virtual hand. When the virtual hand follows the real hand exactly, the C/D ratio is 1; in practice, the ratio applied in a scene need not be 1. In a user study by Dominjon et al., controlling the C/D ratio was found to significantly affect the user’s estimation of the physical properties of the manipulated object: with a C/D ratio less than 1, users tend to perceive a lighter weight, and conversely a heavier one. Applying different C/D ratios also allows users to clearly distinguish the weights of different virtual objects. This finding can help VR developers and researchers enhance users’ weight perception in virtual reality [60].
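The mapping above can be sketched as follows. This is a minimal illustration under the stated definition of the C/D ratio; the function and variable names are assumptions.

```python
def virtual_hand_displacement(real_displacement: float, cd_ratio: float) -> float:
    """Map real hand displacement to virtual hand displacement.

    cd_ratio is control/display: real movement amplitude divided by
    virtual movement amplitude. A ratio below 1 amplifies the virtual
    motion (the object tends to feel lighter); a ratio above 1
    attenuates it (the object tends to feel heavier).
    """
    if cd_ratio <= 0:
        raise ValueError("C/D ratio must be positive")
    return real_displacement / cd_ratio
```

For example, with a C/D ratio of 0.5, a 10 cm real hand movement is rendered as a 20 cm virtual movement, which biases the user toward judging the manipulated object as light.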

An example of controlling the C/D ratio is the method proposed by Jáuregui et al., which uses virtual avatars to help users assess weight. The user performs weightlifting tasks in virtual reality while a motion capture device records the user’s movements in real time, and a virtual avatar driven by the motion capture data is displayed. By changing the control ratio, the avatar’s movements are made to deviate from the user’s actual movements; combined with the avatar’s facial expressions, this allows users to distinguish and perceive the weight of the virtual barbell [16].

Visual delay in the display system can affect the user’s perception of force and weight in a virtual environment. In one study, a force-feedback robot provided force feedback while the synchronization between vision and force feedback was deliberately broken in the virtual display. The results showed that users perceive a heavier weight during lifting tasks with visual delay than without. Visual delay makes the interaction feel mismatched, and when the delay is large it increases the user’s assessment of the task’s difficulty, producing the illusion that the object is heavier [61].

The previous section introduced device-based simulation of the inertia or center-of-gravity changes that occur while a virtual object moves. Simulating tactile sensation algorithmically is harder, and simulating changes in inertia and center of gravity is harder still. Yu et al. proposed a pseudo-haptic feedback technique that can display the weight and mass distribution of virtual objects. The technique changes the C/D ratio while the virtual object is rotated, altering the mapping between motion and force cues and thereby producing a sense of weight without changing any mass-related attributes of the object. They also proposed two methods for simulating mass distribution: one manipulates the pivot point of rotation, and the other adjusts the rotational motion according to the real-time dynamics of the object; the latter was found to be more effective [62].
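The rotational variant of the C/D-ratio manipulation can be sketched as follows. This is a minimal illustration of the general idea, not the authors’ implementation; the linear scaling rule and all names are assumptions.

```python
def displayed_rotation(real_rotation_deg: float, virtual_mass_kg: float,
                       reference_mass_kg: float = 1.0) -> float:
    """Attenuate the displayed rotation of a virtual object with its mass.

    Heavier objects rotate less on screen for the same hand rotation,
    which the visuo-haptic system tends to interpret as greater
    rotational inertia, i.e. a heavier object.
    """
    # Assumed rule: the C/D ratio grows linearly with mass above a
    # reference mass, and never amplifies motion for light objects.
    cd_ratio = max(virtual_mass_kg / reference_mass_kg, 1.0)
    return real_rotation_deg / cd_ratio
```

A mass-distribution effect could then be layered on top, for instance by applying different attenuation around different rotation axes or by shifting the pivot point, as in the two methods described above.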

3.2.2. Enhance

In addition to simulating weight through interaction methods, the attributes of the virtual object itself also affect the user’s weight perception. Maehigashi et al. set up three weight-illusion experiments to explore the influence of an object’s own attributes, namely size, brightness, and material, on weight perception. The corresponding rules in reality are: smaller objects look lighter but feel heavier than larger ones, bright objects look lighter but feel heavier than dark ones, and objects made of lighter-looking materials feel heavier than objects made of heavier-looking materials. Their experimental results show that the weight illusions caused by brightness and material in VR are opposite to those in the real world, while the illusion caused by size is the same as in the real world [18]. Changing brightness to alter the user’s perception of an object’s weight can be used not only in virtual reality but also in augmented reality systems [63].

Using algorithms to simulate virtual objects can not only give users a perception of weight when interacting with objects, but also intervene in real life through weight perception, so virtual reality simulation of object weight has a wide range of application scenarios. For example, users observing their own virtual avatars in virtual reality can enhance their awareness of their own body weight, which can help with weight control [64].

In mixed reality, some researchers have proposed rendering different virtual object shapes onto real objects. Even if the weight of the real object does not change, the rendered virtual object can give the user the illusion of a change in the center of gravity [65].

4. Evaluation of Virtual Weight Feedback Method

When researching virtual weight feedback methods, researchers often need to evaluate the effect of the proposed feedback. The evaluation covers the user’s subjective experience as well as objective criteria. For subjective evaluation, user experience surveys in virtual reality mainly consider the following factors: sickness, immersion, presence, and enjoyment.

In VR, scenes or interactions can induce sickness in the user, and the degree of sickness affects the user’s experience. Sickness can be measured with questionnaires; subjective scales include the SSQ [66] and the VRSQ [67]. Immersion, the sense of “being there”, is another dimension of the virtual reality experience; it reflects the user’s involvement in the virtual world and can also be measured with questionnaires. Presence, similar to immersion, likewise measures the user’s involvement in virtual reality; it refers to the sense of being together in a virtual world [68] and can be evaluated with the Witmer-Singer presence questionnaire [69]. Enjoyment measures how much the user enjoys the experience: the higher the enjoyment, the more effectively the scene or interaction method attracts users and the better the user experience. Presence, enjoyment, and sickness can all be assessed with the E2I questionnaire [70]. These measures are subjective feelings related to the physical interaction equipment and the user’s prior experience, and they cannot be defined objectively. Both the user’s awareness of the interaction devices and their experience in VR may affect the sense of presence [71] [72] [73] [74]. In addition, the NASA TLX questionnaire is often used to evaluate users’ mental workload in virtual reality [75].

Some studies also used custom questionnaires or scoring methods to rate scenarios with simulated feedback. The questionnaires mainly covered the comfort of the scene and whether the interaction felt natural, with significance tests performed against scenes without simulated feedback. To further verify the effectiveness of a simulation method, user feedback can also be collected after the experiment. In particular, to better evaluate a proposed simulation feedback method, the method is often combined with an application scenario: users perform tasks in the virtual world, and the effect of the simulated feedback is evaluated objectively through task performance. In some interactive tasks, the quality of the interaction mode can also be evaluated with an embodiment questionnaire [76]. How easily users fatigue while using the virtual system can also serve as an evaluation criterion for the quality of an interaction method: a good interaction method should reduce the user’s mental burden as much as possible and should not lead quickly to fatigue [77].

The strength of virtual object weight simulation can be evaluated using the Just-Noticeable Difference (JND) [78] and the Weber fraction [79]. The Weber fraction is the ratio between the difference threshold and the stimulus intensity [54]. For simulated weight, the Weber fraction is computed as shown in Equation (1):

WF = JND / m_r (1)

where JND is the just-noticeable difference and m_r is the reference weight.
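Equation (1) is straightforward to compute once the JND has been estimated (e.g. from a psychophysical staircase procedure). A minimal sketch, with illustrative names:

```python
def weber_fraction(jnd: float, reference_weight: float) -> float:
    """Weber fraction WF = JND / m_r, as in Equation (1).

    jnd: smallest weight difference the user can reliably detect,
         in the same units as reference_weight.
    reference_weight: the reference stimulus weight m_r.
    """
    if reference_weight <= 0:
        raise ValueError("reference weight must be positive")
    return jnd / reference_weight
```

For example, if users can just distinguish a 0.1 kg difference against a 1 kg reference, the Weber fraction is 0.1; a smaller fraction indicates a more sensitive (finer-grained) weight simulation.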

5. Challenge and Development in Future

To fill the gaps in simulating haptic weight sensations, future research directions and technical challenges are discussed below.

5.1. Open Questions on Haptic Weight Simulation

To design haptic weight simulation methods, more studies are needed to understand perception characteristics when haptic weight stimuli are perceived simultaneously with other stimuli.

Detection and discrimination thresholds for haptic weight stimuli have attracted attention in many studies, such as the investigation of the effect of the C/D ratio on the perception of mass [60]. However, it is still unclear how these thresholds change when a user simultaneously perceives other feedback, and it is possible that modulating weight stimuli together with other simulations affects immersion or presence. In fact, many proposed methods, such as Grabity, involve vibrotactile feedback and stiffness force feedback at the same time. More rigorous studies are needed to reveal the coupling effects among modalities. Given the variety of haptic stimuli to be rendered during interaction, the relationships among their combinations need to be verified.

5.2. Limitation of Interaction Mode

As mentioned above, most simulation methods focus on specific interactions, such as grasping and lifting. To simulate weight during fine manipulation between the bare hand and virtual objects, more interaction modes need to be supported. Even for grasping alone there is a rich family of gestures, and not all of them work with a single simulation method; different gestures can produce different weight sensations. The sensing system of a simulation therefore needs to be accurate enough to capture changes in the interaction mode.

Another issue is the consistency between visual feedback and real hand motion. Supporting multiple interaction modes means that accurately captured hand motion may also need to be displayed as visual feedback during interaction, and the robustness of hand motion tracking can affect this.

5.3. Structural Design and Fabrication

Device-based simulation methods have inherent advantages in producing more realistic feedback during interaction, but are held back by the need for external peripherals. To provide more realistic feedback while interfering less with the user experience, lighter and more flexible devices need to be designed. For example, holding an irregular object produces varying force sensations across the palm, and high fidelity is needed in such situations; such devices require careful design. For device-based methods, the current mainstream solutions are handheld, portable, or wearable devices, and whether they simulate the center of gravity or inertia, most rely on mechanical structures. Using mechanical structures for simulation naturally increases device size. Current simulation devices still have shortcomings: their versatility is poor, as the devices cannot be interchanged across some tasks; and the range of simulated forces is limited, so handheld and wearable devices cannot simulate heavy virtual objects, such as a 10 kg barbell, due to their size constraints.

5.4. Extend Applications

In addition, the methods mentioned above are proposed for individual users, while multi-user manipulation has rarely been considered. Collaborative environments have many potential applications, such as simulated training and remote medical assistance. The simulation effect in collaborative virtual reality needs to be explored, since there is not only the first-person perspective but also visual feedback from other users.

6. Conclusion

In this article, we surveyed weight simulation methods in virtual reality. Through this investigation, we found that each method has its corresponding application scenarios and conditions; no single technology can make users completely unable to distinguish the virtual world from reality. The simulation techniques vary in implementation approach and principle, but the evaluation criteria are largely inseparable from subjective experiences such as enjoyment, presence, and immersion. Most simulation methods are evaluated through subjective questionnaires, for example by letting users compare conditions in an application scenario to judge whether a method is effective; a small number of methods can be evaluated through task performance. Overall, objective evaluation factors are still scarce, which makes it difficult to compare one simulation method against another. Among the methods surveyed, many have specific application scenarios and have been validated in user studies showing good performance in the target scenario. The diversity of weight simulation methods also shows how other forms of feedback in virtual reality influence users’ weight judgments of virtual objects. How to produce a weight experience by influencing other perceptions is also worth exploring.

Cite this paper: Ye, X. (2021) A Survey on Simulation for Weight Perception in Virtual Reality. Journal of Computer and Communications, 9, 1-24. doi: 10.4236/jcc.2021.99001.

[1]   Burdea, G.C. (1999) Haptic Feedback for Virtual Reality. Proceedings of International Workshop on Virtual Prototyping, Laval, May 1999, 17-29.

[2]   Kerruish, E. (2019) Arranging Sensations: Smell and Taste in Augmented and Virtual Reality. The Senses and Society, 14, 31-45.

[3]   Regan, M. and Pose, R. (1994) Priority Rendering with a Virtual Reality Address Recalculation Pipeline. Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques, Orlando, 24-29 July 1994, 155-162.

[4]   Patney, A., Salvi, M., Kim, J., Kaplanyan, A., Wyman, C., Benty, N., Luebke, D. and Lefohn, A. (2016) Towards Foveated Rendering for Gaze-Tracked Virtual Reality. ACM Transactions on Graphics, 35, Article No. 179.

[5]   Benko, H., Holz, C., Sinclair, M. and Ofek, E. (2016) Normaltouch and Texturetouch: High-Fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers. Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, 16-19 October 2016, 717-728.

[6]   Albert, R., Patney, A., Luebke, D. and Kim, J. (2017) Latency Requirements for Foveated Rendering in Virtual Reality. ACM Transactions on Applied Perception, 14, Article No. 25.

[7]   Benoit, M., Guerchouche, R., Petit, P.-D., Chapoulie, E., Manera, V., Chaurasia, G., Drettakis, G. and Robert, P. (2015) Is It Possible to Use Highly Realistic Virtual Reality in the Elderly? A Feasibility Study with Image-Based Rendering. Neuropsychiatric Disease and Treatment, 11, 557-563.

[8]   Overbeck, R.S., Erickson, D., Evangelakos, D., Pharr, M. and Debevec, P. (2018) A System for Acquiring, Processing, and Rendering Panoramic Light Field Stills for Virtual Reality. ACM Transactions on Graphics, 37, Article No.197.

[9]   Tsingos, N., Gallo, E. and Drettakis, G. (2004) Perceptual Audio Rendering of Complex Virtual Environments. ACM Transactions on Graphics, 23, 249-258.

[10]   Nordahl, R. and Nilsson, N.C. (2014) The Sound of Being There: Presence and Interactive Audio in Immersive Virtual Reality. In: Collins, K., Kapralos, B. and Tessler, H., The Oxford Handbook of Interactive Audio, Oxford University Press, Oxford, 213-233.

[11]   Ranasinghe, N., Jain, P., Karwita, S., Tolley, D. and Do, E.Y.-L. (2017) Ambiotherm: Enhancing Sense of Presence in Virtual Reality by Simulating Real-World Environmental Conditions. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, 6-11 May 2017, 1731-1742.

[12]   Ernst, M. and Banks, M. (2002) Humans Integrate Visual and Haptic Information in a Statistically Optimal Fashion. Nature, 415, 429-433.

[13]   Abtahi, P., Landry, B., Yang, J., Pavone, M., Follmer, S. and Landay, J.A. (2019) Beyond the Force: Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, 4-9 May 2019, Paper No. 359.

[14]   Bouzit, M., Burdea, G., Popescu, G. and Boian, R. (2002) The Rutgers Master II-New Design Force-Feedback Glove. IEEE/ASME Transactions on Mechatronics, 7, 256-263.

[15]   Shigeyama, J., Hashimoto, T., Yoshida, S., Narumi, T., Tanikawa, T. and Hirose, M. (2019) Transcalibur: A Weight Shifting Virtual Reality Controller for 2D Shape Rendering Based on Computational Perception Model. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, 4-9 May 2019, Paper No. 11.

[16]   Jáuregui, D.A.G., Argelaguet, F., Olivier, A.-H., Marchal, M., Multon, F. and Lecuyer, A. (2014) Toward “Pseudo-Haptic Avatars”: Modifying the Visual Animation of Self-Avatar Can Simulate the Perception of Weight Lifting. IEEE Transactions on Visualization and Computer Graphics, 20, 654-661.

[17]   Lee, J., Kim, J.-I. and Kim, H. (2019) Force Arrow 2: A Novel Pseudo-Haptic Interface for Weight Perception in Lifting Virtual Objects. 2019 IEEE International Conference on Big Data and Smart Computing (BigComp), Kyoto, 27 February-2 March 2019, 1-8.

[18]   Maehigashi, A., Sasada, A., Matsumuro, M., Shibata, F., Kimura, A. and Niida, S. (2021) Virtual Weight Illusion: Weight Perception of Virtual Objects Using Weight Illusions. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, May 2021, Article No. 344.

[19]   Rietzler, M., Geiselhart, F., Gugenheimer, J. and Rukzio, E. (2018) Breaking the Tracking: Enabling Weight Perception Using Perceivable Tracking Offsets. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, 21-26 April 2018, Article No. 128.

[20]   Tzafestas, C.S. (2003) Whole-Hand Kinesthetic Feedback and Haptic Perception in Dextrous Virtual Manipulation. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 33, 100-113.

[21]   Bouzbib, E., Bailly, G., Haliyo, S. and Frey P. (2021) “Can I Touch This?”: Survey of Virtual Reality Interactions via Haptic Solutions. arXiv Preprint arXiv:2101.11278.

[22]   Massie, T.H. and Salisbury, J.K. (1994) The Phantom Haptic Interface: A Device for Probing Virtual Objects. Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, November 1994, 295-300.

[23]   Aiple, M. and Schiele, A. (2013) Pushing the Limits of the Cyber-GraspTM for Haptic Rendering. 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, 6-10 May 2013, 3541-3546.

[24]   Gu, X., Zhang, Y., Sun, W., Bian, Y., Zhou, D. and Kristensson, P.O. (2016) Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, 7-12 May 2016, 1991-1995.

[25]   Murayama, J., Bougrila, L., Luo, Y., Akahane, K., Hasegawa, S., Hirsbrunner, B. and Sato, M. (2004) SPIDAR G&G: A Two-Handed Haptic Interface for Bimanual VR Interaction. Proceedings of Euro-Haptics, Munich, 5-7 June 2004, 138-146.

[26]   Endo, T., Kawasaki, H., Mouri, T., Ishigure, Y., Shimomura, H., Matsumura, M. and Koketsu, K. (2010) Five-Fingered Haptic Interface Robot: HIRO III. IEEE Transactions on Haptics, 4, 14-27.

[27]   Niiyama, R., Yao, L. and Ishii, H. (2014) Weight and Volume Changing Device with Liquid Metal Transfer. Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction, Munich, February 2014, 49-52.

[28]   Giachritsis, C., Barrio, J., Ferre, M., Wing, A. and Ortego, J. (2009) Evaluation of Weight Perception during Unimanual and Bimanual Manipulation of Virtual Objects. World Haptics 2009 3rd Joint Euro-Haptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, 18-20 March 2009, 629-634.

[29]   Choi, I., Culbertson, H., Miller, M.R., Olwal, A. and Follmer, S. (2017) Grabity: A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, Québec, 22-25 October 2017, 119-130.

[30]   Kovacs, R., Ofek, E., Gonzalez Franco, M., Siu, A.F., Marwecki, S., Holz, C. and Sinclair, M. (2020) Haptic Pivot: On-Demand Handhelds in VR. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual, 20-23 October 2020, 1046-1059.

[31]   Choi, I., Ofek, E., Benko, H., Sinclair, M. and Holz, C. (2018) Claw: A Multifunctional Handheld Haptic Controller for Grasping, Touching, and Triggering in Virtual Reality. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, 21-26 April 2018, Article No. 654.

[32]   Heo, S., Chung, C., Lee, G. and Wigdor, D. (2018) Thor’s Hammer: An Ungrounded Force Feedback Device Utilizing Propeller-Induced Propulsive Force. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, 21-26 April 2018, Article No. D110.

[33]   Cheng, C.-H., Chang, C.-C., Chen, Y.-H., Lin, Y.-L., Huang, J.-Y., Han, P.-H., Ko, J.-C. and Lee, L.-C. (2018) GravityCup: A Liquid-Based Haptics for Simulating Dynamic Weight in Virtual Reality. Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, Tokyo, November 2018, Article No. 51.

[34]   Minamizawa, K., Fukamachi, S., Kajimoto, H., Kawakami, N. and Tachi, S. (2007) Gravity Grabber: Wearable Haptic Display to Present Virtual Mass Sensation. ACM SIGGRAPH 2007 Emerging Technologies, San Diego, August 2007, 8-es.

[35]   Schorr, S.B. and Okamura, A.M. (2017) Fingertip Tactile Devices for Virtual Object Manipulation and Exploration. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, 6-11 May 2017, 3115-3119.

[36]   Tsai, H.-R., Hung, C.-W., Wu, T.-C. and Chen, B.-Y. (2020) Elastoscillation: 3D Multilevel Force Feedback for Damped Oscillation on VR Controllers. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, 25-30 April 2020, 1-12.

[37]   Ryu, N., Lee, W., Kim, M.J. and Bianchi, A. (2020) Elastick: A Handheld Variable Stiffness Display for Rendering Dynamic Haptic Response of Flexible Object. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual, 20-23 October 2020, 1035-1045.

[38]   Onishi, Y., Takashima, K., Fujita, K. and Kitamura, Y. (2021) BouncyScreen: Physical Enhancement of Pseudo-Force Feedback. 2021 IEEE Virtual Reality and 3D User Interfaces (VR), Lisboa, 27 March-1 April 2021, 363-372.

[39]   Kato, G., Kuroda, Y., Nisky, I., Kiyokawa, K. and Takemura, H. (2016) Design and Psychophysical Evaluation of the HapSticks: A Novel Non-Grounded Mechanism for Presenting Tool-Mediated Vertical Forces. IEEE Transactions on Haptics, 10, 338-349.

[40]   Zenner, A. and Krüger, A. (2019) Drag: On: A Virtual Reality Controller Providing Haptic Feedback Based on Drag and Weight Shift. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, 4-9 May 2019, Paper No. 211.

[41]   Sagheb, S., Liu, F.W., Bahremand, A., Kidane, A. and Li Kam Wa, R. (2019) SWISH: A Shifting-Weight Interface of Simulated Hydrodynamics for Haptic Perception of Virtual Fluid Vessels. Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, 20-23 October 2019, 751-761.

[42]   Zenner, A., Ullmann, K. and Krüger, A. (2021) Combining Dynamic Passive Haptics and Haptic Retargeting for Enhanced Haptic Feedback in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics, 27, 2627-2637.

[43]   Zenner, A. and Krüger, A. (2020) Shifting & Warping: A Case for the Combined Use of Dynamic Passive Haptics and Haptic Retargeting in VR. 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual, 20-23 October 2020, 1-3.

[44]   Mizuno, T., Maeda, J. and Kume, Y. (2013) Weight Sensation Affected by Vibrotactile Stimulation with a Handheld Vision-Tactile-Force Display Device. 2013 10th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Krabi, 15-17 May 2013, 1-6.

[45]   Fujinawa, E., Yoshida, S., Koyama, Y., Narumi, T., Tanikawa, T. and Hirose, M. (2017) Computational Design of Hand-Held VR Controllers Using Haptic Shape Illusion. Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, Gothenburg, 8-10 November 2017, Article No. 28.

[46]   Trinitatova, D. and Tsetserukou, D. (2019) DeltaTouch: A 3D Haptic Display for Delivering Multimodal Tactile Stimuli at the Palm. 2019 IEEE World Haptics Conference (WHC), Tokyo, 9-12 July 2019, 73-78.

[47]   Je, S., Kim, M.J., Lee, W., Lee, B., Yang, X.-D., Lopes, P. and Bianchi, A. (2019) Aero-Plane: A Handheld Force-Feedback Device That Renders Weight Motion Illusion on a Virtual 2D Plane. Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans, 20-23 October 2019, 763-775.

[48]   Yamamoto, T. and Hirota, K. (2015) Recognition of Weight through Shaking Interaction. 2015 IEEE World Haptics Conference (WHC), Evanston, 22-26 June 2015, 451-456.

[49]   Chossat, J.-B., Chen, D.K., Park, Y.-L. and Shull, P.B. (2019) Soft Wearable Skin-Stretch Device for Haptic Feedback Using Twisted and Coiled Polymer Actuators. IEEE Transactions on Haptics, 12, 521-532.

[50]   Herbst, I. and Stark, J. (2005) Comparing Force Magnitudes by Means of Vibro-Tactile, Auditory, and Visual Feedback. IEEE International Workshop on Haptic Audio Visual Environments and their Applications, Ottawa, 1 October 2005, 5 p.

[51]   Lopes, P., You, S., Cheng, L.-P., Marwecki, S. and Baudisch, P. (2017) Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, 6-11 May 2017, 1471-1482.

[52]   Hara, M., Higuchi, T., Yamagishi, T., Ashitaka, N., Huang, J. and Yabuta, T. (2007) Analysis of Human Weight Perception for Sudden Weight Changes during Lifting Task Using a Force Display Device. Proceedings of 2007 IEEE International Conference on Robotics and Automation, Rome, 10-14 April 2007, 1808-1813.

[53]   Kim, J. and Lee, J. (2021) The Effect of the Virtual Object Size on Weight Perception Augmented with Pseudo-Haptic Feedback. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, 27 March-1 April 2021, 575-576.

[54]   Hummel, J., Dodiya, J., Wolff, R., Gerndt, A. and Kuhlen, T. (2013) An Evaluation of Two Simple Methods for Representing Heaviness in Immersive Virtual Environments. 2013 IEEE Symposium on 3D User Interfaces (3DUI), Orlando, 16-17 March 2013, 87-94.

[55]   Rietzler, M., Geiselhart, F., Frommel, J. and Rukzio, E. (2018) Conveying the Perception of Kinesthetic Feedback in Virtual Reality Using State-of-the-Art Hardware. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, 21-26 April 2018, Article No. 460.

[56]   Strandholt, P.L., Dogaru, O.A., Nilsson, N.C., Nordahl, R. and Serafin, S. (2020) Knock on Wood: Combining Redirected Touching and Physical Props for Tool-Based Interaction in Virtual Reality. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, 25-30 April 2020, 1-13.

[57]   Kim, M., Kim, J., Jeong, K. and Kim, C. (2020) Grasping VR: Presence of Pseudo-Haptic Interface Based Portable Hand Grip System in Immersive Virtual Reality. International Journal of Human-Computer Interaction, 36, 685-698.

[58]   Nakakoji, K., Yamamoto, Y. and Koike, Y. (2020) Toward Principles for Visual Interaction Design for Communicating Weight by Using Pseudo-Haptic Feedback. Create10-The Interaction Design Conference, Edinburgh, 30 June-2 July 2010, 1-6.

[59]   Lécuyer, A. (2009) Simulating Haptic Feedback Using Vision: A Survey of Research and Applications of Pseudo-Haptic Feedback. Presence: Teleoperators and Virtual Environments, 18, 39-53.

[60]   Dominjon, L., Lécuyer, A., Burkhardt, J.-M., Richard, P. and Richir, S. (2005) Influence of Control/Display Ratio on the Perception of Mass of Manipulated Objects in Virtual Environments. IEEE Proceedings, VR 2005. Virtual Reality, Bonn, 12-16 March 2005, 19-25.

[61]   van Polanen, V., Tibold, R., Nuruki, A. and Davare, M. (2019) Visual Delay Affects Force Scaling and Weight Perception during Object Lifting in Virtual Reality. Journal of Neurophysiology, 121, 1398-1409.

[62]   Yu, R. and Bowman, D.A. (2020) Pseudo-Haptic Display of Mass and Mass Distribution during Object Rotation in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics, 26, 2094-2103.

[63]   Ban, Y., Narumi, T., Fujii, T., Sakurai, S., Imura, J., Tanikawa, T. and Hirose, M. (2013) Augmented Endurance: Controlling Fatigue While Handling Objects by Affecting Weight Perception Using Augmented Reality. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, 27 April-2 May 2013, 69-78.

[64]   Wolf, E., Döllinger, N., Mal, D., Wienrich, C., Botsch, M. and Latoschik, M.E. (2020) Body Weight Perception of Females Using Photorealistic Avatars in Virtual and Augmented Reality. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Porto de Galinhas, 9-13 November 2020, 462-473.

[65]   Hashiguchi, S., Sano, Y., Shibata, F. and Kimura, A. (2014) RV Dynamics Illusion: Psychophysical Influence on Sense of Weight by Mixed-Reality Visual Stimulation of Moving Objects. International Conference on Virtual, Augmented and Mixed Reality, Heraklion, 22-27 June 2014, 55-64.

[66]   Balk, S.A., Bertola, D.B. and Inman, V.W. (2013) Simulator Sickness Questionnaire: Twenty Years Later. Proceedings of the 7th International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design, Bolton, 17-20 June 2013, 257-263.

[67]   Kim, H.K., Park, J., Choi, Y. and Choe, M. (2018) Virtual Reality Sickness Questionnaire (VRSQ): Motion Sickness Measurement Index in a Virtual Reality Environment. Applied Ergonomics, 69, 66-73.

[68]   Schuemie, M.J., Van Der Straaten, P., Krijn, M. and Van Der Mast, C.A. (2001) Research on Presence in Virtual Reality: A Survey. CyberPsychology & Behavior, 4, 183-201.

[69]   Witmer, B.G. and Singer, M.J. (1998) Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence: Teleoperators and Virtual Environments, 7, 225-240.

[70]   Lin, J.-W., Duh, H.B.-L., Parker, D.E., Abi-Rached, H. and Furness, T.A. (2002) Effects of Field of View on Presence, Enjoyment, Memory, and Simulator Sickness in a Virtual Environment. Proceedings IEEE Virtual Reality 2002, Orlando, 24-28 March 2002, 164-171.

[71]   Fontaine, G. (1992) The Experience of a Sense of Presence in Intercultural and International Encounters. Presence: Teleoperators & Virtual Environments, 1, 482-490.

[72]   Lee, W. and Lim, Y.-K. (2010) Thermo-Message: Exploring the Potential of Heat as a Modality of Peripheral Expression. CHI’10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, 10-15 April 2010, 4231-4236.

[73]   Held, R.M. and Durlach, N.I. (1992) Telepresence. Presence: Teleoperators & Virtual Environments, 1, 109-112.

[74]   Sheridan, T.B. (1992) Musings on Telepresence and Virtual Presence. Presence: Teleoperators & Virtual Environments, 1, 120-126.

[75]   Hart, S.G. and Staveland, L.E. (1988) Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. Advances in Psychology, 52, 139-183.

[76]   Gonzalez-Franco, M. and Peck, T.C. (2018) Avatar Embodiment. Towards a Standardized Questionnaire. Frontiers in Robotics and AI, 5, Article No. 74.

[77]   Lee, K.A., Hicks, G. and Nino-Murcia, G. (1991) Validity and Reliability of a Scale to Assess Fatigue. Psychiatry Research, 36, 291-298.

[78]   Lin, J.Y., Jin, L., Hu, S., Katsavounidis, I., Li, Z., Aaron, A. and Kuo, C.-C.J. (2015) Experimental Design and Analysis of JND Test on Coded Image/Video. Applications of Digital Image Processing, 38, Article No. 95990Z.

[79]   Ross, H.E. and Brodie, E.E. (1987) Weber Fractions for Weight and Mass as a Function of Stimulus Intensity. The Quarterly Journal of Experimental Psychology, 39, 77-88.