Introduction

The sense of bodily self relies on an updated body representation that allows us to feel that our body parts belong to us1,2. This feeling is referred to as body ownership (BO) and is critical to the integrity of the self and to guiding actions with an appropriate sense of agency3. Efficient and accurate multisensory integration is crucial to building a sense of ownership of one’s body1,4. Indeed, frontoparietal-cerebellar circuits integrate visual, tactile, and proprioceptive signals according to temporal and spatial congruence principles5,6. Disrupting the congruency of multisensory input, through real-time conflicts between visual and tactile or motor signals7,8 or with pre-recorded videos5, can generate perceptual illusions that alter the sense of bodily self, promoting disownership feelings2. The experience of disownership can be characterized by feelings of unfamiliarity, unreality, uselessness, and alienation of body parts, as well as by delusional beliefs.

Since BO results from the correct localization of bodily information on a multimodal, spatial body map9, it seems closely linked to the build-up of the body schema (BS). BS is the dynamic sensorimotor representation of the body in space, continuously updated by incoming sensory information and ready to interact with the environment appropriately10,11. Specifically, the ability to determine one’s body position in space relies on afferent and efferent sensory information, along with stored information about body topography and body metrics (i.e., shape and size)11,12,13,14.

The close relationship between BO and BS clearly emerges in the Rubber Hand Illusion15, in which viewing a rubber hand being stroked in synchrony with tactile stimulation of the hidden real hand results in a feeling of ownership toward the fake hand15. It is plausible that ownership feelings toward the fake hand arise from a recalibration of the BS16,17 when the illusion is induced in accordance with the body’s anatomical constraints18,19.

Since representing body parts in space is essential for reaching and grasping objects in the external environment, BS is closely linked to space representation. The individual’s arm length influences the estimation of the reachable space around the body (peripersonal space, PPS)20. Moreover, changes in body representation also seem to modulate space coding, notably in cases of tool-use-dependent plasticity10,21 and in the presence of alterations in sensorimotor information from specific body parts, as described in deafferented patients22,23,24. Also, changing the perceived size of the body through bodily illusions modulates the PPS25.

Thus, it has been proposed that the PPS representation relies on intrinsic body metrics26. PPS is conceived as a sensorimotor interface for hand-object interaction27,28, coded in a body-part-centered reference frame29 by frontoparietal bimodal neurons that integrate tactile stimuli on the body with nearby visual or acoustic stimuli30. Moreover, multisensory integration within the PPS seems to be a mechanism underlying BO6. Indeed, BO feelings mainly emerge within the PPS, and there is an overlap between the multisensory areas involved in the PPS and those involved in the subjective sense of ownership31. Although it is now accepted that a visuo–tactile conflict can alter feelings of ownership toward one’s hand, it is still unclear whether such a multisensory conflict may also impact deeper aspects of the sensorimotor representation of one’s own body in space (i.e., BS) and of the space around the body in terms of action potentiality (i.e., PPS).

This work aimed to examine whether reducing ownership towards one’s own arm, induced through a multisensory conflict, affects the relationship between body and space. Experiment 1 investigated whether a visuo–tactile conflict affects the dynamic sensorimotor representation of the body in space (i.e., BS). We employed the Forearm Bisection Task to assess changes in BS32,33, as it requires participants to integrate incoming sensory information (e.g., somatosensory and proprioceptive) with stored information regarding body metrics and topography.

Based on the hypothesis that the sense of ownership derives from the localization of bodily signals on the multimodal map of the body9, we expected that a multisensory conflict would induce disownership feelings over one’s hand and might also affect the representation of one’s own body. Specifically, we predicted a change in the representation of one’s body in space, as the multisensory mismatch would make the body boundaries less defined. Experiment 2 tested the effect of the same multisensory conflict on the PPS representation. A Reachability Judgment Task evaluated the extension of the reaching space34,35. We expected that the asynchronous condition would induce feelings of disownership over one’s hand, also influencing space representation. In particular, the multisensory conflict was expected to affect the boundaries of one’s body and its action potentialities.

Experiment 1

Methods

Participants. Twenty-six healthy participants (16 female; mean age = 30.11 years, SD = 9.16) took part in Experiment 1. All subjects were right-handed, as assessed by the Edinburgh Handedness Inventory36, and had no history of neurological or psychiatric disease. All participants gave written informed consent to participate in the study, which was approved by the Ethical Committee of the University of Milano-Bicocca (Protocol Number: RM-2020–337) and conducted in accordance with the standards of the Helsinki Declaration37. All methods were performed following the relevant guidelines and regulations set by the Ethical Committee. Informed consent was obtained to publish the images in an online open-access publication.

Procedure and Experimental Design. Blindfolded participants were comfortably seated with their arms extended radially on a table in front of them (Fig. 1a). First, they were asked to indicate the perceived midpoint of their forearm, considering the elbow (olecranon) and the tip of the middle finger as the extremities (Fig. 1b), as a measure of body metric representation (i.e., Forearm Bisection Task)32,33. Bisections were executed for both forearms in a counterbalanced order across participants, performing ballistic movements with the index finger of the contralateral hand; no corrections to the estimate were allowed. Each trial began with the contralateral index finger placed 30 cm away from the participant’s midsagittal plane. The perceived midpoint was measured with a flexible ruler placed a few millimeters from the participant’s arm, with the starting point (0 cm) at the tip of the middle finger. Participants performed ten trials for each arm. The Forearm Bisection Task was performed for both forearms before and after each of the two visuo–tactile stimulation conditions (80 trials in total).

Figure 1
figure 1

Procedure of Experiment 1. (a) Participants’ posture. (b) Experimental design: before and after the visuo–tactile stimulation, a Forearm Bisection Task was performed for both forearms. During the visuo–tactile stimulation, participants observed their left hand through the HMD while it was stroked by a paintbrush, in a synchronous or asynchronous condition. At the end of each condition, the Embodiment Scale was filled in. The order of the visuo–tactile conditions was counterbalanced across participants.

During visuo–tactile stimulation, participants wore a head-mounted display (HMD, Samsung Gear VR) through which they observed a video captured by a 180° webcam (60 frames per second, 1280 × 720 resolution; ELP, AilipuTechnology CO., Ltd., 2.7 inches, 1080 pixels) placed on the HMD. Participants were exposed to a real-time video (i.e., synchronous condition) or a video delayed by 850 ms (i.e., asynchronous condition) of their left hand being stroked by a paintbrush at a frequency of 1 Hz, in different positions and directions, for 60 s (Fig. 1b). Thus, during the synchronous condition (i.e., the control condition), the tactile stimulation was congruent with the visual feedback seen in the HMD. In contrast, in the asynchronous condition (i.e., the experimental condition), the visual stimulation was delayed by 850 ms relative to the tactile sensation felt on the hand. During stimulation, participants were asked to avoid head and hand movements. The order of the two visuo–tactile conditions was counterbalanced between participants.

After each condition, we also administered the adapted 7-point Embodiment Scale38 (see Table 1) to assess embodiment feelings towards the left hand seen through the HMD, disembodiment, and physical sensations towards the real left hand. Participants rated their agreement on 18 items ranging from − 3 to + 3. The initial ten questions pertained to the sense of ownership, the sense of agency, and the colocation of the real and seen hands (i.e., embodiment subscale). The subsequent six items constituted the disembodiment subscale, aimed at capturing two subcomponents of the disembodiment experience: Loss-of-own-hand (i.e., loss of control and position sense of one’s own hand) and Movement (i.e., perception of the movement of one’s hand toward the seen hand and vice versa). Finally, the last two questions addressed physical sensations (i.e., physical sensations subscale). Also, at the end of each condition, we checked whether participants explicitly recognized the difference in delay between the two conditions by asking them if the tactile and visual feedback were simultaneous (“Was the touch you felt on your hand and the touch you saw through the HMD synchronous?”, similar to previous work8). Finally, the participants’ forearm length was measured (i.e., distance from the middle fingertip to the olecranon).

Table 1 Embodiment Scale (adapted from Romano et al.)38.

Each participant performed two visuo–tactile conditions and eight blocks of the Forearm Bisection Task in total (i.e., two blocks before and two blocks after each visuo–tactile stimulation, one block for each forearm).

Analysis. Embodiment Scale responses were transformed into Z-scores39,40,41 using an ipsatization procedure to control for any response bias in participants’ questionnaire ratings. The scores of the three subscales of the Embodiment Scale were calculated by averaging the scores of each scale’s items38. Each subscale (i.e., embodiment, disembodiment, and physical sensations) was then compared across the two visuo–tactile conditions (synchronous and asynchronous) using Paired Sample t-tests.
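For illustration, a common implementation of ipsatization is within-participant standardization of the raw ratings; the following is only a minimal sketch under that assumption, with a hypothetical long-format data frame (ratings) and hypothetical column names, since the original procedure may have differed in detail.

```r
library(dplyr)

# Hypothetical long-format data: one row per item rating
# columns: participant, condition, subscale, item, rating (-3 .. +3)
ipsatized <- ratings %>%
  group_by(participant) %>%                                     # standardize within each participant
  mutate(z_rating = (rating - mean(rating)) / sd(rating)) %>%   # (here across all items/conditions; the paper may differ)
  ungroup()

# Subscale scores: mean of the ipsatized items for each subscale and condition
subscale_scores <- ipsatized %>%
  group_by(participant, condition, subscale) %>%
  summarise(score = mean(z_rating), .groups = "drop")
```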

The Forearm Bisection Task scores were transformed into percentage deviation scores by dividing each estimation (i.e., subjective midpoint) by the actual arm length and multiplying the result by 100 (% deviation = subjective midpoint/arm length × 100). If the deviation was more than 50%, it indicated a proximal shift from the actual midpoint (i.e., deviation toward the elbow); if the deviation was less than 50%, the shift was distal (i.e., deviation toward the finger)32,33. Then, the difference between the bisection percentage before and after the two visuo–tactile stimulation conditions was calculated (i.e., Bisection Shift). A positive value meant a distal shift towards the fingers, while a negative value indicated a proximal shift towards the elbow.
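For illustration, this scoring step could be implemented along the following lines. This is a sketch assuming a hypothetical trial-level data frame (bisection) and hypothetical column names; the pre-minus-post direction follows the sign convention stated above (positive = distal shift, i.e., a decrease in the deviation percentage).

```r
library(dplyr)
library(tidyr)

# Hypothetical data: one row per bisection trial
# columns: participant, condition, forearm, phase ("pre"/"post"),
#          subjective_midpoint_cm (measured from the fingertip), arm_length_cm
bisection_shift <- bisection %>%
  mutate(pct_deviation = subjective_midpoint_cm / arm_length_cm * 100) %>%  # > 50% proximal, < 50% distal
  group_by(participant, condition, forearm, phase) %>%
  summarise(mean_pct = mean(pct_deviation), .groups = "drop") %>%
  pivot_wider(names_from = phase, values_from = mean_pct) %>%
  mutate(bisection_shift = pre - post)   # pre minus post: positive = distal shift (toward the fingers)
```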

We thus measured the influence of the visuo–tactile stimulation on the Bisection Shift using a Linear Mixed Model42 (LMM) with the maximal random-effect structure43. Condition (Asynchronous vs. Synchronous), Forearm (Left vs. Right), and their interaction were entered as fixed factors of the LMM. Random intercepts and slopes were estimated for the Condition and Forearm factors within participants. We included participants as a random-effect variable to properly account for inter-subject variability. F tests with Satterthwaite’s approximation of degrees of freedom were used to assess statistical significance, and α was set to 0.05. We then considered mean values and 95% Confidence Intervals (CI) to explore significant effects and interactions44,45. We report the marginal R2 (R2m) to express the variance explained by the fixed effects and the conditional R2 (R2c) to express the variance explained by both the fixed and random effects of the overall model. In addition, we performed a correlation analysis to investigate the relationships between subjective ratings on the Embodiment Scale and the Bisection Shift in the forearm’s metric bisection. Analyses were performed using Jamovi (Version 1.6.23.0)46 and the lme4, lmerTest, and MuMIn packages47,48 of the R software (R Core Team 2016)49.
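A minimal sketch of how such a model could be specified with the lme4/lmerTest/MuMIn packages is shown below; the data frame (bisection_shift_data) and column names are hypothetical, and the exact specification used in the original analysis may differ.

```r
library(lme4)      # linear mixed models
library(lmerTest)  # Satterthwaite-approximated degrees of freedom for F tests
library(MuMIn)     # marginal (R2m) and conditional (R2c) R-squared

# Condition x Forearm as fixed effects; by-participant random intercepts
# and random slopes for Condition and Forearm (maximal random-effect structure)
model <- lmer(bisection_shift ~ condition * forearm +
                (1 + condition + forearm | participant),
              data = bisection_shift_data)

anova(model)           # F tests with Satterthwaite's method (alpha = .05)
r.squaredGLMM(model)   # returns R2m and R2c for the fitted model
```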

Results and discussion

Only four participants failed to perceive the difference in visuo–tactile synchrony between the experimental and control conditions; they were removed from the analysis as non-responsive to the critical experimental manipulation.

First, Paired Sample t-tests revealed a significant difference between the synchronous and asynchronous conditions in the feelings of embodiment, disembodiment, and physical sensation toward one’s hand (Embodiment: t(21) = 3.10, p = 0.005, Cohen’s d = 0.66; Disembodiment: t(21) = − 2.69, p = 0.014, Cohen’s d = − 0.57; Physical Sensation: t(21) = − 2.30, p = 0.031, Cohen’s d = − 0.49). Specifically, the Embodiment subscale score was lower in the asynchronous condition than in the synchronous one, suggesting a decrease in the feelings of ownership toward one’s hand (Asynchronous: M = 0.36, SE = 0.07, 95% CI [0.21, 0.51]; Synchronous: M = 0.73, SE = 0.09, 95% CI [0.55, 0.91]). Also, the Disembodiment subscale score was higher in the asynchronous condition (Asynchronous: M = − 0.82, SE = 0.11, 95% CI [− 1.04, − 0.60]; Synchronous: M = − 1.06, SE = 0.08, 95% CI [− 1.24, − 0.89]). The Physical Sensation subscale score followed a similar trend (Asynchronous: M = − 0.53, SE = 0.10, 95% CI [− 0.75, − 0.31]; Synchronous: M = − 0.74, SE = 0.10, 95% CI [− 0.95, − 0.54]). We further explored the Disembodiment subscale by considering its two subcomponents38,50: Loss-of-own-hand (i.e., loss of control and position sense of one’s own hand) and Movement (i.e., perception of the movement of one’s hand toward the virtual hand and vice versa). In contrast to the traditional Rubber Hand Illusion paradigm38,50, in the present experiment the real and the virtual hand were placed in the same position. Consequently, we did not expect participants to perceive any approaching feelings between the virtual and real hand positions, and the items related to the Movement subcomponent could be considered control questions. Indeed, only the Loss-of-own-hand component was significant (t(21) = − 3.23, p = 0.004, Cohen’s d = − 0.69): the scores were higher in the asynchronous condition (Asynchronous: M = − 0.73, SE = 0.13, 95% CI [− 1.00, − 0.46]; Synchronous: M = − 1.04, SE = 0.11, 95% CI [− 1.28, − 0.81]), revealing stronger feelings of loss of one’s hand. In contrast, the effect on the Movement component was not significant, confirming our prediction (p > 0.53). The plotted results of these analyses can be found in the Supplementary Materials (Supplementary Figs. 1 and 2).

Then, the Forearm Bisection Shift analysis revealed the effect of the visuo–tactile stimulation on body metrics. The interaction between Condition and Forearm was significant (F(1,813) = 10.94; p < 0.001). The visuo–tactile stimulation of the left forearm induced a distal shift in the Asynchronous condition (M = 1.75%; SE = 1.04%; 95% CI [− 0.40, 3.91]) but not in the Synchronous condition (M = − 0.28%; SE = 0.69%; 95% CI [− 1.70, 1.15]). Furthermore, the Bisection Shift of the right forearm, which was not stimulated, was comparable between the Asynchronous (M = − 0.44%; SE = 0.91%; 95% CI [− 2.33, 1.45]) and Synchronous conditions (M = − 0.50%; SE = 0.73%; 95% CI [− 2.01, 1.03]), with percentage scores around 0 (Fig. 2). The main effects of Forearm and Condition were not significant (p > 0.17). In the overall model, fixed and random effects together explained 43.26% of the variance (R2c), while the fixed effects alone explained 2.56% (R2m).

Figure 2
figure 2

Results of Experiment 1: effects of the visuo–tactile stimulation on Forearm Bisection Shift. Comparison of the percentage of the Forearm Bisection Shift (Pre–Post visuo–tactile stimulation) between Condition (Synchronous and Asynchronous) and Forearm (Left and Right). Positive values indicate a distal shift (toward the fingers), negative values indicate a proximal shift (toward the elbow). Lines indicate Confidence Intervals set at 95%.

Correlation analysis revealed no significant correlation between participants’ altered feelings of embodiment toward their own hand and the magnitude of the Forearm Bisection Shift (all p > 0.23).

Experiment 1 showed that the visuo–tactile manipulation effectively decreased embodiment feelings and increased disembodiment and physical sensations after the asynchronous condition. The significance of the Loss-of-own-hand component of the disembodiment subscale indicated a specific modulation of the feelings related to the loss of control and position sense of one’s hand, rather than of the perception of a movement in space (i.e., Movement component). Moreover, the visuo–tactile stimulation also modulated the estimation of the forearm midpoint, altering the BS representation. Specifically, a shift in the midpoint estimation was present only after the Asynchronous stimulation of the left forearm. Thus, the visuo–tactile manipulation seemed to affect the BS representation only of the body part involved in the multisensory conflict, i.e., the left forearm, and not of the contralateral arm.

Accordingly, in Experiment 2, we verified the effect of the multisensory conflict on the PPS, considering only the manipulated forearm.

Experiment 2

Methods

Participants. A new sample of twenty-seven participants (16 female; mean age = 23.92 years, SD = 3.43) took part in Experiment 2; none had been involved in Experiment 1. The exclusion criteria were the same as in Experiment 1. All participants gave written informed consent to participate in the study, which was approved by the Ethical Committee of the University of Milano-Bicocca (Protocol Number: RM-2020–337) and conducted in accordance with the standards of the Helsinki Declaration37. All methods were performed following the relevant guidelines and regulations set by the Ethical Committee. Informed consent was obtained to publish the images in an online open-access publication. One participant was removed from the sample due to a technical problem during the experiment (N = 26).

Procedure and experimental design. The experiment consisted of a Pre-experimental Session and an Experimental Session. The Pre-experimental Session determined the individual reachability threshold using the Reachability Judgment Task. The Experimental Session evaluated the reaching space extension before and after two visuo–tactile stimulation conditions (Synchronous vs. Asynchronous), following the same experimental design as Experiment 1.

During the Pre-Experimental Session, the Reachability Judgment Task was implemented in a virtual reality scenario using 360° pictures (Insta360 ONE X2) through the Unity 3D graphic engine (2018). The photos were taken in the experiment room from the perspective of a mannequin sitting in front of a table (the distance from the camera lens to the floor was 120 cm; the distance between the dummy’s body and the top of the table was 35 cm). Participants sat comfortably at a fixed distance from a table. They wore a black cape to mimic the posture and appearance of the mannequin, thus increasing the sense of embodiment toward the virtual body, the realism of the setting, and immersion in the virtual environment (Fig. 3a). Initially, participants actually performed five grasping movements with the left hand toward a blue parallelepiped (3 × 3 × 6 cm) placed 15 cm from the edge of the table. This procedure reinforced the movement that had to be imagined during the experimental task35. Then, participants wore an HMD (Oculus Quest 2). To induce a sense of embodiment over the virtual body and reduce possible discomfort due to the virtual scenario, participants underwent a habituation phase in which they were allowed to explore and observe the virtual room and the virtual body for 60 s. Then, thirty-three different 360° photos were randomly presented through the HMD connected to the computer (OMEN X 900-293 nl Desktop, Intel Core i7-7800X, 16 GB RAM, NVIDIA GeForce RTX 2080Ti 11 GB). Each photo showed a blue parallelepiped placed on the table along the sagittal body-midline axis in one of the thirty-two possible positions (from the edge of the table, 0 cm, up to 80 cm, in steps of 2.5 cm). Participants had to press the right or left pedal (response mapping counterbalanced across participants) to evaluate as quickly and accurately as possible whether the virtual object presented was reachable or unreachable with their left hand (Fig. 3b). The object disappeared as soon as the response was provided. A grey screen mask was displayed for a variable time, from 1500 to 2000 ms, between one stimulus and the next. Each object position was presented five times, for a total of 160 trials. At the end of the experiment, the actual length of the arm (i.e., the distance from the acromion to the tip of the middle finger) and the maximum reachable distance (from the edge of the table to the farthest point reachable by stretching the arm) of each participant were measured.

Figure 3
figure 3

Procedure of Experiment 2. (a) Posture and position of participants during the Reachability Judgment Task. (b) Experimental design: before and after the visuo–tactile stimulation, a Reachability Judgment Task was performed. Participants judged whether or not a virtual stimulus was reachable with their left hand without performing any actual movement with the arm. During the visuo–tactile stimulation, participants placed their hands on a support on their legs and observed their left hand being stroked by a paintbrush in the synchronous and asynchronous conditions. At the end of each condition, the Embodiment Scale was filled in. The order of the visuo–tactile conditions was counterbalanced across participants.

Then, participants performed the Experimental Session on a different day, one week after the Pre-Experimental Session. In the Experimental Session, the Reachability Judgment Task was performed before and after the two visuo–tactile stimulation conditions. The task was the same as in the previous session, except for the positions at which the target was presented, which were tailored to each participant’s individual reachability threshold determined during the Pre-experimental Session (i.e., reference distance). Thus, thirteen object positions were proposed, including the reference distance: six reachable and six unreachable distances, placed before and after the individual threshold, respectively. The presentation of each distance was repeated five times in a randomized order, for a total of 65 trials. Participants wore the black cape and the HMD and, as in Experiment 1, underwent the visuo–tactile stimulation in two conditions (Synchronous and Asynchronous). Although the procedure for the visuo–tactile stimulation was the same as in Experiment 1, the position of the participants’ hands changed (Fig. 3b): to prevent a conflict between the position of the virtual body and that of the real one during the Reachability Judgment Task, participants placed their hands on a support located on their legs and held this fixed position throughout the experiment. At the end of each visuo–tactile condition, participants completed the Embodiment Scale. At the end of the experiment, as in Experiment 1, we checked whether participants explicitly recognized the difference in delay between the two conditions by asking them if the tactile and visual feedback were simultaneous.

Analysis. In the Pre-experimental Session, we calculated the point of subjective equality (PSE) for each participant, indicating the individual threshold of virtual reachability. Within each block, the positive responses (i.e., “the object is reachable”) were summed for each of the 32 distances (from 0 to 80 cm), with a maximum value of 5 (all positive responses). The PSE was extracted from each participant’s psychometric function, obtained by plotting the proportion of responses in which the object was judged as reachable as a function of distance. Data points were fitted with a logistic function using the following equation:

$$P = \frac{1}{1 + e^{-(\alpha + \beta x)}}$$
(1)

where P is the proportion of “reachable” responses, x is the distance at which the object was presented, α is the intercept, and β is the slope of the psychometric function. These estimated coefficients were used to calculate the PSE (− α/β, the negative ratio of the two parameters), i.e., the distance at which participants start to report the object as reachable more than 50% of the time (the reachability threshold).
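For illustration, this fit and the PSE extraction could be carried out per participant along the following lines; the sketch assumes a hypothetical trial-level data frame (judgments) with a binary response column, and uses the parameterization logit(P) = α + βx, so that P = 0.5 at x = − α/β.

```r
# Hypothetical per-participant data: one row per trial,
# reachable coded 1 ("reachable") / 0 ("unreachable"), distance_cm = object distance
fit <- glm(reachable ~ distance_cm, family = binomial(link = "logit"),
           data = judgments)

alpha <- coef(fit)[["(Intercept)"]]  # intercept of the logistic function
beta  <- coef(fit)[["distance_cm"]]  # slope of the logistic function
pse   <- -alpha / beta               # distance at which P("reachable") = 0.5
```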

The Embodiment Scale responses in the Experimental Session were analysed as in Experiment 1. Participants who did not perceive any difference in the delay in the asynchronous condition were discarded from the analysis. Regarding the Reachability Judgment Task, we performed the same analysis as in the Pre-experimental Session, but considering the thirteen distances. Thus, the PSE was computed for each participant, Condition (Synchronous and Asynchronous), and Session (Pre and Post visuo–tactile stimulation). We considered the difference between the post- and pre-stimulation thresholds (i.e., Reachability Shift = Post − Pre visuo–tactile stimulation) as the dependent variable. Then, we verified the effect of the visuo–tactile stimulation on the reaching estimation (i.e., Reachability Shift). Here, we applied a General Linear Model (GLM), since an LMM did not improve the data fit. Data were inspected for outliers in each condition: points that fell outside ± 2.5 SD from the participants’ Reachability Shift mean were discarded from the analysis; accordingly, three more participants were excluded. Since the normality of the data was confirmed by the Shapiro–Wilk test for both variables (p > 0.28), a Paired Sample t-test on the dependent variable Reachability Shift was performed to compare the two visuo–tactile conditions. We also conducted a correlation analysis to investigate the relationships between subjective ratings on the Embodiment Scale and the Reachability Shift.
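A sketch of the outlier screening and paired comparison described above is given below; the data frame (shift) and column names are hypothetical and only illustrate the steps, not the original script.

```r
# Hypothetical data frame 'shift': one Reachability Shift (post - pre threshold)
# per participant and condition
grand_mean <- mean(shift$reachability_shift)
grand_sd   <- sd(shift$reachability_shift)

# Drop participants with any value beyond +/- 2.5 SD of the mean
outliers    <- unique(shift$participant[abs(shift$reachability_shift - grand_mean) > 2.5 * grand_sd])
shift_clean <- subset(shift, !(participant %in% outliers))
shift_clean <- shift_clean[order(shift_clean$participant, shift_clean$condition), ]  # align pairs

async <- shift_clean$reachability_shift[shift_clean$condition == "Asynchronous"]
sync  <- shift_clean$reachability_shift[shift_clean$condition == "Synchronous"]

shapiro.test(async); shapiro.test(sync)   # normality check for both variables
t.test(async, sync, paired = TRUE)        # paired comparison of the two conditions
```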

Results and discussion

The PSE calculated for each participant during the Pre-experimental Session (M = 454.0, SE = 20.3) represented the individual virtual reachability threshold. Participants’ maximum virtual reachability significantly differed from the real one (t(22) = 3.83, p < 0.001, Cohen’s d = 0.80): in line with previous work51,52, participants tended to overestimate their action possibilities (mean difference = 90.0, SE of the difference = 23.5). In the Experimental Session, we used the individual PSE as the reference distance for the Reachability Judgment Task.

Regarding the Experimental Session, only one participant, who did not perceive any difference between the two conditions, was excluded from further analysis. Furthermore, a difference between the two visuo–tactile conditions was found in the Embodiment subscale (t(25) = − 3.22, p = 0.004, Cohen’s d = − 0.63), revealing lower feelings of embodiment during the asynchronous condition (Asynchronous: M = 0.46, SE = 0.08, 95% CI [0.28, 0.63]; Synchronous: M = 0.79, SE = 0.04, 95% CI [0.71, 0.87]). However, no significant differences emerged in the Disembodiment (t(25) = 1.89, p = 0.070, Cohen’s d = 0.07) and Physical Sensation (t(25) = 0.99, p = 0.331, Cohen’s d = 0.11) subscales. As in Experiment 1, we further explored the Disembodiment subscale by considering its two subcomponents separately. A significant difference between the two conditions emerged only in the Loss-of-own-hand component (t(25) = 2.37, p = 0.026, Cohen’s d = − 0.46): the scores were higher in the asynchronous condition (Asynchronous: M = − 1.04, SE = 0.07, 95% CI [− 1.19, − 0.89]; Synchronous: M = − 1.25, SE = 0.07, 95% CI [− 1.41, − 1.10]), revealing a stronger feeling of loss of one’s hand. The Movement subcomponent (i.e., the perception of a movement of one’s hand toward the virtual hand and vice versa) did not show a significant difference (p > 0.99). The match between the position of the participant’s hand and that of the hand observed through the HMD during the visuo–tactile stimulation could be the reason for this lack of difference. Thus, the Loss-of-own-hand component seems more sensitive than the Movement component to the difference between synchronous and asynchronous stimulation in this experimental setting. The plotted results of these analyses are reported in the Supplementary Materials (Supplementary Figs. 3 and 4).

Considering the effect of the visuo–tactile stimulation on the reachability judgments, we found a significant difference in the Reachability Shift between the Asynchronous and Synchronous conditions (t(21) = − 2.27, p = 0.034, Cohen’s d = − 0.48). In the Synchronous condition, values were positive (see Fig. 4), suggesting an extension of the reaching space after simultaneous stroking (M = 7.09 cm, SE = 4.69 cm, 95% CI [− 2.65, 16.83]).

Figure 4
figure 4

Results of Experiment 2: effects of the visuo–tactile stimulation on Reachability Shift (Post–Pre visuo–tactile stimulation). Comparison of the Reachability Shift between the two conditions (Synchronous and Asynchronous). Positive values indicate an extension of the reaching space after the visuo–tactile stimulation, while negative values indicate a reduction. Lines indicate Confidence Intervals set at 95%.

In contrast, the asynchrony between visual and tactile feedback reduced the reaching space, as indicated by negative values (M = − 8.50 cm, SE = 3.84 cm, 95% CI [− 16.48, − 0.52]). Again, correlation analysis revealed that the alteration of participants’ feelings of embodiment toward their own hand was not significantly correlated with the magnitude of the Reachability Shift (all p > 0.10).

Experiment 2 confirmed that asynchronous stroking can change bodily self-perception, specifically decreasing feelings of embodiment and increasing the perception of loss of one’s hand. However, unlike in Experiment 1, we did not find a significant overall effect in the disembodiment and physical sensations subscales. These differences could be due to the experimental setting during the visuo–tactile stimulation. It is plausible that the different posture of the hands, closer to one’s body in Experiment 2, may have reinforced cues coming from one’s body, reducing disembodiment feelings and physical sensations. This result suggests that the illusion may be highly susceptible to contextual modifications. It is also plausible that the Physical Sensation subscale is less sensitive to experimental manipulation, and more variable, than the Embodiment subscale and the Loss-of-own-hand component, as reported in other studies38.

Furthermore, the results showed that asynchronous visuo–tactile stimulation can reduce one’s reachability space (i.e., the space of one’s action potentialities toward objects), whereas synchronous stimulation would instead increase the reachability space, extending the space of interaction with objects.

General discussion

In this study, we induced changes in the perception of one’s body in space, and of the space around it, through a multisensory conflict. In Experiment 1, we found that a multisensory conflict can affect feelings of ownership towards one’s body part and the metric representation of the body. This result suggests that disrupting the coherent multisensory integration of tactile and visual information may decrease feelings of ownership toward one’s body part and, at the same time, make the dynamic sensorimotor representation of the hand in space (i.e., BS) less defined. In Experiment 2, we also found that the mismatch between tactile and visual information can influence the PPS, decreasing the perceived potentiality of actions towards objects (i.e., the reaching space).

This work confirmed that a temporal mismatch between visual and tactile stimuli affects the subjective sense of the bodily self by decreasing the sense of ownership and enhancing feelings of loss toward one’s hand5,7,8. Furthermore, we found that the multisensory conflict seems to affect body metric representation, as evidenced by the change in the forearm midpoint estimation. This result suggests that BS and BO are governed by principles of temporal and spatial congruence in the integration of multisensory signals, which are crucial to building a proprioceptive “skeleton” centered on multisensory information about the body10,27. Previous studies showed a close link between the sense of BO and BS2,16,17,53,54. Indeed, a failure to update a coherent representation of one’s body in space due to degraded signals could result in a sense of disownership54. In line with the previous literature, since BO can be considered a property of the multisensory integration space, this result could be related to the response of multisensory neurons1,4,6. BO could emerge as a result of a correct updating of the body position in space, i.e., the centering of the multisensory neurons involved in BO and PPS representations31. Thus, the mismatch between visual and tactile stimuli would produce a more labile and less defined body representation. These results confirm the extreme dynamism and plasticity of some aspects of our body representation and how it is continuously updated by the incoming information. At the same time, BS also represents the expectation of how the senses should be integrated over time16. Therefore, in the absence of a visuo–tactile mismatch, information would be incorporated in the expected way, according to previous experience and a coherent body representation, and the BS would not undergo any changes, remaining consistent over time.

The perceptual mismatch also seems to affect the representation of the space around the body: reachability judgments are influenced by the coherence of incoming sensory information. In a well-known view, PPS is conceived as a multisensory space anchored to specific body parts, i.e., the space of body-object interactions27,28. The object’s distance would be judged as if it were farther away, and the reaching space would be reduced, probably because the multisensory conflict would not allow a correct update of the body representation, decreasing the chance of the body coming into contact with objects. In contrast, the feeling of ownership toward the seen hand is confirmed and reinforced by continuous and consistent stroking, enhancing the precision of the estimation of potential reaching. This result is in line with previous work showing that changes in body representation due to multisensory stimulation affect the PPS according to the perceived body location55,56. The present study further reveals that disrupting the coherence of incoming sensory information is enough to modify the PPS, even in the absence of a direct modulation of the perceived body location, confirming the close relationship between multisensory conflict, BO, and PPS.

This work showed that BO, BS, and PPS are strictly linked. A less defined body representation, with a decreased sense of ownership, would also affect the space of hand-object interactions. However, we did not find an explicit correlation between the change in the sense of body ownership and the changes in BS and PPS (similar to other bodily illusions57,58). It is plausible that implicit and automatic features related to the multisensory integration of online sensory cues would better explain the BS and PPS modulation, and it is also conceivable that subjective feelings of body ownership may reflect more cognitive aspects57. We can speculate that the current modulation in healthy participants may not fully replicate the disownership feelings observed in brain-damaged patients. It could be interesting to propose the same experimental paradigm to patients suffering from BO disorders (i.e., somatoparaphrenia), characterized by a pervasive sense of disownership. Previous studies have shown the effectiveness of multisensory stimulation in modulating somatoparaphrenia59,60. Also, feelings of disownership would affect the encoding of space. Indeed, in somatoparaphrenic patients, who do not consider the arm to belong to themselves, the response to noxious stimuli is reduced, indicating an altered relationship between body and space61. Although BS and PPS are closely related and based on partly common mechanisms, they can be considered partially independent10 and dissociable62. Indeed, some apparent discrepancies between the two constructs emerged in the present work. While the visuo–tactile mismatch influenced both BS and PPS, the synchronous and coherent stimulation affected only the PPS. In the condition of synchrony between tactile and visual cues, the metric perception of one’s body would not be explicitly modulated, probably because BS also relies on the expectation of how different sensory signals are integrated16. The influence of this synchronous and continuous sensory stimulation would instead emerge during reachability estimates: the richer and more coherently integrated representation of one’s body would increase the perceived potential for action. Therefore, BS is the “skeleton” required to support the PPS representation, even if it is not sufficient on its own to account for the additional signals integrated into it10,27. Future studies are necessary to clarify the effect of synchronous stimulation on the PPS.

In conclusion, the results suggest that multisensory conflicts influence the subjective sense of the bodily self and alter the relation between body and space, affecting the representation of one’s body in space and of the surrounding space in terms of action potentialities. This work emphasizes the close relationship between BO, BS, and PPS.