Perceived location of touch

From Scholarpedia
Jack Brooks and Jared Medina (2017), Scholarpedia, 12(4):42285. doi:10.4249/scholarpedia.42285 revision #186293

Curator: Jared Medina

 



Figure 1: The perceived location of touch likely arises from processing the stimulus in a serial manner – from detection to location in external space.

Perceiving the location of touch on our skin is a surprisingly complex process. Signals from cutaneous receptors in the skin are transmitted by afferent neurons, via the thalamus, to primary somatosensory cortex (S1). Given noise both at the receptors and in neural transmission, the brain must decide whether a stimulus was presented and, if so, where it is located. S1 is organized as a topographic map in which adjacent locations on the skin are represented next to one another. These primary representations have been shown to be plastic – a neuron in S1 may represent touch at one location on the skin surface at one point in time, but represent touch at a different location after cortical or peripheral changes. Therefore, further processing is required to relate activity in these primary representations to a sensation at a specific position on the skin surface. This is achieved by integration with higher-order body representations. Finally, touch needs to be localized not only to a position on the skin surface, but also in reference frames relative to the body and the external environment. The location of touch can therefore be represented in a number of different representations, each with its own reference frame. We will present evidence on the different stages of processing and representation involved in localizing touch.

Detection of touch and localization: A serial process

There are two major hypotheses regarding conscious detection of touch and its localization. The first holds that detection and localization are completely dissociable processes. If this were the case, one would expect to find individuals who can detect but not localize touch, along with individuals who can localize but not detect touch. A number of individuals have been reported with intact tactile detection but poor tactile localization (Anema et al. 2009; Halligan et al. 1995; Paillard, Michel & Stelmach, 1983). Evidence for the second arm of this double dissociation comes from a disorder known as numbsense, in which a touch that cannot be detected can nonetheless be localized. The first case was documented in 1983 in a patient with left parietal damage who was apparently unable to feel touch on her right hand (Paillard, Michel & Stelmach, 1983). In an experiment designed to examine her localization abilities, she was instructed to point to where she was touched on her right hand, even if she could not feel the touch. Her responses were broadly accurate, as she was able to "point approximately to the locus of stimulation". This observation was taken as evidence for a dissociation between tactile detection and localization.

Figure 2: A. In typical numbsense tasks, decision criteria for detection and localization can vary. For example, when an individual is asked whether they felt a touch, responses to the same stimulus can vary based on the decision criterion used on that trial. Furthermore, the binary coding of a localization response (R) to a stimulus (S) as correct or incorrect can vary based on the criterion set by the experimenter. If decision criteria are not matched across tasks, an apparent dissociation between detection and localization performance can arise from differing criteria rather than from separable processes.

However, there are a number of concerns regarding whether this evidence provides strong support for the hypothesis that tactile detection and localization are separable processes. The primary concern is the criteria used for a correct response in detection versus localization tasks. First, the claim of intact localization depends strongly on what is coded as a "correct" localization response. If the criterion is being within 20 mm of the target location, the patient in Paillard's study was correct on only 26% of localization trials, in contrast to 90% in a healthy control participant. Furthermore, the criterion for detecting touch can also vary. When presented with a near-threshold stimulus, participants need to decide whether a diminished sensation was caused by a tactile stimulus. If a participant only reports touch that has similar intensity and qualia to touch presented to the intact hand, they may not report feeling the stimulus. Supporting this, Paillard reported that the patient described static pressure on her right hand as an "event" in later testing sessions, in contrast to earlier testing in which she reported feeling no touch. If these "events" were felt but not reported in earlier sessions, it is possible that she could both detect and localize touch. Overall, it is unclear whether the numbsense of Paillard's patient is evidence for a detection/localization dissociation, or can be explained by a different mechanism (see figure 2; see also Signal to noise ratio in neuroscience).

In opposition to the hypothesis that tactile detection and localization are separable, a second, serial hypothesis has been proposed (see Harris, Thein & Clifford, 2004). In this serial hypothesis, participants first detect whether a tactile stimulus has been presented, and then localize it; detection is therefore necessary for tactile localization. How, then, could numbsense-like performance be observed? This was examined by instructing healthy individuals to point to near-threshold touch on the fingers (Harris, 2006). Participants were presented with a brief tactile stimulus, followed by a backwards mask, making tactile detection difficult. Detection was measured using a yes-no method, whereas localization was measured using a forced-choice method in which participants had to declare which finger was touched. Participants were able to localize stimuli that they apparently did not detect. This finding, similar to numbsense, appeared to be evidence for parallel processing. However, Harris and colleagues noted an important difference between the detection and localization tasks. The yes-no detection task requires adopting an arbitrary detection criterion, such that a touch may have been detected but not declared as detected; no such criterion is involved in the forced-choice localization task. A conservative threshold for a "yes" response could result in participants not reporting touch even though they have sufficient information to detect, and even localize, it. The experiment was therefore repeated using a forced-choice task for both tactile detection and localization, removing the problem of different decision criteria across tasks. With forced-choice detection and localization tasks, participants could not localize stimuli that they did not detect. These results suggest that dissociations observed in individuals with numbsense may be due to different criteria being used in localization versus detection tasks (Medina & Coslett, 2016). Future studies of potential numbsense will need larger sample sizes and equivalent reporting methods to allow comparison between studies.
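
The criterion account can be illustrated with a standard signal detection simulation. The following sketch is our own illustration of the general point, not a model from the cited studies; the signal strength, criterion, and two-finger setup are arbitrary assumptions. It shows how a conservative yes-no criterion, combined with forced-choice localization, can produce above-chance localization of "undetected" touches without any separate localization pathway.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
d_prime = 1.0      # strength of a near-threshold stimulus (assumed)
criterion = 1.5    # conservative yes-no criterion (assumed)

# Each trial: a stimulus is applied to one of two fingers.
true_finger = rng.integers(0, 2, n_trials)

# Internal evidence at each finger: noise everywhere, plus signal at
# the touched finger.
evidence = rng.normal(0.0, 1.0, (n_trials, 2))
evidence[np.arange(n_trials), true_finger] += d_prime

# Yes-no detection: report "felt" only if evidence exceeds the criterion.
detected = evidence.max(axis=1) > criterion

# Forced-choice localization: always pick the finger with more evidence.
correct = evidence.argmax(axis=1) == true_finger

print(f"reported detection rate: {detected.mean():.2f}")
print(f"localization accuracy on 'undetected' trials: "
      f"{correct[~detected].mean():.2f}")
# Localization on "undetected" trials remains well above the 0.5 chance
# level, mimicking a numbsense-like dissociation purely through criteria.
```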

Somatosensory plasticity and localization

After stimulation of the skin surface, activity travels through the thalamus to primary somatosensory cortex (S1). Somatosensory cortex is organized topographically, such that (with a few exceptions) adjacent locations on the body are represented in neighboring locations on the map (see figure 3). Although the map is topographic, the relationship between the size of a skin region and the size of its cortical representation is not uniform across the body. These non-uniformities can arise through regional differences in the density of sensory innervation or in limb usage.

Somatosensory cortex is also plastic. In non-human primates, the reorganization of somatosensory cortex after amputation, skin island transfers, and other interventions is well studied (Merzenich & Jenkins, 1993). For example, when the third digit is amputated, the representations of the palm and adjacent digits expand into the vacated territory, so that the second and fourth digits come to share a border in the cortex. Intensive stimulation of the skin surface also results in cortical reorganization, and the spatial and temporal properties of the stimulation determine how S1 is reorganized. When the fingers of monkeys are stimulated simultaneously for a prolonged period, the finger representations move closer together, whereas sequential stimulation moves them apart (Wang et al. 1995). In humans, synchronous stimulation of the fingers also results in changes in S1. Braun and colleagues (Braun et al. 2000) touched participants simultaneously on the first and fifth digits for an hour a day, for a total of twenty hours. Afterwards, near-threshold touch on either finger was misattributed to the other finger at a much higher rate than before the training. Thus, increased usage not only induces topographic changes in S1, but also changes the perceived location of touch. Touch arising from self-generated movement can have similar effects: experienced piano players have much better two-point discrimination thresholds on the fingertips than non-musicians (Ragert et al. 2004), and their tactile acuity on the fingers shows a dose-dependent relationship with hours of practice.
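
A common way to think about these timing-dependent changes is correlation-based (Hebbian) plasticity: inputs that are active together come to drive overlapping cortical territory. The toy simulation below is our own sketch of that idea, not a model from the cited experiments; the network size, learning rate, and input statistics are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units = 20  # a toy row of cortical units

# Initial weights: half the units prefer digit 1, half prefer digit 5.
w0 = 0.05 * rng.random((n_units, 2))
w0[: n_units // 2, 0] += 1.0
w0[n_units // 2 :, 1] += 1.0

def stimulate(w, synchronous, steps=2000, lr=0.01):
    """Hebbian learning with weight normalization under two regimes."""
    w = w.copy()
    for _ in range(steps):
        if synchronous:
            x = np.array([1.0, 1.0])        # both digits touched together
        else:
            x = np.zeros(2)
            x[rng.integers(0, 2)] = 1.0     # one digit at a time
        y = w @ x                            # unit responses
        w += lr * np.outer(y, x)             # Hebbian update
        w /= np.linalg.norm(w, axis=1, keepdims=True)  # keep weights bounded
    return w

for sync in (True, False):
    w = stimulate(w0, synchronous=sync)
    # Overlap: how strongly units respond to *both* digits on average.
    overlap = (w[:, 0] * w[:, 1]).mean()
    print(f"synchronous={sync}: digit-1/digit-5 overlap = {overlap:.2f}")
# Synchronous stimulation pulls units toward responding to both digits
# (merged representations, and touches misattributed across fingers);
# asynchronous stimulation keeps the two representations segregated.
```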

Studies of cortical plasticity demonstrate extensive changes in S1 topography. However, less research has examined how such plasticity-driven changes in S1 topography relate to changes in perception. Given that S1 is plastic, the relationship between activity in a specific region of S1 and perceiving touch at a particular location on the skin surface cannot be fixed, with one set of neurons always representing touch at a specific location. There must be further processing that takes information from somatosensory regions and interprets it, such that conscious perception of touch location emerges. Very little is known about exactly how the brain interprets somatosensory activity as a particular tactile sensation. Some initial evidence comes from individuals with brain damage due to stroke.

Individuals who have had strokes affecting somatosensory regions often show reduced sensitivity to touch along with biases in tactile localization. For example, stroke patients often mislocalize tactile stimuli towards the center of the hand (Rapp, Hendel & Medina, 2002). Interestingly, healthy individuals show similar "central" biases when presented with near-threshold tactile stimuli: weak touch on the forearm is mislocalized toward its middle (Steenbergen et al. 2014). Why would individuals with somatosensory damage presented with suprathreshold stimuli, along with neurologically intact individuals presented with near-threshold stimuli, demonstrate such a central tendency? General models that explain spatial bias under uncertainty could explain these tactile localization biases. Huttenlocher and colleagues proposed the category adjustment model (Huttenlocher et al. 1991) to explain biases in spatial memory: remembered locations are biased towards the middle of a categorical space, and away from category boundaries, resulting in central error. Importantly, this central error increases as a function of uncertainty. In both cases (suprathreshold touch for brain-damaged individuals, and near-threshold touch for neurologically intact individuals), somatosensory information is noisy and uncertain. One possibility is that, in interpreting information from somatosensory regions, the brain uses similar heuristics to interpret noisy activation as touch at a particular location.
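
This idea can be made concrete as a simple Bayesian estimator in which the reported location is a reliability-weighted average of a noisy sensory estimate and a prior centered on the middle of the body part. The sketch below is our illustration of the general category adjustment logic, with arbitrary numbers rather than fitted parameters.

```python
import numpy as np

def mean_report(true_loc, sensory_sd, prior_mean=0.5, prior_sd=0.25,
                n=50_000, seed=2):
    """Average reported location on a limb (0 and 1 are its ends).

    Gaussian prior on the category center, Gaussian sensory likelihood:
    the posterior mean is a reliability-weighted average of the two.
    """
    rng = np.random.default_rng(seed)
    samples = rng.normal(true_loc, sensory_sd, n)    # noisy sensory estimates
    w = prior_sd**2 / (prior_sd**2 + sensory_sd**2)  # weight on the senses
    return (w * samples + (1 - w) * prior_mean).mean()

true_loc = 0.9  # touch near one end of the forearm (assumed)
for sd in (0.05, 0.15, 0.30):  # increasing sensory uncertainty (assumed)
    est = mean_report(true_loc, sd)
    print(f"sensory sd={sd:.2f}: mean report {est:.2f}, "
          f"central bias {true_loc - est:+.2f}")
# As sensory noise grows (near-threshold touch, or a damaged somatosensory
# system), reports are drawn further toward the middle of the limb.
```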

Figure 3: A depiction of the representation of the body in primary somatosensory cortex. This map, often referred to as the sensory homunculus, is distorted relative to the body. https://en.wikipedia.org/wiki/Cortical_homunculus#/media/File:Sensory_Homunculus.png

From S1 to Higher-Order Body Representations

Representations in S1 are distorted. However, our everyday experience is not of a distorted body: our fingers do not feel larger than our back, even though they have a larger cortical representation. Therefore, information from the "distorted" representation in S1 needs to be mapped onto a veridical representation of the skin surface. The majority view is that there are multiple representations of the body that are integrated with output from S1 (Schwoebel & Coslett, 2005). It has been proposed that higher-order body representations serve to link S1 activity to locations on the skin surface. One way to examine the relationship between tactile localization and higher-order representations is by manipulating perceived body size.

The Pinocchio illusion provides a key piece of evidence for the existence of such representations (Lackner, 1988). In this illusion, the participant holds their nose with one hand while the biceps brachii of that arm is vibrated; vibration engages muscle receptors, creating the illusion that the arm is extending. Given that the hand is touching the nose, and the hand is perceived as moving away from the face, the individual often experiences the nose as growing longer. One explanation is that the brain must reconcile conflicting information (the arm is moving forward, yet the hand is still touching the nose), and does so by updating a higher-order representation of body size and shape, resulting in perceived elongation of the nose. If these higher-order representations of body size and shape change, how does this affect tactile localization? That is, if someone perceives a body part to be longer or shorter, do they perceive concomitant changes in touch? Or are changes in perceived body size and shape separate from changes in tactile localization? In a variant of the Pinocchio illusion, when one finger is grasped by the other hand, vibration of the elbow flexors of the grasping arm creates the illusion that the finger is elongated. Using this illusion, de Vignemont and colleagues (de Vignemont et al. 2005) had participants judge the distance between two tactile stimuli presented either on the forehead (a reference judgment) or on the finger during the illusion. Tactile distances were perceived as longer on the finger, providing evidence that changes in perceived body size lead to changes in the perception of tactile distance. Visual information can also distort higher-order representations of body size. To test whether such distortions influence touch perception, an experiment was devised using distorted views of the body (Taylor-Clarke, Jacobsen & Haggard, 2004). Participants viewed their hand reduced and their forearm magnified for one hour, after which touch perception was tested. Perceived tactile distance was reduced on the finger and increased on the forearm. In conjunction with the vibration results, these observations suggest that changes in perceived body size result in changes in perceived tactile extent on the skin.
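
One minimal reading of these findings is a rescaling model in which perceived distance on the skin stretches by the same factor as the perceived body part. The snippet below is our own illustrative sketch with made-up numbers, not parameters from the cited studies.

```python
def perceived_distance(skin_distance_mm, actual_length_mm, perceived_length_mm):
    """Rescaling model: tactile extent scales with perceived limb size."""
    return skin_distance_mm * (perceived_length_mm / actual_length_mm)

# Hypothetical numbers: a finger felt as 25% longer during the illusion
# makes a 20 mm tactile distance feel like 25 mm.
print(perceived_distance(20.0, actual_length_mm=80.0, perceived_length_mm=100.0))
```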

Localization in external space

The studies reviewed so far have focused primarily on localizing touch at a location on the skin surface – usually referred to as a somatotopic frame of reference. In this frame of reference, the location of touch is represented relative to the skin surface itself, such that stimulation of the right index fingertip is coded the same way regardless of where that finger is in external space. However, further processing is required to account for stretch and movement of the skin during muscle contractions, movement of the limbs with respect to the body, and movement of the body with respect to the outside world. To act on the external world, one needs to know where a tactile stimulus is relative to a number of frames of reference beyond location on the skin surface. Generally speaking, these are called "external" frames of reference, and they are available in parallel with somatotopic reference frames for processing touch location. The spatial coordinates of these external reference frames can have different origins. A number of non-human primate studies have demonstrated that the body is encoded in various external frames of reference, including eye-centered reference frames for reach planning (Batista et al. 1999) and body-centered reference frames for limb movement (Lacquaniti et al. 1995; see also Colby, 1998 for a review). Studies of neurologically intact and brain-damaged individuals provide evidence for coding tactile location in various external reference frames.

Medina and Rapp (2008) reported an individual with left fronto-parietal damage who experienced tactile synchiria – a condition in which stimulation of the ipsilesional hand results in sensation on both hands. These phantom contralesional percepts following ipsilesional stimulation were highly localizable – more so than actual stimulation on the contralesional hand itself. If the lesion only affected somatotopic representations, then moving the hand into contralesional space should not change the rate of phantom sensations. In contrast, if the lesion affected external representations, then moving the hand should modulate the effect. They found that changing hand position changed the rate of phantom percepts, such that more phantoms were observed when the hands were in contralesional space (versus ipsilesional space) in both head- and trunk-centered reference frames (see also Bartolomeo et al. 2004; Moro, Zampini & Aglioti, 2004, for similar findings from tactile extinction).

A number of behavioural studies provide evidence for how tactile information might be encoded in both somatotopic and external reference frames (see also evidence from tactile temporal order judgment). Azañón and Soto-Faraco (2008) asked participants with crossed arms to judge the location of a visual stimulus preceded by a tactile cue. They found a crossmodal cueing effect, but, interestingly, at interstimulus intervals (ISIs) below 100 msec the cueing effect was significant when the tactile cue was on the hand opposite the visual stimulus (consistent with somatotopic coding), whereas at ISIs over 200 msec there was a significant cueing effect when cue and stimulus were on the same external side. These results could be explained by a "tactile remapping" process, in which tactile location is first represented in a somatotopic reference frame and then in an external frame (see Yamamoto & Kitazawa, 2001). However, a second hypothesis is that the transformation between reference frames occurs rapidly and is followed by integration of spatial information across frames. Brandes and Heed (2015) measured reach trajectories to targets on uncrossed or crossed feet, with the target stimulus presented after the reach was initiated. Reaches to visual targets were deflected towards the correct foot at 138 msec regardless of foot posture. Reaches to tactile targets on uncrossed feet were redirected at 158 msec; this additional 20 msec likely reflects remapping from somatotopic to external coordinates. When the feet were crossed, putting the somatotopic and external reference frames in conflict, the deflection was delayed to over 200 msec. This additional delay can be attributed to a spatial integration process, in which enough evidence about location has to accumulate before it can guide the reach. Although participants were generally accurate, they occasionally initiated a movement in the wrong direction. These observations are consistent with rapid coordinate transformation, such that the reference frames are concurrently available for the integration of spatial information, with each reference frame weighted individually in the integrative process of determining touch location (see Badde & Heed, 2016 for an in-depth discussion of this hypothesis).
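
The weighted-integration account can be sketched as combining a somatotopic code with a remapped external code, each contributing to the location estimate according to its weight. The code below is our own minimal illustration of the scheme discussed by Badde and Heed (2016); the specific weights and the crossed-hands coding are assumptions for illustration only.

```python
def integrate_side(somatotopic_side, external_side,
                   w_somatotopic=0.4, w_external=0.6):
    """Weighted integration of two location codes for a touched hand.

    Sides are coded -1 (left) and +1 (right); the output is graded
    evidence for responding "right". Weights are illustrative only.
    """
    return w_somatotopic * somatotopic_side + w_external * external_side

# Touch on the anatomically right hand (somatotopic code: +1).
print(integrate_side(+1, +1))  # uncrossed: hand in right space -> +1.0,
                               # the codes agree, responses are fast
print(integrate_side(+1, -1))  # crossed: hand in left space -> +0.2,
                               # weak conflicting evidence -> slower
                               # responses and occasional wrong-direction
                               # movements, as in Brandes & Heed (2015)
```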

There are no reports (that we are aware of) demonstrating that changes of hand position in head- or trunk-centered reference frames create errors in tactile localization on the skin surface (e.g. the participant is touched on the index finger, but feels it on the middle finger when the hand is in a different hemispace). However, there is evidence that changing finger position relative to the hand changes tactile localization. Coslett (1998) reported a brain-damaged individual who made more tactile localization errors on the contralesional hand when the fingers were spread apart than when they were close together. Haggard and colleagues (Haggard et al. 2006) designed a study in which participants had to verbally identify the finger or the hand that was touched. The hands were positioned either one above the other (postural condition) or with interlaced fingers (spatial condition). The postural manipulation did not influence finger or hand identification. In contrast, interlacing the fingers impaired the ability to name the hand that was touched, but not the finger that was touched. This pattern suggests that finger identification accesses a primarily somatotopic representation, whereas hand identification relies on an external representation. However, other studies find evidence that finger identification also refers to external space, though to a lesser degree than hand identification (Riemer et al. 2010), and a study using temporal order judgments confirms that the fingers are localized in external coordinates (Heed, Backhaus & Roder, 2012). Thus, hand and finger identification may use the same representations, but rely on them to different degrees. A related study by Overvliet and colleagues (Overvliet et al. 2011) found that performance on a forced-choice tactile localization task was best when the fingers were far apart, as opposed to close together or interleaved. This provides some evidence that external spatial representations – perhaps in a hand-centered reference frame – influence tactile localization on the skin surface.

Open questions

A number of challenges remain in understanding how touch is localized. The different representations of the body need to be explored further, as do their locations in the brain. For instance, it is not known whether the localization errors that occur in relatively simple tasks, such as pointing to landmarks on the hand, are a result of distorted body representations (Longo & Haggard, 2010) or of other central biases. If touch localization incorporates expectations, what are these prior expectations, and where are they represented neurally? Is touch localization optimised to minimise error (the mean), to select the most probable location (the mode), or for the goal of the task at hand? Furthermore, how do multisensory inputs influence touch location? Although localizing touch seems simple, it is a complex process with many open questions.

Currently, we can use relatively non-invasive techniques to induce rapid and robust distortions of body representations, but long-lasting changes have not been observed. Treatments are needed for patients with impaired touch localization and other disorders of body representation, of which over forty have been described (de Vignemont, 2010). Tactile illusions that change the size, shape, location, or ownership of a body part could have therapeutic potential for these disorders.

Acknowledgments

This material is based upon work supported by the National Science Foundation under grant no. 1632849.

References

  • Anema, H.A. et al. (2009). A double dissociation between somatosensory processing for perception and action. Neuropsychologia 47(6): 1615–1620. 
  • Azañón, E. & Soto-Faraco, S. (2008). Changing Reference Frames during the Encoding of Tactile Events. Current Biology 18(14): 1044–1049. 
  • Badde, S. & Heed, T. (2016). Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cognitive Neuropsychology 33(1–2): 26–47. 
  • Bartolomeo, P., Perri, R. & Gainotti, G. (2004). The influence of limb crossing on left tactile extinction. Journal of Neurology, Neurosurgery & Psychiatry 75(1): 49. 
  • Batista, A.P. et al. (1999). Reach Plans in Eye-Centered Coordinates. Science 285(5425): 257. 
  • Brandes, J. & Heed, T. (2015). Reach Trajectories Characterize Tactile Localization for Sensorimotor Decision Making. The Journal of Neuroscience 35(40): 13648. 
  • Braun, C. et al. (2000). Differential activation in somatosensory cortex for different discrimination tasks. The Journal of Neuroscience 20(1): 446–450. 
  • Colby, C.L. (1998). Action-oriented spatial reference frames in cortex. Neuron 20(1): 15-24. 
  • Coslett, H.B. (1998). Evidence for a Disturbance of the Body Schema in Neglect. Brain and Cognition 37(3): 527–544. 
  • Haggard, P. et al. (2006). The brain’s fingers and hands. Experimental Brain Research 172(1): 94–102. 
  • Halligan, P.W. et al. (1995). Sensory detection without localization. Neurocase 1(3): 259–266. 
  • Harris, J.A., Thein, T. & Clifford, C.W.G. (2004). Dissociating Detection from Localization of Tactile Stimuli. The Journal of Neuroscience 24(14): 3683. 
  • Harris, J.A. (2006). Localization of Tactile Stimuli Depends on Conscious Detection. Journal of Neuroscience 26(3): 948-952. 
  • Heed, T., Backhaus, J. & Roder, B. (2012). Integration of hand and finger location in external spatial coordinates for tactile localization. Journal of Experimental Psychology: Human Perception and Performance 38(2): 386–401. 
  • Huttenlocher, J., Hedges, L.V. & Duncan, S. (1991). Categories and particulars: Prototype effects in estimating spatial location. Psychological Review 98(3): 352–376. 
  • Lackner, J.R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain 111: 281-297. 
  • Lacquaniti, F. et al. (1995). Representing Spatial Information for Limb Movement: Role of Area 5 in the Monkey. Cerebral Cortex 5(5): 391–409. 
  • Longo, M.R. & Haggard, P. (2010). An implicit body representation underlying human position sense. Proceedings of the National Academy of Sciences of the United States of America 107(26): 11727. 
  • Medina, J. & Rapp, B. (2008). Phantom Tactile Sensations Modulated by Body Position. Current Biology 18(24): 1937–1942. 
  • Medina, J. & Coslett, H.B. (2016). What can errors tell us about body representations? Cognitive Neuropsychology 33(1–2): 5–25. 
  • Merzenich, M.M. & Jenkins, W.M. (1993). Reorganization of cortical representations of the hand following alterations of skin inputs induced by nerve injury, skin island transfers, and experience. Journal of Hand Therapy 6(2): 89–104. 
  • Moro, V., Zampini, M. & Aglioti, S.M. (2004). Changes in Spatial Position of Hands Modify Tactile Extinction but not Disownership of Contralesional Hand in Two Right Brain-Damaged Patients. Neurocase 10(6): 437–443. 
  • Overvliet, K.E., Anema, H.A., Brenner, E., Dijkerman, H.C. & Smeets, J.B.J. (2011). Relative finger position influences whether you can localize tactile stimuli. Experimental Brain Research 208(2): 245–255. 
  • Paillard, J., Michel, F. & Stelmach, G. (1983). Localization without content: A tactile analogue of "blind sight". Archives of Neurology 40(9): 548. 
  • Ragert, P. et al. (2004). Superior tactile performance and learning in professional pianists: evidence for meta‐plasticity in musicians. European Journal of Neuroscience 19(2): 473–478. 
  • Rapp, B., Hendel, S.K. & Medina, J. (2002). Remodeling of somatosensory hand representations following cerebral lesions in humans. NeuroReport 13(2): 207–211. 
  • Riemer, M. et al. (2010). Body posture affects tactile discrimination and identification of fingers and hands. Experimental Brain Research 206(1): 47-57. 
  • Schwoebel, J. & Coslett, H.B. (2005). Evidence for multiple, distinct representations of the human body. Journal of Cognitive Neuroscience 17(4): 543. 
  • Steenbergen, P. et al. (2014). Tactile localization depends on stimulus intensity. Experimental Brain Research 232(2): 597–607. 
  • Taylor-Clarke, M., Jacobsen, P. & Haggard, P. (2004). Keeping the world a constant size: object constancy in human touch. Nature Neuroscience 7(3): 219–220. 
  • De Vignemont, F. (2010). Body schema and body image-Pros and cons. Neuropsychologia 48(3): 669–680. 
  • De Vignemont, F., Ehrsson, H.H. & Haggard, P. (2005). Bodily illusions modulate tactile perception. Current Biology 15(14): 1286–1290. 
  • Wang, X. et al. (1995). Remodelling of hand representation in adult cortex determined by timing of tactile stimulation. Nature 378(6552): 71–75. 
  • Yamamoto, S. & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience 4(7): 759. 
