Functional Imaging and Robotics for Sensorimotor Transformation

There is evidence that the central nervous system processes tactile information similarly to visual information. However, little is known about the brain networks that process tactile information. We are using robotics and resting-state functional magnetic resonance imaging to gain a better understanding of the neural mechanisms involved in using touch for dexterous control of the hand.

Background


Recent advances in neuroscience have provided evidence that the way the central nervous system processes tactile information may be similar to the way it processes visual information. However, much less is known about the networks in the brain that process tactile information than about those that process visual information. Given the importance of the hand for activities of daily living, we are developing techniques that will lead to a better understanding of the neural mechanisms involved in using touch for dexterous control of the hand.

Objectives

This research is designed to discover how the brain processes sensory information from the hand and uses it to control muscles when the hand performs dexterous actions. In particular, we are attempting to identify the network or networks in the brain that use information about contact with an object to control the force applied to the object or the motion of a finger over its surface. Imagine grasping an object and moving it without looking at it. The fingers can only be placed at certain locations on the object or the grasp will fail. Sufficient force must be applied or the object will slip; if the object is fragile, applying too much force may crush it. To investigate which parts of the brain determine location and force, we have designed haptic tasks that focus on these aspects of dexterous actions.

Methods

By definition, haptic tasks are performed with the hand. In this investigation, the tasks must not only be performed with the hand but must also rely purely on sensory information originating in receptors in the skin and muscles of the hand. For these tasks we use small robotic systems that can move the hand and apply forces to it.

The lateral tactile display (Latero, Hayward et al.) is used to create a virtual ridge on a flat surface. The sensation of a ridge is produced by placing the fingertip on an array of tiny blades that resemble the teeth of a comb. As the blades move apart, they stretch the skin, activating sensory receptors in a way that is similar to the way they are activated when the fingertip moves over a bump. This virtual ridge is moved to random locations on the flat surface, and the haptic task is to locate it.
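The skin-stretch principle described above can be sketched in a few lines of code. This is an illustrative model, not the Latero driver: the blade count, ridge width, and gain are hypothetical, and the stretch is approximated as the slope of a Gaussian height profile so that blades on either side of the ridge move apart.

```python
import math
import random

# Hypothetical parameters (not from the actual Latero device).
N_BLADES = 8
RIDGE_WIDTH = 2.0   # in blade units
GAIN = 1.0

def blade_displacements(ridge_pos):
    """Return one lateral displacement per blade for a virtual ridge
    centred at ridge_pos (in blade units).

    Skin stretch is modelled as the slope of a Gaussian height profile,
    so blades on opposite sides of the ridge move in opposite directions,
    stretching the skin between them.
    """
    out = []
    for i in range(N_BLADES):
        x = i - ridge_pos
        slope = -x * math.exp(-x * x / (2 * RIDGE_WIDTH ** 2))
        out.append(GAIN * slope)
    return out

# One trial of the task: the ridge appears at a random location
# and the participant must find it with the fingertip.
ridge = random.uniform(0, N_BLADES - 1)
pattern = blade_displacements(ridge)
```

The sign change in the displacement pattern across the ridge centre is what makes the ridge localizable by touch.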

A small pantograph robot (Pantograph, Hayward et al.), which moves on a horizontal surface, is used to apply force to the fingertip only at specific locations. If the locations where force is applied form a continuous curve, the applied force creates the sensation of touching a contoured shape. The shape forms a hollow shell with elastic properties, like a balloon that pops if too much force is applied. The haptic task is to trace the shape with the fingertip without popping it or losing contact.
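A minimal sketch of this kind of rendering, assuming a circular shell and a simple spring law (the stiffness, radius, and popping threshold below are hypothetical values, not the actual Pantograph parameters):

```python
import math

# Hypothetical rendering parameters.
STIFFNESS = 200.0   # N/m, shell stiffness
RADIUS = 0.02       # m, shell radius
POP_FORCE = 1.5     # N, force at which the shell "pops"

def shell_force(x, y, cx=0.0, cy=0.0):
    """Return (fx, fy, popped) for a fingertip at (x, y) probing a
    circular shell centred at (cx, cy).

    Force is nonzero only where the fingertip penetrates the shell,
    so contact is felt only at specific locations, as in the task.
    """
    dx, dy = x - cx, y - cy
    dist = math.hypot(dx, dy)
    penetration = RADIUS - dist
    if penetration <= 0.0 or dist == 0.0:
        return 0.0, 0.0, False             # no contact: no force
    magnitude = STIFFNESS * penetration    # Hooke's law along the normal
    popped = magnitude > POP_FORCE         # too much force crushes the shell
    nx, ny = dx / dist, dy / dist          # outward surface normal
    return magnitude * nx, magnitude * ny, popped
```

Tracing the shape then amounts to keeping the penetration, and hence the force, between zero (losing contact) and the popping threshold.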

Two Pantographs are used to apply force to the thumb and index finger, creating a virtual object that can be grasped. The haptic task is to grasp a virtual object shaped like a wedge while applying equal force with the thumb and index finger. The applied force is represented as two orthogonal components: inequality in one component is sensed through motion of the thumb, and inequality in the other is sensed through vibration of the index finger.

A robot that can be grasped between the thumb and index finger and twisted by turning the wrist is used to create a fragile virtual object that can be rotated. If the thumb and finger are moving too quickly when they contact the object, or if they apply too large a grasping force, the object is crushed. If no grasping force is applied, the object cannot rotate. The haptic task is to rotate the object without crushing it.
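The task logic above reduces to a small state rule. The speed and force thresholds here are assumptions for illustration, not the values used in the experiment:

```python
# Hypothetical limits for the fragile virtual object.
MAX_CONTACT_SPEED = 0.05   # m/s: faster contact crushes the object
MAX_GRIP_FORCE = 2.0       # N: stronger grip crushes the object

def object_state(contact_speed, grip_force):
    """Return the state of the fragile virtual object.

    'crushed'  - contact was too fast or the grip too strong
    'free'     - no grip force, so the object cannot be rotated
    'rotating' - grip is within limits and the object can turn
    """
    if contact_speed > MAX_CONTACT_SPEED or grip_force > MAX_GRIP_FORCE:
        return "crushed"
    if grip_force <= 0.0:
        return "free"
    return "rotating"
```

Success therefore requires regulating both the approach speed and the grasp force inside a narrow window, which is what makes the task demanding.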

In task 1, texture is sensed to control the position of the hand. In task 2, force direction is sensed to control the direction of hand motion, and stiffness is sensed to control the magnitude of the applied force. In task 3, motion and vibration are sensed to control the thumb and finger forces independently. In task 4, motion and force are sensed to control the speed of the grasp and the magnitude of the grasp force. The four tasks have been designed to create combinations of sensory inputs and motor outputs that differ enough to determine whether there is more than one network for transforming tactile information into motor commands to the hand.

Resting-State Networks

Networks are identified using a technique called resting-state network analysis. Our underlying assumption, based on the results of our previous research, is that repetition of a haptic task leaves an activation trace in the brain areas that were involved in transforming sensory inputs into motor outputs. The hypothesis is that the strength of the synaptic connections between these areas changes during repeated performance of the task and that these changes are retained for some time after training ends. As a result, resting-state activity in these functionally connected areas will be more highly correlated after training than before it, due to the increased strength of the synaptic connections.

Functional Connectivity

Our measure of brain activity is the so-called BOLD (blood oxygenation level dependent) signal. The BOLD signal is measured by conducting an MRI scan of the brain with the scan parameters tuned to detect the amount of deoxyhemoglobin throughout the brain. The BOLD signal is then analyzed to find compartments in different regions where it fluctuates in a similar (correlated) manner. Compartments where the BOLD signal is strongly correlated are presumed to be functionally connected, i.e. active (functional) synaptic connections between the compartments are transmitting or processing information. By comparing where the strength of functional connectivity changes after performing each task, it will be possible to determine whether there is a single network or multiple networks, and to describe the network(s). If more than one network is identified, it may also be possible to assign specific computations to specific connections.
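The correlation step can be sketched as follows, assuming the preprocessed BOLD time series have already been averaged within regions of interest (ROIs). The synthetic data, ROI count, and timepoint count below are illustrative, not from the actual study.

```python
import numpy as np

def connectivity_matrix(ts):
    """ts: array of shape (n_timepoints, n_rois) of BOLD signals.
    Returns the n_rois x n_rois Pearson correlation matrix."""
    return np.corrcoef(ts, rowvar=False)

rng = np.random.default_rng(0)
n_t, n_rois = 240, 4                       # e.g. 240 volumes, 4 ROIs

# "Pre-training" rest: uncorrelated ROI fluctuations.
pre = rng.standard_normal((n_t, n_rois))

# "Post-training" rest: a shared fluctuation added to ROIs 0 and 1
# mimics strengthened functional connectivity after task repetition.
post = rng.standard_normal((n_t, n_rois))
shared = rng.standard_normal(n_t)
post[:, 0] += shared
post[:, 1] += shared

# Change in functional connectivity, post minus pre: only the
# ROI0-ROI1 entry should show a clear increase.
delta = connectivity_matrix(post) - connectivity_matrix(pre)
print(f"change in ROI0-ROI1 coupling: {delta[0, 1]:.2f}")
```

In the actual analysis, the entries of such a difference matrix would be tested statistically across participants to decide which connections strengthened after each of the four tasks.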

Preliminary Results

Task 1

Exploratory movements in tactile search can be quickly arrested, over a large range of velocities, after encountering a contrast in surface texture. Integration of the somatosensory information needed to initiate a change in motion occurs within 50 ms of encountering a contrast in texture. Our results suggest that a change in muscle activation can occur in less than 40 ms from the time a change in texture is encountered. This would indicate that the fastest responses that use tactile information to control muscles of the hand and arm involve neural pathways contained entirely within the spinal cord.

Funding

  • 7th Framework Program of the European Commission through an International Fellowship of the Marie-Curie Actions: Project 624431 Functional Imaging and Robotics for Sensorimotor Transformation (FIRST)
  • ETH Zurich
  • University Hospital Zurich
