Thursday, April 16, 2020

Your skin computes info from touch

[Image: a hand on a guitar fretboard against a blue background]

Your skin itself helps process the information gathered through touch, researchers report.

As our body’s largest and most prominent organ, the skin also provides one of our most fundamental connections to the world around us. From the moment we’re born, it is intimately involved in every physical interaction we have.

Though scientists have studied the sense of touch, or haptics, for more than a century, many aspects of how it works remain a mystery.

“The sense of touch is not fully understood, even though it is at the heart of our ability to interact with the world,” says haptics researcher Yon Visell of the University of California, Santa Barbara.

“Anything we do with our hands—picking up a glass, signing our name, or finding keys in our bag—none of that is possible without the sense of touch. Yet we don’t fully understand the nature of the sensations captured by the skin or how they are processed in order to enable perception and action.”

We have better models for how our other senses, such as vision and hearing, work, but our understanding of how the sense of touch works is much less complete, he adds.

To help fill that gap, Visell and his research team have been studying the physics of touch sensation—how touching an object gives rise to signals in the skin that shape what we feel.

In their new study in Science Advances, the group reveals how the intrinsic elasticity of the skin aids tactile sensing.

To understand this significant but little-known aspect of touch, Visell finds it helpful to consider how the eye, our visual organ, processes optical information.

“Human vision relies on the optics of the eye to focus light into an image on the retina,” he says. “The retina contains light-sensitive receptors that translate this image into information that our brain uses to decompose and interpret what we’re looking at.”

An analogous process unfolds when we touch a surface with our skin, Visell continues. Just as structures such as the cornea and iris capture and focus light onto the retina, the skin’s elasticity distributes tactile signals to sensory receptors throughout the skin.

Building on previous work, which used an array of tiny accelerometers worn on the hand to sense and catalog the spatial patterns of vibrations generated by actions such as tapping, sliding, or grasping, the researchers employed a similar approach in the new study to capture the spatial patterns of vibration generated as the hand feels the environment.

“We used a custom device consisting of 30 three-axis sensors gently bonded to the skin,” explains lead author Yitian Shao, a PhD candidate in electrical and computer engineering. “And then we asked each participant in our experiments to perform many different touch interactions with their hands.”
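
As a rough illustration of what one such recording could look like in code, the sketch below lays out a single interaction as a time-by-sensor-by-axis array. The sensor count comes from the article; the sampling rate, duration, and exact data layout are assumptions made for the example.

```python
import numpy as np

# Hypothetical layout for one recorded touch interaction. The 30 three-axis
# sensor count matches the article; the sampling rate and duration are
# placeholder values, not details from the study.
NUM_SENSORS = 30       # accelerometers bonded to the skin
NUM_AXES = 3           # x, y, z acceleration per sensor
SAMPLE_RATE_HZ = 1300  # assumed
DURATION_S = 2.0       # assumed length of one interaction

num_samples = int(SAMPLE_RATE_HZ * DURATION_S)

# Stand-in data: real recordings would come from the sensor array.
rng = np.random.default_rng(0)
recording = rng.standard_normal((num_samples, NUM_SENSORS, NUM_AXES))

# Flattening sensors and axes gives the high-dimensional signal per time step
# that later analysis decomposes into a small set of primitive patterns.
snapshots = recording.reshape(num_samples, NUM_SENSORS * NUM_AXES)
print(snapshots.shape)  # (2600, 90)
```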

The research team collected a dataset of nearly 5,000 such interactions. The vibration patterns in each one arose from elastic coupling within the skin itself, and the team analyzed the data to clarify how the transmission of these vibrations throughout the hand shaped the information content of the tactile signals.

“We used a mathematical model in which high-dimensional signals felt throughout the hand were represented as combinations of a small number of primitive patterns,” Shao explains. The primitive patterns provided a compact lexicon, or dictionary, that compressed the information in the signals, enabling them to be encoded more efficiently.

This analysis generated a dozen or fewer primitive wave patterns—vibrations of the skin throughout the hand that could be used to capture information in the tactile signals felt by the hand. The striking feature of these primitive vibration patterns, Visell says, is that they automatically reflected the structure of the hand and the physics of wave transmission in the skin.
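
One way to picture this kind of decomposition is sparse dictionary learning, sketched below with scikit-learn and random stand-in data. The article does not say which algorithm or parameters the team actually used, so everything here beyond the rough dimensions (30 sensors times 3 axes, about a dozen primitives) is an assumption.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Random stand-in data: many high-dimensional vibration snapshots,
# e.g. 500 snapshots x (30 sensors * 3 axes) = 90 features each.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((500, 90))

# Learn a small dictionary of primitive patterns and sparse codes.
learner = DictionaryLearning(
    n_components=12,                   # "a dozen or fewer" primitives (article)
    transform_algorithm="lasso_lars",  # sparse coding step (assumed choice)
    transform_alpha=0.1,
    random_state=0,
)
codes = learner.fit_transform(snapshots)  # sparse coefficients, shape (500, 12)
primitives = learner.components_          # primitive patterns, shape (12, 90)

# Each 90-dimensional snapshot is approximated by a weighted sum of a few
# primitives, so the signal is described by a handful of coefficients.
reconstruction = codes @ primitives
error = np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots)
print(primitives.shape, codes.shape, round(error, 3))
```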

“Elasticity plays this very basic function in the skin of engaging thousands of sensory receptors for touch in the skin, even when contact occurs at a small skin area,” he explains.

“This allows us to use far more sensory resources than would otherwise be available to interpret what it is that we’re touching.”

The remarkable finding of their research is that this process also makes it possible to more efficiently capture information in the tactile signals, Visell says. Information processing of this kind is normally considered to be performed by the brain, rather than the skin.

The role played by mechanical transmission in the skin is in some respects similar to the role of the mechanics of the inner ear in hearing, Visell says. In 1961, von Békésy received the Nobel Prize for his work showing how the mechanics of the inner ear facilitate auditory processing: by spreading sounds with different frequency content to different sensory receptors in the ear, they aid the encoding of sounds by the auditory system. The team’s work suggests that similar processes may underlie the sense of touch.
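
To make the analogy concrete, the toy sketch below spreads a signal’s frequency content across a small bank of band-pass filters, loosely mimicking how cochlear mechanics route different frequencies to different receptors. The filter bands, sample rate, and test signal are arbitrary choices for illustration, not anything taken from the study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 8000                                 # sample rate in Hz (arbitrary)
t = np.arange(0, 1.0, 1 / fs)
# Test signal with two tones at 200 Hz and 1200 Hz.
signal = np.sin(2 * np.pi * 200 * t) + np.sin(2 * np.pi * 1200 * t)

# A small "filterbank": each band acts like a separate sensory channel.
bands = [(100, 400), (400, 900), (900, 2000)]  # Hz, arbitrary
for low, high in bands:
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    channel = sosfiltfilt(sos, signal)
    rms = np.sqrt(np.mean(channel ** 2))
    # The 200 Hz and 1200 Hz tones end up in different channels.
    print(f"{low}-{high} Hz channel RMS: {rms:.3f}")
```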

These findings, according to the researchers, not only contribute to our understanding of the brain, but may also suggest new approaches for the engineering of future prosthetic limbs for amputees that might be endowed with skin-like elastic materials. Similar methods also could one day be used to improve tactile sensing by next-generation robots.

Vincent Hayward at the Sorbonne also contributed to the work.

Source: UC Santa Barbara

