On-Skin Interaction Using Body Landmarks

Jürgen Steimle
Joanna Bergstrom-Lehtovirta
Martin Weigel
Aditya Shekhar Nittala
Sebastian Boring
Alex Olwal
Kasper Hornbæk
IEEE Computer, 50 (2017), pp. 19-27

Abstract

Recent research in human–computer interaction (HCI) has recognized the human skin as a promising surface for interacting with computing devices. The human skin is large, always available, and sensitive to touch. Leveraging it as an interface helps overcome the limited surface real estate of today’s wearable devices and allows for input to smart watches, smart glasses, mobile phones, and remote displays.

Various technologies have been presented that transform the human skin into an interactive surface. For instance, touch input has been captured using cameras, body-worn sensors, and slim skin-worn electronics. Output has been provided using projectors, thin displays, and computer-induced muscle movement. Researchers have also developed experimental interaction techniques for the human skin; for instance, allowing a user to activate an interface element by tapping on a specific finger location or by grabbing or squeezing the skin.

To keep the design and engineering tractable, most existing work has approached the skin as a more or less planar surface. In that way, principles and models for designing interaction could be transferred from existing touch-based devices to the skin. However, this assumes that the resolution of sensing or visual output on the skin is as uniform and dense as on current touch devices. It is not; current on-skin interaction typically allows only touch gestures or tapping on a few distinct locations with varying performance and, therefore, greatly limits possible interaction styles. It might be acceptable for answering or rejecting a phone call, but it is not powerful enough to allow expressive interaction with a wide range of user interfaces and applications.

More importantly, this line of thinking does not consider the fact that the human skin has unique properties that vary across body locations, making it fundamentally different from planar touch surfaces. For instance, the skin contains many distinct geometries that users can feel and see during interactions, such as the curvature of a finger or a protruding knuckle. Skin is also stretchable, which allows novel interactions based on stretching and deforming. Additionally, skin provides a multitude of sensory cells for direct tactile feedback, and proprioception guides the user during interaction on the body.