Monday, September 5, 2011

Paper Reading #3: Pen + Touch = New Tools

Reference Information
Pen + Touch = New Tools
Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, Bill Buxton
Presented at UIST '10, October 3-6, 2010, New York, New York, USA

Author Bios

  • Ken Hinckley is a principal researcher at Microsoft Research. He holds a PhD in Computer Science from the University of Virginia.
  • Koji Yatani is a PhD candidate at the University of Toronto and previously worked for Microsoft Research in Redmond.
  • Michel Pahud works for Microsoft Research and holds a PhD in parallel computing from the Swiss Federal Institute of Technology.
  • Nicole Coddington worked for Microsoft Research and is now a senior interface designer at HTC.
  • Jenny Rodenhouse works for Microsoft Research. She is currently in their Xbox division.
  • Andy Wilson is a senior researcher at Microsoft Research with a PhD from MIT's Media Lab. He helped found the Surface Computing group.
  • Hrvoje Benko is a researcher in adaptive systems and interaction at Microsoft Research. He received his PhD from Columbia University.
  • Bill Buxton is a principal researcher for Microsoft Research with three honorary doctorates.
Summary
Hypothesis

Can pen, touch, and combined pen + touch tasks be divided intuitively for UI design?

Methods
A design study asked participants to illustrate a short film storyboard by pasting clippings into a paper notebook. Observed behavior was categorized into nine types. The implemented system was then evaluated with users in a similar fashion.

Results
The notebook study revealed several recurring behaviors: fingers and the pen took on specific roles; the pen was tucked between the fingers when not in use; clippings were held and "framed" by the fingers; sheets were held in the non-dominant hand; the workspace extended beyond the notebook; users created piles of materials; some users drew along the edges of clippings; and users held their place in the notebook with their fingers. Once instructed, users found the principle of the pen writing, touch manipulating, and pen + touch producing new tools fairly natural, though the gestures are not self-revealing. Users enjoyed the stapling and copying features and the ability to hold their place, though they occasionally lost it. The cutting feature, as opposed to tearing, emphasized that users perceive touch differently from pen + touch.

Contents
The authors combined pen and touch gestures into an interaction style they call pen + touch. Most previous work in this area resulted in unintuitive gestures or relied on buttons. The authors suggested that pen and touch naturally play different roles, and that gestures should be based on how users interact with physical materials. Their main design considerations were: which input types to support; whether the interface should change based on the task; how to assign devices to hands; how best to map inputs; how many hands to use; whether simultaneous or sequential actions are needed; when to use ink mode versus command mode; and whether inputs would be simple or phrased together. The authors decided that, in general, the pen should ink and touch should manipulate objects. Because users naturally interleave pen and touch, this division allows for more actions.
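To make that division of labor concrete, here is a minimal sketch of the routing rule in TypeScript. The event model and names (PenInput, TouchInput, Canvas) are my own simplification, not the paper's implementation:

    // A minimal sketch, assuming a simplified event model of my own: pen
    // alone inks, touch alone manipulates, and a pen stroke made while a
    // finger holds an object becomes a contextual tool.
    type PenInput = { x: number; y: number };
    type TouchInput = { x: number; y: number; heldObject?: string };

    interface Canvas {
      ink(x: number, y: number): void;                 // pen writes
      manipulate(x: number, y: number): void;          // touch drags, zooms, selects
      toolAction(target: string, x: number, y: number): void; // pen + touch tools
    }

    function dispatch(canvas: Canvas, pen: PenInput | null, touch: TouchInput | null): void {
      if (pen && touch && touch.heldObject) {
        // Pen + touch: the held object gives the pen stroke its meaning
        // (staple, cut, copy, straightedge, and so on).
        canvas.toolAction(touch.heldObject, pen.x, pen.y);
      } else if (pen) {
        canvas.ink(pen.x, pen.y);            // pen alone: write
      } else if (touch) {
        canvas.manipulate(touch.x, touch.y); // touch alone: manipulate
      }
    }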

The system reflected natural interaction with a notebook, providing whole-screen and dual-screen views. Simultaneous pen and touch use was supported, but palm inputs were ignored. Manipulating, zooming, and selecting objects was done through touch gestures, and common controls accepted both pen and touch to fit user tendencies. The pen + touch techniques used the non-preferred hand to hold an item while the pen acted on it. A stapling feature grouped items, with the pen serving as the stapler to prevent errors. Users could cut or copy images by moving the pen over a held item. A held object could serve as a straightedge, removing the need for a dedicated ruler tool, and the various tasks could be combined. Photos could be used as brushes, and a pen stroke could act like drafting tape for drawing curves. Finger painting broke the "touch manipulates" rule but fit user expectations. A finger-operated bezel menu allowed objects to be stored off-screen, pages to be flipped, and places to be held.
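As a rough illustration of the hold-an-item, act-with-the-pen pattern described above, consider this hypothetical handler; the names, the event model, and the short-drag-means-copy heuristic are all invented for illustration:

    // Hypothetical sketch of the hold-plus-pen pattern: a finger holds a
    // photo while the pen acts on it. Names and thresholds are invented.
    type Point = { x: number; y: number };

    interface Photo {
      cutAlong(path: Point[]): void;      // split the photo along the stroke
      copyTo(x: number, y: number): void; // drop a duplicate at a location
    }

    const COPY_DRAG_MAX = 30; // px: a short pen drag reads as "peel off a copy"

    function onPenStrokeOverHeldPhoto(photo: Photo, stroke: Point[]): void {
      const start = stroke[0];
      const end = stroke[stroke.length - 1];
      const dragDistance = Math.hypot(end.x - start.x, end.y - start.y);
      if (dragDistance <= COPY_DRAG_MAX) {
        photo.copyTo(end.x, end.y); // short drag: copy the held photo
      } else {
        photo.cutAlong(stroke);     // longer stroke: cut along the pen's path
      }
    }

The point is only that the finger's hold supplies the context, while the pen's motion selects the tool.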

Discussion
The authors sought a more intuitive way to combine touch and pen controls, and they found one. I am convinced that their system is effective, though I must agree that its gestures are not self-revealing.

This work is particularly interesting to me because I find consumer touch-based software at best clunky and at worst unusable. This system could set in motion a push toward natural interfaces that do not claim to emulate paper yet are just as intuitive as paper.

The authors specifically noted that palm touches are ignored, which drops an entire category of interactions. For instance, the dominant hand resting on the screen while writing could trigger handwriting recognition, whereas writing without the palm resting would signal that the user intends to leave the ink as is.
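A toy sketch of that idea, with entirely invented names: the palm's state, rather than being discarded, gates whether ink is recognized as text.

    // Entirely hypothetical sketch of the idea above: a resting palm,
    // instead of being ignored, decides whether ink becomes text.
    type InkPoint = { x: number; y: number };

    interface Recognizer {
      recognize(stroke: InkPoint[]): string; // convert ink to text
    }

    function onPenStroke(
      recognizer: Recognizer,
      stroke: InkPoint[],
      palmResting: boolean
    ): string | null {
      // Palm resting while writing: the user wants recognized text.
      // Palm lifted: leave the ink as drawn.
      return palmResting ? recognizer.recognize(stroke) : null;
    }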
